Branding
Since the application is affiliated with the University of Michigan, I had to abide by the university's branding guidelines when designing, especially in relation to the logo.
Sketches


Low-Fi Prototype

Design Iterations
After creating a basic layout and flow for the application, we went back and created various iterations of the home screen that we felt would best lay out the three primary functions of the app. Below are our three basic layouts.
Home Page:
Final Layout:
Logo Iterations

Final Logo Design


Style Guide
We aimed to select a style that would reflect the outdoors and give the app a natural look. Below are the color palette and typography. We chose a blue-green color palette rather than a strictly green one to give it some neutrality. The darker blue (#042940) was our primary color, while the middle blue (#588C7E) became our accent color. We felt these matched the natural colors we were aiming for, giving the app an outdoor feel.


Mobile Application
We were asked to present our final product at the end-of-year showcase in front of ITS and University of Michigan staff. The app is a proof of concept and can run on local devices when given access through the Expo app. We have detailed instructions both for running the app locally and for how others who wish to build on this application can obtain and edit the existing code.
Welcome to the Nichols Arboretum! It’s a beautiful day, and we’re glad you could join us. Enjoy the scenery and Michigan plant life as you walk down one of the numerous trails. I see you stumbled upon an interesting plant! Are you wondering what it is? We have an app that can help you with that!
UM Plantify is a simple, easy-to-use mobile app for plant identification. Its three main functions are clearly displayed on the home page. The help button in the upper-right corner takes you to the Help page, which has instructions for taking a good photo as well as descriptions of the app's features.
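As a rough illustration of how the home page could route to those three functions, here is a minimal React Navigation sketch in TypeScript; the screen names, component name, and styling are assumptions for illustration, not the app's actual code.

```tsx
// Minimal home-screen sketch (assumed screen names; not the app's actual code).
import React from 'react';
import { View, Button } from 'react-native';
import { useNavigation } from '@react-navigation/native';

export function HomeScreen() {
  const navigation = useNavigation<any>();

  return (
    <View style={{ flex: 1, justifyContent: 'center', padding: 24 }}>
      {/* The three primary functions, surfaced directly on the home page. */}
      <Button title="Take Photo" onPress={() => navigation.navigate('TakePhoto')} />
      <Button title="Database" onPress={() => navigation.navigate('Database')} />
      <Button title="Map" onPress={() => navigation.navigate('Map')} />
      {/* The Help page would be reached from a button in the header's
          upper-right corner, e.g. via the navigator's headerRight option. */}
    </View>
  );
}
```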
Choosing “Take Photo” brings up your camera and allows you to take a photo of the plant you’d like to identify. After taking a photo, you’re prompted to confirm it in case you’d like to retake it. Once you choose “Yes,” the image is processed by the machine learning model, and its prediction is displayed along with information about the species. The displayed information includes a photo of the plant along with the same helpful details you can find in the Database.
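To make the prediction step concrete, here is a rough sketch of how a captured photo could be fed to an on-device TensorFlow.js model; the function name, the 224×224 input size, and the SPECIES_LABELS list are assumptions for illustration, and the app's real inference code may differ.

```typescript
// Sketch of the photo-to-prediction step (assumed on-device TensorFlow.js model).
import * as tf from '@tensorflow/tfjs';
import { decodeJpeg } from '@tensorflow/tfjs-react-native';
import * as FileSystem from 'expo-file-system';

// Hypothetical label list; one entry per species the model was trained on.
const SPECIES_LABELS = ['Scarlet Beebalm', 'Spider Flower' /* ...18 more */];

export async function classifyPhoto(model: tf.LayersModel, photoUri: string): Promise<string> {
  // Read the captured photo from disk and decode it into an image tensor.
  const base64 = await FileSystem.readAsStringAsync(photoUri, {
    encoding: FileSystem.EncodingType.Base64,
  });
  const bytes = new Uint8Array(tf.util.encodeString(base64, 'base64').buffer);
  const imageTensor = decodeJpeg(bytes);

  // Resize to the model's expected input size and scale pixels to [0, 1].
  const input = tf.image.resizeBilinear(imageTensor, [224, 224]).div(255).expandDims(0);

  // Run inference and return the most likely species label.
  const scores = Array.from((model.predict(input) as tf.Tensor).dataSync());
  const best = scores.indexOf(Math.max(...scores));
  return SPECIES_LABELS[best];
}
```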
The Database feature allows you to scroll through the species that the app supports. Choosing a plant in the database, such as Scarlet Beebalm, brings you to the information page for that plant. The displayed information includes a description of its appearance, information about the plant's origins and locations, and a fun fact. Use this database to learn about any of the supported plants without having to take a photo.
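For illustration, a single database entry might be modeled something like the sketch below; the interface and field names are assumptions, and the sample text is placeholder copy rather than the app's actual database content.

```typescript
// Illustrative shape for one Database entry (field names are assumptions).
interface PlantEntry {
  commonName: string;          // e.g. "Scarlet Beebalm"
  imageUri: string;            // photo shown on the information page
  appearance: string;          // description of the plant's appearance
  originsAndLocations: string; // where the plant comes from and where it grows
  funFact: string;
}

// Hypothetical sample entry; the real database text may read differently.
const scarletBeebalm: PlantEntry = {
  commonName: 'Scarlet Beebalm',
  imageUri: 'assets/plants/scarlet-beebalm.jpg',
  appearance: 'Clusters of bright red, tubular flowers on tall stems.',
  originsAndLocations: 'Native to eastern North America.',
  funFact: 'Its leaves have long been used to brew tea.',
};
```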
If you’re using this app while you’re in the Arb, the Map feature shows your live location within the Arboretum. Looking at the screen, you can see the location services at work, showing you where you are within the Arb. You also have the option to view the labeled Arboretum map, a static map you can compare with your live location to see which landmarks and trails are in your surrounding area.
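A minimal sketch of that live-location view, assuming expo-location and react-native-maps, is shown below; the component name and the Arboretum-centered coordinates are approximate, illustrative values.

```tsx
// Sketch of the live-location map (assumed libraries: expo-location, react-native-maps).
import React, { useEffect, useState } from 'react';
import MapView, { Marker } from 'react-native-maps';
import * as Location from 'expo-location';

export function ArbMapScreen() {
  const [coords, setCoords] = useState<{ latitude: number; longitude: number } | null>(null);

  useEffect(() => {
    let subscription: { remove: () => void } | undefined;
    (async () => {
      // Ask for permission, then stream position updates while the screen is open.
      const { status } = await Location.requestForegroundPermissionsAsync();
      if (status !== 'granted') return;
      subscription = await Location.watchPositionAsync(
        { accuracy: Location.Accuracy.Balanced, distanceInterval: 5 },
        (update) => setCoords({
          latitude: update.coords.latitude,
          longitude: update.coords.longitude,
        })
      );
    })();
    return () => subscription?.remove();
  }, []);

  return (
    <MapView
      style={{ flex: 1 }}
      initialRegion={{
        latitude: 42.2995,    // approximate center of Nichols Arboretum
        longitude: -83.7230,
        latitudeDelta: 0.01,
        longitudeDelta: 0.01,
      }}
    >
      {/* Live marker driven by the location updates above. */}
      {coords && <Marker coordinate={coords} title="You are here" />}
    </MapView>
  );
}
```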
Reflection and Next Steps:
We created a simple app that serves as a great proof of concept for plant identification. Although the app currently identifies 20 species, we have set up an easy process for adding more species to the classification model: it’s just a matter of obtaining more photos and re-running the script that trains the model. The app's accuracy is currently over 90%, but it has had trouble correctly classifying spider flowers; this can be fixed by adding more pictures of spider flowers to the training data. We hope to offer more features such as image recognition, photo tracking on a map, and user profiles. We hope this app will serve as a strong foundation for a tool that allows anyone to learn more about plant species and identification, and we hope it will be picked up and integrated further in the future.
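The retraining step described above could look roughly like the following TensorFlow.js (Node) sketch; the folder layout, image size, model architecture, and hyperparameters are all assumptions for illustration, and the team's actual training script (likely TensorFlow in Python) is not reproduced here.

```typescript
// Rough sketch of retraining the classifier after adding photos of new species.
// Assumes one subfolder of images per species under training-images/ (hypothetical layout).
import * as tf from '@tensorflow/tfjs-node';
import * as fs from 'fs';
import * as path from 'path';

const DATA_DIR = 'training-images';
const IMAGE_SIZE = 224;

function loadDataset() {
  const labels = fs.readdirSync(DATA_DIR); // species names = subfolder names
  const images: tf.Tensor3D[] = [];
  const targets: number[] = [];

  labels.forEach((label, classIndex) => {
    for (const file of fs.readdirSync(path.join(DATA_DIR, label))) {
      const bytes = fs.readFileSync(path.join(DATA_DIR, label, file));
      // Decode, resize, and normalize each photo.
      images.push(tf.tidy(() =>
        tf.image
          .resizeBilinear(tf.node.decodeImage(bytes, 3) as tf.Tensor3D, [IMAGE_SIZE, IMAGE_SIZE])
          .div(255) as tf.Tensor3D
      ));
      targets.push(classIndex);
    }
  });

  return {
    xs: tf.stack(images) as tf.Tensor4D,
    ys: tf.oneHot(tf.tensor1d(targets, 'int32'), labels.length).toFloat(),
    numClasses: labels.length,
  };
}

async function retrain() {
  const { xs, ys, numClasses } = loadDataset();

  // Small stand-in classifier; in practice a pretrained feature extractor
  // (e.g. MobileNet) would replace the flatten layer.
  const model = tf.sequential({
    layers: [
      tf.layers.flatten({ inputShape: [IMAGE_SIZE, IMAGE_SIZE, 3] }),
      tf.layers.dense({ units: 128, activation: 'relu' }),
      tf.layers.dense({ units: numClasses, activation: 'softmax' }),
    ],
  });
  model.compile({ optimizer: 'adam', loss: 'categoricalCrossentropy', metrics: ['accuracy'] });

  await model.fit(xs, ys, { epochs: 10, validationSplit: 0.2 });
  await model.save('file://./plantify-model'); // hypothetical output path
}

retrain();
```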
Ideas for improvements to the app:
- Add the ability to access the camera roll so that the app supports uploading an existing photo instead of only taking a new one.
- Implement live detection so the user can start the prediction once the app has recognized which part of the image is the intended plant.
- Integrate user accounts so that users can save their photos and the species their photos were predicted to be.
- Implement a process to verify user images and utilize them to retrain the TensorFlow model.
- Deploy the app using resources such as AWS.