AR Food Menu: A case study
Updated: Jun 4, 2019
Recently I had the pleasure of working with 302 Interactive on a case study for an unreleased application that uses augmented reality for food menus at restaurants. My role was to complete the product documentation and define the user experience. The project was code-named "Hangry."
Augmented reality has been on the rise thanks to technological advances and growing awareness of the technology. What's great about augmented reality is that it's applicable to many industries. If you're not familiar with AR, it's the overlaying of digital elements onto the real world. Below is an example of Electrifly, an augmented reality app which brings pins, murals, stickers, and more to life.
Electrifly can be found on the Google Play Store and the App Store. The app is free to download, so definitely check it out! It was developed by 302 Interactive and Electrifly Collective. You can go to the Electrifly Collective website for image targets to test the app.
Part of this case study was to figure out how we could use AR in a meaningful way in the food industry and not just as a "wow" factor. From a User Experience (UX) point of view, the goal was to provide users with information about their food items, but also to make it an immersive experience that evokes feelings of wonder.
"Hangry" was created as an augmented reality food menu application that lets users preview a restaurant's menu items from any location. Users can view restaurants near them, get directions via Google Maps, and view food menus in AR. We also wanted to provide a special mode for restaurants, so we added Kiosk Mode. Kiosk Mode lets a restaurant put its menu on its own devices so customers can view it right away, without having to navigate through the restaurants list.
I started off with documentation, defining what features the app was going to have, the user flow, the target audience, and other metrics. I also completed a competitive analysis of what was already on the market or had been done before, so we could make sure to differentiate our app from competitors.
Once I got a few passes done on the documentation and an understanding of the overall app, I started working on the user flow chart and wireframes.
You can see the finalized wireframes below. Disclaimer: the "Coming Soon" sections cover features we didn't want to fully flesh out in the demo but would like to develop in the future. We still wanted to include them as "Coming Soon" to show users that the features were in the pipeline.
Due to the nature of the app and its usage, the audience can vary widely in age and technical expertise. I kept that in mind while designing the app and its user experience. The menus use basic app conventions such as a navigation bar at the bottom of the screen and swiping to switch between pages. These patterns appear in many other apps the audience commonly uses, so they should already feel familiar. For AR mode, I referenced popular apps and their controls so that users with no AR experience could still have a good idea of how to navigate. For example, users can touch and hold a food item to view its nutritional information, or swipe across the food to view other items. A tutorial at the start of the app also lets new users familiarize themselves with it.
The idea is to keep the UI in AR mode as minimal as possible. There are two ways to navigate through food items: the user can swipe on the food to move between items, or open the drawer to reveal a carousel of items to select from.
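To make the two navigation paths concrete, here is a small illustrative sketch of that interaction model, not the app's actual code; all names are hypothetical:

```python
# Hypothetical sketch of the AR menu's two navigation paths:
# swiping between food items, and selecting from the drawer carousel.

class MenuNavigator:
    def __init__(self, items):
        self.items = items          # ordered list of food item names
        self.index = 0              # currently displayed item
        self.drawer_open = False    # carousel drawer state

    def swipe(self, direction):
        """Swipe on the food to step through items, wrapping around."""
        step = 1 if direction == "left" else -1
        self.index = (self.index + step) % len(self.items)
        return self.items[self.index]

    def toggle_drawer(self):
        """Open or close the carousel drawer."""
        self.drawer_open = not self.drawer_open
        return self.drawer_open

    def select(self, item):
        """Pick an item directly from the open carousel."""
        if self.drawer_open and item in self.items:
            self.index = self.items.index(item)
            self.drawer_open = False
        return self.items[self.index]
```

Either path ends in the same state (one item displayed, drawer closed), which keeps the two navigation modes interchangeable for the user.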
During development of the app prototype, some design issues became apparent. One was that the header seen throughout the app would disappear completely when the AR Camera view was triggered. We initially decided to hide the header so it wouldn't break the immersion of the AR experience, but this instead created a disconnect between the rest of the app and the AR Camera view.
Although this made sense in Kiosk Mode, since users wouldn't see the regular restaurant view there, it was off-putting for users to see the header disappear when they weren't in Kiosk Mode. We decided that outside of Kiosk Mode, the header would appear in the AR Camera view, but only while the user was actively interacting with the screen. This solution makes the UI/UX more cohesive with the overall user flow while still keeping the experience immersive for users.
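The visibility rule we settled on boils down to a simple decision. A minimal sketch, with function and parameter names of my own choosing rather than the project's:

```python
# Sketch of the header-visibility rule described above (names hypothetical).

def header_visible(kiosk_mode: bool, in_ar_view: bool, user_active: bool) -> bool:
    """Return True when the app header should be shown."""
    if not in_ar_view:
        return True              # regular views always show the header
    if kiosk_mode:
        return False             # Kiosk Mode keeps the AR view fully immersive
    return user_active           # otherwise, show it only while the user is active
```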
Another issue we ran into was the nutrition info pop-up not working as expected. During testing, users would press and hold to activate the pop-up, but it would not consistently appear. The food item had to be pressed and held in a specific spot for the pop-up to show up, which left users frustrated. The cause was other colliders in the scene interfering with the pop-up's collider, which detects when the user presses the food item. After some adjustments and further testing, we resolved the issue.
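The underlying idea behind that kind of fix is to make the press hit-test ignore everything except the food item's collider. This plain-Python model illustrates the concept; a real game engine would do this with a physics raycast and a layer mask, and the layer name here is an assumption of mine:

```python
# Illustrative model: when several colliders overlap, a press can hit the
# wrong one. Restricting the hit test to a dedicated "food item" layer and
# taking the nearest hit makes the pop-up trigger reliably.
# (Hypothetical names; not the project's actual code.)

FOOD_LAYER = "food_item"

def pick_food_hit(hits):
    """From raw hit-test results [(layer, distance, name), ...],
    keep only food-item colliders and return the nearest one's name."""
    food_hits = [h for h in hits if h[0] == FOOD_LAYER]
    if not food_hits:
        return None                          # press missed all food items
    return min(food_hits, key=lambda h: h[1])[2]
```

With this filter, an overlapping UI or scenery collider can no longer "steal" the press from the food item beneath it.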
We created an AR food menu application that makes previewing food options in AR genuinely useful to customers and adds to their overall restaurant experience. Using "Hangry," customers can walk into a restaurant and see realistically what their food will look like before ordering or receiving it. For AR applications, the less UI on the screen the better! Too much UI breaks the immersion that augmented reality can provide.
Hope you enjoyed this post! Feel free to reach out if you have any questions.