Aural Delicacy is an app that aims to give visually impaired diners an enjoyable dining-out experience by making restaurant menus accessible. Visually impaired diners can use the iPhone's VoiceOver feature to intuitively browse menu items from different restaurants and add any item they are interested in ordering to their selection.
Our app uses crowd-sourcing to grow the collection of accessible menus available to our visually impaired users. Volunteers can snap a photo of a restaurant's menu and submit it together with the restaurant's information, helping us add the restaurant to our app and build a more inclusive dining experience for the visually impaired.
A visually impaired person faces many challenges when dining out, and one of the biggest obstacles is reading the menu. According to a blog post written by a vision-impaired writer, visually impaired diners usually depend on the friends they are dining with, or the waiters at the restaurant, to read the menu to them. However, many visually impaired diners are afraid of inconveniencing their friends or the waiters by asking them to read out the entire menu, and compromise by ordering recommended items instead of the items they truly want. Newer technological solutions such as Optical Character Recognition (OCR) also have drawbacks: visually impaired diners cannot skip food sections they are not interested in and have to spend time listening to everything on the menu.
We believe that we can build a pleasant dining experience for our visually impaired companions, one in which they can dine independently and conveniently.
Hence, the objective of our app is to convert restaurant menus into digital menus supported by the iPhone's VoiceOver feature, allowing visually impaired users to browse a menu just as a sighted diner would. Information is categorised and organised to give visually impaired users the details they need to make their decisions swiftly and conveniently.
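In practice, organising a menu for VoiceOver means grouping each item's name, price, and description into a single announcement, so a user can swipe once per dish instead of once per label. The SwiftUI sketch below illustrates this idea; the `MenuItem` type and its fields are illustrative assumptions, not the app's actual data model.

```swift
import SwiftUI

// Illustrative model -- the field names are assumptions for this sketch.
struct MenuItem: Identifiable {
    let id = UUID()
    let name: String
    let price: Double
    let details: String
}

struct MenuItemRow: View {
    let item: MenuItem

    var body: some View {
        VStack(alignment: .leading) {
            Text(item.name).font(.headline)
            Text(String(format: "$%.2f", item.price))
            Text(item.details).font(.subheadline)
        }
        // Combine the child texts so VoiceOver treats the row as one
        // element instead of three separate labels.
        .accessibilityElement(children: .combine)
        .accessibilityLabel(
            "\(item.name), \(String(format: "$%.2f", item.price)). \(item.details)"
        )
    }
}
```

Grouping each row into one accessibility element is what makes VoiceOver browsing feel like skimming a printed menu rather than wading through every fragment of text on screen.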
To set up the project locally and get it running on your iPhone, follow these few simple steps. While our app can be run on the iOS simulator in Xcode, we recommend running it on your personal iPhone device so that you can access the Camera function.
- Clone the GitHub repository
git clone https://github.com/ShiHui21/ReadaMenu.git
- Open up the project in Xcode
- Compile and run the app
Video Link of Browsing Process (Voiceover-Mode): https://www.youtube.com/watch?v=27C9skqdx4k
Double-tapping when VoiceOver mode is enabled (or single-tapping when VoiceOver mode is not enabled) adds the item to the selection, allowing the diner to review what they have chosen again.
Double-tapping when VoiceOver mode is enabled (or single-tapping when VoiceOver mode is not enabled) removes the item from the selection.
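Under VoiceOver, a double tap activates whichever element currently has focus, so the same tap handler serves both modes without any extra gesture code. A hedged SwiftUI sketch of the toggle behaviour described above (the `Selection` store and its names are illustrative, not the app's actual implementation):

```swift
import SwiftUI

// Illustrative selection store; the real app's state management may differ.
final class Selection: ObservableObject {
    @Published var chosen: Set<String> = []

    func toggle(_ name: String) {
        if chosen.contains(name) {
            chosen.remove(name)   // already selected -> remove it
        } else {
            chosen.insert(name)   // not yet selected -> add it
        }
    }
}

struct SelectableMenuRow: View {
    let name: String
    @ObservedObject var selection: Selection

    var body: some View {
        // A Button's action fires on a single tap normally, and on a
        // double tap when VoiceOver is running -- no special handling
        // is needed to support both modes.
        Button(action: { selection.toggle(name) }) {
            Text(name)
        }
        // Tell VoiceOver users what activating the row will do.
        .accessibilityHint(selection.chosen.contains(name)
                           ? "Double tap to remove from selection"
                           : "Double tap to add to selection")
    }
}
```

Using a plain `Button` (rather than a custom gesture recogniser) is what gives the add/remove interaction its VoiceOver behaviour for free.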
Video Link of Menu Submission Process: https://www.youtube.com/watch?v=FF1F5tVsN2w
Nanonets Python Script (Google Colab Notebook): https://colab.research.google.com/drive/1Ar9x-mY-a1tCf3ebaBDjabLuzQcjuAf_?usp=sharing
- Ryan Pan Tang Kai (https://www.linkedin.com/in/ryan-pan-27533517a/)
- Luah Shi Hui (https://www.linkedin.com/in/shihuicsd/)
- Visshal Natarajan (https://www.linkedin.com/in/visshal-natarajan/)
- Keshav Natarajan (https://www.linkedin.com/in/keshavnatarajan/)
Project Link: https://github.com/ShiHui21/ReadaMenu
We would like to extend our deepest gratitude to Qi Yan, Nicole and Riley from SUTD Swift Coding Club's ExCo for their invaluable guidance throughout the development of our project.
We would also like to thank Kwek Bin from the Singapore Association of the Visually Handicapped and Christian from Apple, who set aside their valuable time to provide insightful feedback on our idea and prototype during our consultation sessions.