Navigation using Augmented Reality for Users with Low Vision
Eliminate Boundaries
This research was presented at the IEEE Visual Languages and Human-Centric Computing (VL/HCC) conference in Memphis, USA.
The challenge
Creating Usable Access
Independence isn't something you should have to fight for; it should be a basic right. Unfortunately, the society we live in today does not prioritise this.
Having lived with a disabled person all my life, I have observed how simple day-to-day tasks can become unnecessarily difficult. These challenges are largely driven by a lack of foresight and empathy when creating products.
This led me to research how technology can enable disabled people to lead a much more active and independent lifestyle.
My overarching goals for this project are to:
1. Understand the challenges users with low vision face while navigating
2. Create solutions using technology as an enabler
3. Raise awareness about the importance of a barrier-free environment
Timeline
Accelerated Innovation

What is low vision?
Getting a Grip


Types of low vision

Existing solutions
1. Drishti (Indoor Navigation)
2. Aira
3. Ariadne GPS
4. BuzzClip
5. Nearby Explorer
Navigational apps for people with low vision are rare on the app store. Those that do exist often rely on external physical sensors that must be bought before the app can be used. Solutions that share the camera feed with a centralised 'assistant' raise questions of privacy and trust, while others use a steeply priced subscription model that is not accessible to everyone.
The approach
Embrace Diversity
Having never undertaken a project focused solely on accessibility before, I faced a very steep learning curve. It has been a very interesting curve to climb, though.
Something I picked up very early on is that people are remarkably good at adapting to different situations: when something didn't work the way they wanted, they adapted and made it work for them.
Inclusive design is something I am very passionate about, and this project allowed me to explore it in greater detail. Understanding how to put people at the center from the very start of the process, and continuing to gather their diverse perspectives throughout, was a very interesting challenge.
Solve for one, extend to many
By incorporating inclusive design right from the start, the product can serve as a stepping stone to broader solutions. For example, if an application is designed for a user with low vision, its design language can be used to improve at-a-glance information in crowded spaces such as airport signage.
The workflow
Plan, learn, adapt, iterate



This being my first solo UX project, I found it imperative to sit down and formulate a concise plan that I could follow. It was very helpful because it essentially gave me a list of everything I needed to do to carry out the research project successfully.
From my time at Boeing, I learnt the management principle of Lean UX planning and how it helps people stick to deadlines. Even though Lean UX is targeted mostly at large teams, I found that it works incredibly well for solo projects too. It served as a constant reminder of my goal, and every time I felt I was straying from it, I came back to this plan.
I also designed an initial user survey questionnaire to assess the current state of navigation tools and build a basic understanding of the user base. It was circulated among local communities and on internet platforms such as Reddit and Discord.
It was very interesting to see responses from people with low vision who were already using technology. Constant interaction with them about design and how it affects their usage really helped shape the application visually.
Data from the survey, combined with my interactions with the low vision community, helped me set up red routes to identify key workflows. This served as a good halfway point between my UX research and UI design.

The Technology
Checkpoint navigation

Checkpoint-based navigation is a navigational interface in which the software generates a list of checkpoints along a user's route. The route planning algorithm takes static and dynamic models, plans an optimised route, and sends it to the waypoint generator. Once the user reaches a predetermined checkpoint, the application prompts them to pull out their phone and scan their surroundings. The object recognition framework, combined with tracking data, video input, and orientation data, verifies that the user is indeed in the right spot. This acts as a check mechanism, confirming that the user is on the right path and actually standing in front of a checkpoint. Feedback from the application is delivered through haptics as well as voice prompts. At any time, the user may switch to the augmented reality (AR) navigation interface, which draws a path in AR that they can follow to the next checkpoint or destination.
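To make the check mechanism concrete, here is a minimal sketch in Swift of how arrival at a checkpoint could be confirmed by combining coarse positioning data with the camera-based recogniser's output. The names (`Checkpoint`, `verify`, `approximateDistanceMetres`) and the thresholds are illustrative assumptions, not the app's actual implementation.

```swift
import Foundation

// Hypothetical sketch of the checkpoint check mechanism described above.
struct Checkpoint {
    let name: String
    let latitude: Double
    let longitude: Double
    let landmarkLabel: String   // what the object recogniser should see here
}

enum CheckpointStatus {
    case confirmed              // user is standing at the checkpoint
    case nearbyButUnverified    // position matches, landmark not yet recognised
    case offRoute
}

func verify(checkpoint: Checkpoint,
            userLatitude: Double,
            userLongitude: Double,
            recognisedLabels: [String: Double]) -> CheckpointStatus {
    // 1. Coarse check: is the user close enough according to positioning data?
    let distance = approximateDistanceMetres(latA: userLatitude, lonA: userLongitude,
                                             latB: checkpoint.latitude, lonB: checkpoint.longitude)
    guard distance < 25 else { return .offRoute }

    // 2. Fine check: did the recogniser see the expected landmark with
    //    reasonable confidence after the user scanned their surroundings?
    if let confidence = recognisedLabels[checkpoint.landmarkLabel], confidence > 0.7 {
        return .confirmed       // trigger haptic pulse + voice prompt, offer AR path
    }
    return .nearbyButUnverified
}

// Equirectangular approximation; accurate enough at checkpoint scale.
func approximateDistanceMetres(latA: Double, lonA: Double,
                               latB: Double, lonB: Double) -> Double {
    let metresPerDegree = 111_320.0
    let dLat = (latB - latA) * metresPerDegree
    let dLon = (lonB - lonA) * metresPerDegree * cos(latA * .pi / 180)
    return (dLat * dLat + dLon * dLon).squareRoot()
}
```

In the app itself, the `recognisedLabels` dictionary would be produced by the object recognition framework running over the live camera feed while the user scans their surroundings.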
The number of checkpoints along a route is determined by the following factors (a sketch of one possible heuristic follows the list):
1. Length of the route
2. Complexity of the route
3. Time of day
4. Available data along route
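As a rough illustration of how these four factors could combine, the Swift sketch below derives a checkpoint count from route length, complexity, time of day, and the data available along the route. The `RouteInfo` fields and all of the weights are assumptions made for the example, not the app's actual algorithm.

```swift
// Illustrative heuristic for choosing how many checkpoints to generate.
struct RouteInfo {
    let lengthMetres: Double
    let turnCount: Int            // proxy for route complexity
    let isNightTime: Bool
    let mappedLandmarkCount: Int  // available data along the route
}

func checkpointCount(for route: RouteInfo) -> Int {
    // Baseline: roughly one checkpoint every 150 m.
    var count = Int((route.lengthMetres / 150).rounded(.up))

    // More complex routes get extra confirmation points at turns.
    count += route.turnCount

    // At night, add a small safety margin of extra checkpoints.
    if route.isNightTime { count += 2 }

    // Never ask for more checkpoints than there are recognisable landmarks.
    return max(1, min(count, route.mappedLandmarkCount))
}
```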
The user can also choose to add favorite routes to make navigating between known spots easier. An SOS button that lets others know the user is lost is another key user interface element: people nearby can help when the application is unable to locate itself successfully.
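As a hypothetical example of what the SOS feature might share, the payload below bundles the last known position and the next expected checkpoint so that nearby helpers or a trusted contact know where to look. The `SOSMessage` type and its fields are assumptions for illustration only.

```swift
import Foundation

// Hypothetical SOS payload sent when the user reports being lost.
struct SOSMessage: Codable {
    let userName: String
    let lastKnownLatitude: Double
    let lastKnownLongitude: Double
    let nextCheckpointName: String
    let timestamp: Date
}

func makeSOSMessage(userName: String,
                    latitude: Double,
                    longitude: Double,
                    nextCheckpoint: String) -> SOSMessage {
    SOSMessage(userName: userName,
               lastKnownLatitude: latitude,
               lastKnownLongitude: longitude,
               nextCheckpointName: nextCheckpoint,
               timestamp: Date())
}
```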
Demo video
Early prototype
What's new?
1. Increased safety
2. Bespoke design
3. Higher accuracy
4. Increased independence
What needs to improve
1. Voice-based navigation interface for the app
2. Automatically set contrast and font based on the type of visual impairment