Navigation using Augmented Reality for Users with Low Vision

Eliminate Boundaries

This research was presented at the Visual Languages and Human-Centric Computing (VL/HCC) conference hosted by IEEE in Memphis, USA.

The challenge

Creating Usable Access

Independence isn't something you should have to fight for; it should be a basic right. Unfortunately, the society we live in today does not prioritise this.

 

Having lived with a disabled person all my life, I have observed how simple day-to-day tasks can become unnecessarily difficult. These challenges are driven mainly by a lack of foresight and empathy while creating products.

This led me to research how technology can enable disabled people to lead a much more active and independent lifestyle.

My overarching goals for this project are to: 

1. Understand the challenges users with low vision face while navigating

2. Create solutions using technology as an enabler

3. Raise awareness about the importance of a barrier free environment.

Timeline

Accelerated Innovation


What is low vision?

Getting a Grip


Types of low vision


Existing solutions

1. Drishti (Indoor Navigation)

2. Aira

3. Ariadne GPS

4. BuzzClip

5. Nearby Explorer

Navigational apps for people with low vision hardly exist on the app store. Those that do exist often rely on physical sensors that you need to buy before you can use the app. Solutions that share your camera feed with a centralised 'assistant' raise questions of privacy and trust. Others have steeply priced subscription models that are not accessible to everyone.

The approach

Embrace Diversity

Having never undertaken a project based solely on accessibility before, this was a very steep learning curve, though a very interesting one to climb.

 

Something I picked up very early on was that people are remarkably good at adapting to different situations. When something didn't work the way they wanted it to, they adapted and made it work.

Inclusive design is something I am very passionate about, and this project allowed me to explore it in greater detail. Understanding how to put people at the centre from the very start of the process, and continuing to gather their diverse perspectives throughout, was a very interesting challenge.

 

  

Solve for one, extend to many

By incorporating inclusive design right from the start, the product can be used as a stepping stone for greater solutions. For example, if an application is designed for a user with low vision, design languages from it can be used to improve at-a-glance information in crowded spaces, such as airport signage.

The workflow

Plan, learn, adapt, iterate


This being my first solo UX project, I found it imperative to sit down and formulate a concise plan I could follow. It was very helpful because it essentially gave me a list of all the to-dos needed to carry out my research project successfully.

From my time at Boeing, I learnt the management principle of lean UX planning and how it can help people stick to deadlines. Even though lean UX is targeted mostly at large teams, I found that it works incredibly well for solo projects too. It served as a constant reminder of my goal, and every time I thought I was straying from it, I came back to this plan.

I also designed an initial user survey questionnaire to gauge the current scenario and get a basic understanding of the user base. It was circulated among local communities and on internet platforms such as Reddit and Discord.

 

It was very interesting to see the responses from people with low vision who already use technology. Constant interactions with them about design, and how it affects their usage, really helped shape the application visually.

Survey channels: local community groups, Reddit, and Discord.

Data from the survey, aided by my interactions with the low vision community, helped me set up red routes to identify key workflows. This served as a good halfway point between my UX research and UI design.


The Technology

Checkpoint navigation


Checkpoint-based navigation is a navigational interface in which the software generates a list of checkpoints along a user's route. The route-planning algorithm takes static and dynamic models, plans an optimised route, and sends it to the waypoint generator. Once the user reaches a predetermined checkpoint, the application prompts them to pull out their phone and scan their surroundings. The object-recognition framework, combined with tracking data, video input and orientation data, verifies that the user is indeed in the right spot. This acts as a check mechanism, confirming to the user that they are on the right path and actually standing in front of a checkpoint. Feedback from the application is delivered both as haptic feedback and voice-based commands. At any time, the user may choose to use the augmented reality (AR) navigation interface, which draws out a path in AR that the user can follow to reach the next checkpoint or destination.
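The verification step described above can be sketched as a small decision function. This is a minimal, hypothetical illustration: the function name, thresholds, and return labels are my own assumptions, standing in for whatever the real app fuses from GPS, tracking and recognition data.

```python
def verify_checkpoint(gps_distance_m: float,
                      recognition_confidence: float,
                      distance_threshold_m: float = 15.0,
                      confidence_threshold: float = 0.8) -> str:
    """Fuse coarse GPS proximity with camera-based recognition.

    Returns which feedback the app should give the user:
    keep walking, checkpoint confirmed, or scan again.
    """
    if gps_distance_m > distance_threshold_m:
        return "keep_going"   # not near the checkpoint yet
    if recognition_confidence >= confidence_threshold:
        return "confirmed"    # haptic pulse + voice confirmation
    return "rescan"           # near the spot, but the scan didn't match
```

The point of the two-stage check is that GPS alone gets the user close, while the camera scan confirms they are actually standing in front of the checkpoint.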

The number of checkpoints along a route is determined by:

1. Length of the route

2. Complexity of the route

3. Time of day

4. Available data along the route
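One way these four factors could combine is shown below. This is an illustrative heuristic only; the constants (one checkpoint per ~200 m, the turn and night multipliers) are placeholder assumptions, not values from the actual system.

```python
import math

def checkpoint_count(route_length_m: float,
                     turns: int,
                     is_night: bool,
                     data_coverage: float) -> int:
    """Illustrative heuristic: more checkpoints for longer or more
    complex routes and at night, scaled by how much mapping data is
    available along the route (0.0 to 1.0)."""
    base = route_length_m / 200.0        # one checkpoint per ~200 m
    complexity = 1.0 + 0.25 * turns      # extra checks near turns
    night = 1.5 if is_night else 1.0     # denser checks in low light
    count = math.ceil(base * complexity * night * data_coverage)
    return max(2, count)                 # always at least start + end
```

For example, a 1 km daytime route with four turns and full data coverage would yield ten checkpoints under these assumptions.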

The user can also choose to save favourite routes, making it easier to navigate between known spots. An SOS button that lets others know the user is lost is also a key user interface element: people nearby can help the user if the application is not able to locate itself successfully.

Demo video

Early prototype

What's new?

1. Increased safety

2. Bespoke design 

3. Higher accuracy

4. Increased independence

What needs to improve

1. Voice-based navigation interface for the app

2. Automatically set contrast and font based on the type of visual impairment

The Takeaway

To infinity, and beyond!

1. Inclusive design is not only for disabled people

 

Inclusive design should not be an afterthought. Diversity is what makes us unique and we should learn to embrace it. Accessibility should be baked into the product right from inception. Products that have an inclusive design can reach a wider user base while championing social change.   

2. Empathy is key

 

For those of us who do not live with low vision, building a sound understanding of the navigational difficulties users face can make or break a product. Assembling a focus group, brainstorming, and coming up with solutions as a collective proved very beneficial. Refraining from questions like “How does it affect...” in favour of questions like “How can we improve...” helped build a connection with the user base.

3. Relearning colour and font theory

 

For people with low vision, aesthetics may matter less than functionality. Learning how they perceive font and colour greatly influenced the design process and the final product.