NavAssist Application

For my final year project at UTM, I researched and designed a navigation application for blind people in developing countries.

Application Homepage

The Product: 

The NavAssist application aligns with Sustainable Development Goal (SDG) 10, which is to reduce inequality within and among countries. It aims to give legally blind people equal opportunity in something as simple as navigation.

Project Duration

1 year


The Goal

The aim of this research is to study the problems legally blind people face when navigating with current tools and technology, and to discover what kind of UI/UX blind people in developing countries find more convenient and prefer while navigating.

The problem

According to research published in The Lancet Global Health, an estimated 36 million people worldwide are blind, a number expected to rise to about 115 million by 2050, with the highest impact in developing countries. According to the Daily Sun (2021), more than 750,000 people in Bangladesh are blind, including around 40,000 visually impaired women and children. As of October 2020, Malaysia had 51,540 visually impaired people.

Research Objectives

  • To study the problems faced by blind people when navigating, by collecting data and opinions from legally blind people of developing countries, such as Bangladesh.

  • To propose the two most suitable accessible prototypes with good UX but different interactions, designs, layouts, spoken content and audio descriptions, addressing the users' problems and concerns.

  • To compare and contrast how users engage with the two prototypes that have been developed, and to investigate their effectiveness from a UI/UX perspective.

Understanding the User

Empathize: Literature Review Summary

  • Blind people in developing countries have historically been dependent on others for basic needs and locomotion. As a result, many relied on the traditional white cane or a trained guide dog to navigate, despite the limitations of those aids (Sahoo et al., 2019; Baldwin, 2003).

  • A white cane is a tool that helps blind people assess their surroundings. Although it is inexpensive, it cannot detect obstacles at a distance; it only senses them on contact, giving the user very little time to react, which can be extremely dangerous.

  • Despite these disadvantages, people from low-income countries still rely on the white cane today because of the lack of assistive devices in these regions.

Empathize: Semi-structured Interview

  • After conducting a semi-structured interview with 14 blind people in Bangladesh, it was found that most of the participants use the white cane or the smart white cane.

  • They also depend on other people for their location and directions. All of the participants were dissatisfied with the current modes and methods.

  • Almost all participants preferred a voice-assisted or instruction-based software/application. The majority would prefer a combination of devices, such as an earpiece/earphone, a voice-assisted application, and the smart white cane or white cane. They do not want to let go of the cane, as it gives them a sense of awareness.

Define: Thematic Analysis

Define: User Personas

Starting The Design

Mind-map 1:

The mind-map is developed from the needs of the user personas, keeping the thematic analysis in mind. The application is a navigation app with instruction-based features. As the users cannot rely on vision, they make use of other senses: hearing, touch, and speech.

Mind-map 2:

The second mind-map displays the important UI/UX aspects of the navigation application: button layout, button response, button size, instructions per page, and button placement. These aspects are especially important because the users are blind, so the navigation application must be easy to use and have good UI/UX.

Low-fi Prototype:

This is part of the first flow of the navigation application. The user can either register or log in. All text on the screens is read out, and the buttons make distinct sounds when pressed. The user enters input using the microphone. The user's comfort with the instructions, button placement, and button sizes will be tested.

For the full low-fidelity prototype, check out the link: Low-fi Prototype

The navigation flow covers the sequence in which information is presented to the user. This matters because each page can only carry a small amount of information: the users hear the instructions rather than read them.
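As a rough illustration of this per-page constraint, the helper below (a hypothetical sketch with invented names and limits, not code from the project) splits a long spoken instruction into short per-screen chunks without breaking a sentence:

```python
# Hypothetical sketch: cap the amount of spoken content per screen
# by splitting an instruction into short sentence-based chunks.

def chunk_instructions(text, max_words_per_screen=12):
    """Split spoken text into per-screen chunks of at most
    max_words_per_screen words, never breaking a sentence."""
    sentences = [s.strip() for s in text.replace("?", ".").split(".") if s.strip()]
    screens, current, count = [], [], 0
    for sentence in sentences:
        words = len(sentence.split())
        # Start a new screen if adding this sentence would exceed the cap.
        if current and count + words > max_words_per_screen:
            screens.append(". ".join(current) + ".")
            current, count = [], 0
        current.append(sentence)
        count += words
    if current:
        screens.append(". ".join(current) + ".")
    return screens

tips = ("Press the top button to go somewhere. "
        "Press the bottom button to hear your current location. "
        "Press the top-right button to switch to Bangla.")
for screen in chunk_instructions(tips):
    print(screen)
```

Each returned chunk maps to one screen, so the text-to-speech readout per page stays short enough to follow by ear.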

Compare Designs

Prototype 1:

This is the flow the user goes through when opening the NavAssist application for the first time. As soon as the application opens, the user hears the corresponding audio descriptions and decides whether to register or log in. The user can switch the application to their native language, Bangla, at any time by pressing the button at the top-right corner.

User flow 3 comes after successfully registering or logging in for the first time. In this flow, the user can hear tips that explain the functionalities and features of the NavAssist application: knowing their current location and navigating somewhere. In prototype-1, the three buttons on each screen make three different sounds when pressed. This is in conjunction with code (11), which states that the vibration alone from the smart white cane is not enough.

The figure below shows that, to log in, the user only has to enter their password. This simple login keeps the process convenient and fast. This screen can also be translated to Bangla in the same way.

When the user opens the already downloaded application, they are presented with its two main functions: going somewhere and finding out their current location. This caters to code (7), "people may misguide when asked about current direction and current location," and code (8), "not enough technologies available based on needs."

The fifth user flow covers finding out the current location. When the user presses the current-location button (the bottom button), their current location is read out, addressing the concern about being misguided by other people.

Next is the user flow of "Let's go somewhere" (Go somewhere new). In this flow, the user inputs the destination address using the microphone and begins their journey.

When the user presses “Go to saved addresses”, the saved addresses are displayed and after the user clicks on one, they get to choose the mode of transportation. They are directed to the map and short instructions are provided. After they reach their destination, they can go back to the homepage.
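The saved-address journey above is a linear sequence of screens. A minimal sketch (screen names are mine, not from the project) could model it as a simple state machine, which also makes the order of spoken screens easy to test:

```python
# Hypothetical sketch of the "Go to saved addresses" flow as a
# linear state machine: each screen advances to the next on input.

FLOW = [
    "saved_addresses",   # read out the saved addresses
    "choose_transport",  # pick a mode of transportation
    "map_instructions",  # short spoken turn-by-turn instructions
    "destination",       # arrival announcement
    "homepage",          # return to the homepage
]

def next_screen(current):
    """Return the screen that follows `current`, or None at the end."""
    i = FLOW.index(current)
    return FLOW[i + 1] if i + 1 < len(FLOW) else None

# Walk the whole flow from the first screen.
screen = FLOW[0]
while screen is not None:
    print(screen)
    screen = next_screen(screen)
```

Keeping the flow linear mirrors the design goal: a blind user never has to hold a branching menu structure in memory mid-journey.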


Prototype 2:

In prototype-2, the two buttons are placed in the center and at the bottom, and the audio instructions mention "center" and "bottom" accordingly, whereas in prototype-1 the audio descriptions mention only "top" and "bottom".
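The difference between the two prototypes' spoken descriptions can be sketched with a small helper (a hypothetical illustration; the wording is mine, not the project's actual copy) that includes each button's position word in the announcement:

```python
# Hypothetical sketch: compose the audio description for a button,
# naming its on-screen position as prototype-2 does ("center",
# "bottom") versus prototype-1 ("top", "bottom").

def announce(action, position):
    """Compose the spoken description for one button."""
    return f"Press the {position} button to {action}."

# Prototype-1 style: top/bottom positions.
prototype_1 = [announce("go somewhere", "top"),
               announce("hear your current location", "bottom")]

# Prototype-2 style: the first button sits in the center instead.
prototype_2 = [announce("go somewhere", "center"),
               announce("hear your current location", "bottom")]

for line in prototype_2:
    print(line)
```

Generating the description from the layout keeps the spoken position word and the actual button placement in sync, so a layout change cannot silently contradict the audio.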

Quantitative Results

Prototype 1

Prototype 2

Qualitative Results:

When asked how they felt navigating in a real-life setting with the application, most users found it more convenient. P1 said, "...not that much difficulty is encountered. The three devices are a good combination." P2 and P3 said, "...it is a better experience than only going out with a white cane."

P3 also mentioned that the features, particularly the hearing-tips flow, made them feel more independent. Overall feelings were positive, though one participant expressed concern about whether the application would be as easy to use in a very noisy setting.

Going Forward

Takeaways:

The usability testing, which collected both quantitative and qualitative data, shows that prototype-2 is more effective and easier for the users. On day 1, prototype-2 scored significantly better on button placement. Having the same sound on every button press also made it less confusing for users, and it is something they will get used to. Moreover, the center button was preferred over the top button: the upper area also holds the translation and saved buttons, so reaching for the top makes a mistake more likely. Overall, the features and functionalities were preferred by both groups, and the ability to translate at any time particularly impressed the users. The application was found to be easy to use, and all user groups displayed a positive attitude.

Future Work

  • Future work could include replicating the same study in other regions and with a larger number of people.

  • The UI/UX aspects of inputting information may also be improved.

  • The study can act as a good reference when developing applications for blind people.
