UX Engineer and Design Technologist


UX design

 

Roshni

Hindi for "light"

The Roshni app allows a visually impaired user to navigate indoor environments through voice interaction. Roshni Bluetooth hardware devices installed around the user's location pair with the app, which then guides the user to their destination with sensor-driven voice navigation. Modern navigation solutions offer little support for the visually impaired indoors, because GPS accuracy is unreliable inside buildings. The Roshni app solves this pain point by giving the user clear, trustworthy guided navigation.

 

Interface design

The interface of the app was designed with the user's visual impairment in mind: navigation could not depend on the user's vision. Each swipe gesture takes the user to a new feature of the app. The user can swipe anywhere on the full screen, left, right, or up, to move between views, and the app always provides audio feedback to confirm which feature or view is currently active.
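To make the gesture-and-feedback loop concrete, below is a minimal sketch of how it could be wired up in UIKit. The feature names and the announce helper are placeholders for illustration, not the shipped implementation.

```swift
import UIKit
import AVFoundation

// Illustrative sketch: full-screen swipe gestures switch views,
// and spoken feedback confirms where the user has landed.
class RoshniViewController: UIViewController {
    // Hypothetical feature views; the names are placeholders.
    private let features = ["Destinations", "Summons", "Explore"]
    private var currentIndex = 0
    private let speech = AVSpeechSynthesizer()

    override func viewDidLoad() {
        super.viewDidLoad()
        for direction: UISwipeGestureRecognizer.Direction in [.left, .right, .up] {
            let swipe = UISwipeGestureRecognizer(target: self, action: #selector(handleSwipe(_:)))
            swipe.direction = direction
            view.addGestureRecognizer(swipe)   // the whole screen is the gesture surface
        }
    }

    @objc private func handleSwipe(_ gesture: UISwipeGestureRecognizer) {
        switch gesture.direction {
        case .left:  currentIndex = (currentIndex + 1) % features.count
        case .right: currentIndex = (currentIndex + features.count - 1) % features.count
        case .up:    break   // e.g. return to the home view in the real flow
        default:     break
        }
        announce("You are on \(features[currentIndex])")
    }

    // Spoken confirmation so navigation never depends on sight.
    private func announce(_ text: String) {
        speech.speak(AVSpeechUtterance(string: text))
    }
}
```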

 

Voice interface design

The app lets the user switch quickly from the touch interface to an audio interface using the voice-activation software built into the app. This allows the user to give voice commands to the app and also to ask about the space around them. If a specific landmark is saved in the system, the user can inquire about it through the conversational interface.
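As an illustration of how a transcript from the platform speech recognizer might be routed once it reaches the app, here is a small sketch; the command phrases and the Landmark type are assumptions, not the actual voice grammar.

```swift
import Foundation

// Hypothetical landmark record saved in the system.
struct Landmark {
    let name: String
    let description: String
}

// Possible outcomes of a spoken request.
enum VoiceCommand {
    case navigate(to: String)
    case describeLandmark(String)
    case unknown
}

// Illustrative command router: takes a transcript produced by the
// platform speech recognizer and decides how the app should respond.
func parse(_ transcript: String, knownLandmarks: [Landmark]) -> VoiceCommand {
    let text = transcript.lowercased()
    if text.hasPrefix("take me to ") {
        return .navigate(to: String(text.dropFirst("take me to ".count)))
    }
    // "What is near the cafeteria?" style queries about saved landmarks.
    if let landmark = knownLandmarks.first(where: { text.contains($0.name.lowercased()) }) {
        return .describeLandmark(landmark.description)
    }
    return .unknown
}
```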

 

Wireframes

 
 

Interaction Language

The interaction design for the app had to work for people who are partially or fully visually impaired. To develop the interaction system, I started from the ergonomics of holding a phone in one hand: with the phone in the hand, there are five key interactions a user can perform without looking at the screen. These became my basic interaction elements.

1. Swipe left

2. Swipe right

3. Tap on the top half of the screen

4. Tap on the bottom half of the screen

5. Tap on the bottom edge of the screen

In all five interactions, the position and action of the thumb or finger are distinct enough for the user to recognize them cognitively without looking at the screen.
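To show how the tap regions could be told apart in code, here is a small sketch of a tap classifier; the bottom-edge threshold and the names are illustrative assumptions.

```swift
import UIKit

// The five eyes-free interactions described above.
enum RoshniGesture {
    case swipeLeft, swipeRight, tapTopHalf, tapBottomHalf, tapBottomEdge
}

// Illustrative classifier for taps: which of the three tap zones was hit.
// The 90-point "bottom edge" band is an assumed threshold, not a spec.
func classifyTap(at point: CGPoint, in bounds: CGRect) -> RoshniGesture {
    if point.y > bounds.height - 90 {
        return .tapBottomEdge
    }
    return point.y < bounds.height / 2 ? .tapTopHalf : .tapBottomHalf
}
```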


User Flow


Destinations

This feature allows the user to navigate to a specific destination. It engages the hardware installed in the infrastructure to triangulate the user's position and guide them to the destination.

Summons

Sometimes it is difficult for visually impaired users to navigate through a space on their own. The app allows the user to summon their caretakers to come to them, and it shares the summoned person's location with the user.

Explore

This feature allows the user to explore the space around them. The app communicates with the Bluetooth beacons around the building, which update the app with the landmarks closest to the user.
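As a sketch of the Explore behaviour, the snippet below picks the landmark attached to the nearest beacon currently in range; the RangedBeacon type and the beacon-to-landmark mapping are assumed for illustration.

```swift
import Foundation

// Assumed shape of a ranged beacon: its identifier and an estimated
// distance derived from signal strength by the ranging layer.
struct RangedBeacon {
    let id: String
    let estimatedDistance: Double   // metres
}

// Landmarks registered against beacon identifiers when the hardware is installed.
let landmarkNames: [String: String] = [
    "beacon-lobby": "Main lobby",
    "beacon-lift":  "Elevator bank",
    "beacon-cafe":  "Cafeteria entrance",
]

// Explore: return the landmark attached to the nearest beacon in range,
// which the app can then announce to the user.
func nearestLandmark(among beacons: [RangedBeacon]) -> String? {
    beacons
        .min(by: { $0.estimatedDistance < $1.estimatedDistance })
        .flatMap { landmarkNames[$0.id] }
}
```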

 

Navigation system - Bluetooth Beacons

The system consists of wall-mounted sensors that locate the wearable worn by the user inside the building. The triangulated position is communicated to the phone, which overlays it on an indoor map. The phone announces the user's position at every landmark, and the user can request feedback on their position by talking to the app. The app works with both the touch interface and the voice interface.
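Below is a minimal sketch of the positioning step, assuming each beacon reports its map position and an estimated distance to the wearable; it uses plain two-dimensional trilateration, which is one way such a triangulated position could be computed.

```swift
import Foundation

struct Point { let x: Double; let y: Double }

// Assumed input: a beacon's position on the indoor map plus an
// estimated distance to the wearable (e.g. derived from signal strength).
struct BeaconReading {
    let position: Point
    let distance: Double
}

// Plain 2-D trilateration from three beacon readings.
// Subtracting the circle equations pairwise gives a 2x2 linear system,
// solved here with Cramer's rule. Returns nil when the beacons are
// (nearly) collinear and the system is degenerate.
func trilaterate(_ a: BeaconReading, _ b: BeaconReading, _ c: BeaconReading) -> Point? {
    let ax = a.position.x, ay = a.position.y
    let bx = b.position.x, by = b.position.y
    let cx = c.position.x, cy = c.position.y
    let da2 = a.distance * a.distance
    let db2 = b.distance * b.distance
    let dc2 = c.distance * c.distance

    let A = 2 * (bx - ax), B = 2 * (by - ay)
    let C = da2 - db2 - ax * ax + bx * bx - ay * ay + by * by
    let D = 2 * (cx - ax), E = 2 * (cy - ay)
    let F = da2 - dc2 - ax * ax + cx * cx - ay * ay + cy * cy

    let det = A * E - B * D
    guard abs(det) > 1e-9 else { return nil }
    return Point(x: (C * E - B * F) / det, y: (A * F - C * D) / det)
}
```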

 

User Testing and Analysis

 
