
Hackathon-ASSETS'17: Self-tracking and Accessibility

For the hackathon organized at ASSETS'17, we explored ideas for making the world more accessible. We discussed the various tools and apps available to blind people and the problems they face in places that lack accessibility.

The theme of this hackathon was self-tracking, and we brainstormed many ideas with two legally blind team members before deciding on a problem to work on.

Problem
Self-tracking takes several forms, such as health monitoring, diet control, and daily reminders. One self-tracking task is monitoring one's physical parameters, such as heart rate, during physical activities like running on a treadmill. On a typical treadmill, the interface must be observed visually to be useful: a runner can see parameters including running speed, calories burned, duration, miles covered, and heart rate. For a person who does not have access to this visual interface (e.g., a blind user), real-time tracking of these parameters is a challenge. They would have to ask people around them to read the parameters aloud, which is cumbersome for the user and interruptive to their workout routine.

Proposal
To tackle this problem, we propose a non-visual interaction modality that makes real-time tracking of these parameters possible and more universally accessible. Haptic interfaces have proven to be an efficient output modality, supporting functionality such as vibration alerts and feedback on data input. Haptic feedback is usually delivered through the phone itself, but it can also come from a wearable device, such as a wristband, armband, or headband, that communicates with the phone wirelessly. In this proposal, we wanted to examine whether the neck area can host a wearable haptic interface. Compared to audio or visual modalities, haptic feedback is limited in the number and complexity of patterns that a user can reliably distinguish. We therefore want to understand how to design a minimal set of commands distinctive enough for the user to perform the real-time tracking activities mentioned above. In particular, we would like to explore neck-based haptic interaction for accessible self-tracking.

Other uses of this interface
Haptic neck interfaces can be used in many applications with minimal input/output interactions, helping users communicate with their smartphones for certain tasks. Examples include receiving turn-by-turn GPS navigation instructions; wayfinding apps, where blind users would experience less distraction than with audio/speech feedback; connecting to otherwise inaccessible exercise equipment; and other kinds of data tracking, reminders, and simplified feedback mechanisms.

Interface Design
List of items needed to build this interface:
1. Input: two buttons, one on each side of the neck, each able to receive a single tap or a double tap, pressed one at a time or both at once.
2. Language: the interface can be customized for each workout-tracking scenario. The user can assign parameters of their choice to be communicated to them through the haptic interface (see the sketch after this list). Examples are as follows:
   a. Input language
      i. Single press on the right
      ii. Single press on the left
      iii. Single press on both buttons at once
      iv. Double press
   b. Output language
      i. Vibration on the right
      ii. Vibration on the left
      iii. Vibration on the back
3. Output: three actuators/buzzers to provide haptic feedback, one each on the left, right, and back of the neck.
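
As a concrete illustration, here is a minimal Arduino-style sketch of how such an input/output language could be wired up. The pin numbers, timing values, and the gesture-to-response mapping are all assumptions made for illustration; the hackathon prototype did not use this exact code.

```cpp
// Illustrative sketch only: pins and the gesture-to-response mapping
// below are assumptions, not the hackathon's actual wiring.

const int BTN_RIGHT = 2;   // input button, right side of the neck
const int BTN_LEFT  = 3;   // input button, left side of the neck
const int VIB_RIGHT = 9;   // actuator, right
const int VIB_LEFT  = 10;  // actuator, left
const int VIB_BACK  = 11;  // actuator, back

const unsigned long DOUBLE_TAP_MS = 400;  // max gap between taps of a double press

void setup() {
  pinMode(BTN_RIGHT, INPUT_PULLUP);  // buttons read LOW when pressed
  pinMode(BTN_LEFT, INPUT_PULLUP);
  pinMode(VIB_RIGHT, OUTPUT);
  pinMode(VIB_LEFT, OUTPUT);
  pinMode(VIB_BACK, OUTPUT);
}

// Pulse one actuator `count` times; counts and positions together
// form the output vocabulary.
void buzz(int pin, int count) {
  for (int i = 0; i < count; i++) {
    digitalWrite(pin, HIGH);
    delay(200);
    digitalWrite(pin, LOW);
    delay(150);
  }
}

// Block until the button is released, then report whether a second
// tap arrives within the double-press window.
bool doubleTapped(int pin) {
  while (digitalRead(pin) == LOW) {}        // wait for release
  unsigned long start = millis();
  while (millis() - start < DOUBLE_TAP_MS) {
    if (digitalRead(pin) == LOW) {
      while (digitalRead(pin) == LOW) {}    // wait out the second tap
      return true;
    }
  }
  return false;
}

void loop() {
  bool right = digitalRead(BTN_RIGHT) == LOW;
  bool left  = digitalRead(BTN_LEFT)  == LOW;

  if (right && left) {
    buzz(VIB_BACK, 1);           // both at once -> one response pattern
  } else if (right) {
    delay(50);                   // crude debounce
    if (doubleTapped(BTN_RIGHT)) buzz(VIB_RIGHT, 2);  // double press
    else buzz(VIB_RIGHT, 1);                          // single press, right
  } else if (left) {
    delay(50);
    if (doubleTapped(BTN_LEFT)) buzz(VIB_LEFT, 2);
    else buzz(VIB_LEFT, 1);
  }
}
```

In a real device, the gesture handling would be non-blocking and each response would report live data from the treadmill or phone rather than a fixed pattern.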

Methodology
To evaluate our proposal, we constructed two low-fidelity prototypes, one to demonstrate the inputs and one to demonstrate the outputs. For the input prototype, we built the form factor of a neck headset out of pipe cleaners and used littleBits to simulate the button input and sound output. For each of the three haptic points, we used a littleBits button, a threshold module, and a buzzer, placing one set on each side of the neck and one at the back. The buzzer stands in for the haptic actuator and is activated by pressing the button.

For the second prototype, we used an Arduino and three littleBits speakers to simulate tactile feedback, playing low-frequency tones through each speaker for a set duration. To make this system work, we used the interface programming kit connected over USB. One of our team members built a frame from a pizza box to hold the speakers around the neck. We then tested the system with our two legally blind teammates and gathered their feedback on how well they could perceive the tactile feedback generated by the speakers around their necks.
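
For reference, the behavior of this second prototype can be approximated with a few lines of Arduino code. The sketch below is a reconstruction under stated assumptions: the pin numbers and the specific frequency and duration values are placeholders, and each speaker is assumed to be driven from its own digital pin.

```cpp
// Reconstruction of the output prototype's behavior; pins, frequency,
// and timing are placeholder assumptions.

const int SPK_LEFT  = 9;
const int SPK_RIGHT = 10;
const int SPK_BACK  = 11;

const unsigned int  HAPTIC_HZ = 80;   // low frequency: felt more than heard
const unsigned long PULSE_MS  = 500;  // duration of one haptic pulse

void setup() {}  // tone() configures each pin when first used

// Play a low-frequency tone on one speaker so the cone movement is
// felt against the neck as a vibration.
void pulse(int pin) {
  tone(pin, HAPTIC_HZ, PULSE_MS);  // Arduino's tone() drives one pin at a time
  delay(PULSE_MS + 100);           // wait for the pulse plus a short gap
}

void loop() {
  // Cycle left -> right -> back so a wearer can judge how
  // distinguishable the three positions feel.
  pulse(SPK_LEFT);
  pulse(SPK_RIGHT);
  pulse(SPK_BACK);
  delay(2000);
}
```

Sequential playback works within tone()'s one-pin-at-a-time limitation and is sufficient for comparing how well the three positions can be told apart.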
