
Project Running

An early-stage prototype that lets people with low vision run behind a sighted guide without a tether. We built it using OpenCV.

In collaboration with Ravi Morbia & Jason Chu under the supervision of Dr. Patrick Carrington

The Client

Catherine Getchell

My Roles

UX Designer
Lead Developer

Time Frame

4 Months:

Mid-January to Mid-May, 2018

The Design Process


The Problem Space

Currently, runners with low vision need to be closely tethered by the hand to a sighted guide, which means the two must run in sync. The challenge for us was to create a solution that allows low-vision athletes to run untethered from their guide.

DoD photo by EJ Hersom


Working Within Limitations

The trials ruled out the use of auditory feedback. Even bone-conduction headphones (which people with low vision often use for GPS navigation) could not provide the range of hearing needed for an athletic activity such as running. A haptic feedback system was chosen instead. The client also indicated that she would prefer a belt to the other haptic options, such as a chest-piece or armbands.

Testing Feedback Methods

While the marker system was being developed, we conducted user testing with our client. We tested information feedback through a "Wizard-of-Oz" method: to mimic auditory signals, a third person ran beside the user and told them when, and in which direction, the marker on the guide's back moved. We also varied the level of detail provided across the trial runs.

"Wizard-of-Oz" test

Finding a Signal

Research began by choosing a means of detecting the guide's position relative to the runner. While there are a number of methods available (radio, audio detection, Bluetooth, GPS), almost all of them lacked the combination of precision and reliability we needed. Computer vision was the only approach that could adequately provide both, so we decided to adapt an existing marker-reading system built with OpenCV for our use.

Ideation

The team began by sketching out alternatives to the tether system. Ideas ranged from direct communication using buttons to conveying directional information through audio or haptic feedback. We also considered a wide variety of wearable solutions for housing the system, such as haptic wristbands, chest-pieces, and headsets.

ideation sketches

The Prototype


Tracking the Marker

The initial build of the marker detection system was developed in Objective-C for the iPhone. This proved inadequate for the rapid turnaround our prototyping required, however. Instead, we settled on building our model with OpenCV in Processing (in Java) and running the marker detection on an Android phone.
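The prototype code isn't reproduced here, but a minimal sketch of the idea looks something like the following. It uses the plain OpenCV Java bindings rather than the Processing wrapper we actually built on, and the camera index and colour thresholds are placeholder assumptions rather than our tuned values.

import org.opencv.core.*;
import org.opencv.imgproc.Imgproc;
import org.opencv.videoio.VideoCapture;
import java.util.ArrayList;
import java.util.List;

// Minimal sketch: find a brightly coloured marker in each camera frame and
// report its horizontal position as a fraction of the frame width.
public class MarkerTracker {

    static { System.loadLibrary(Core.NATIVE_LIBRARY_NAME); }

    public static void main(String[] args) {
        VideoCapture camera = new VideoCapture(0); // camera index is an assumption
        Mat frame = new Mat(), hsv = new Mat(), mask = new Mat();

        while (camera.read(frame)) {
            // Threshold in HSV space; these bounds are placeholders for the marker colour.
            Imgproc.cvtColor(frame, hsv, Imgproc.COLOR_BGR2HSV);
            Core.inRange(hsv, new Scalar(40, 100, 100), new Scalar(80, 255, 255), mask);

            // Treat the largest remaining blob as the marker.
            List<MatOfPoint> contours = new ArrayList<>();
            Imgproc.findContours(mask, contours, new Mat(),
                    Imgproc.RETR_EXTERNAL, Imgproc.CHAIN_APPROX_SIMPLE);

            Rect best = null;
            for (MatOfPoint c : contours) {
                Rect r = Imgproc.boundingRect(c);
                if (best == null || r.area() > best.area()) best = r;
            }

            if (best != null) {
                double centreX = best.x + best.width / 2.0;
                double position = centreX / frame.cols(); // 0.0 = far left, 1.0 = far right
                System.out.printf("marker at %.2f%n", position);
            }
        }
        camera.release();
    }
}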

The Haptic Belt

The haptic belt worked on a five-point, left-to-right directional model. The two motors on each side indicated left and right turns at two different angles, and the central motor indicated that the marker was straight ahead of the user.
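A rough sketch of that mapping in Java (the zone boundaries are illustrative assumptions, not the values we settled on during testing):

// Sketch of the five-zone mapping from the marker's horizontal position
// (0.0 = far left of the frame, 1.0 = far right) to one of the five motors.
public class BeltMapping {

    // Returns 0-4: hard left, slight left, centre, slight right, hard right.
    public static int motorFor(double position) {
        if (position < 0.2) return 0; // hard left
        if (position < 0.4) return 1; // slight left
        if (position < 0.6) return 2; // centre: the guide is straight ahead
        if (position < 0.8) return 3; // slight right
        return 4;                     // hard right
    }

    public static void main(String[] args) {
        System.out.println(motorFor(0.52)); // prints 2, the centre motor
    }
}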


Holding the Phone

To hold the phone itself, we built a simple harness from Velcro straps. It allowed the user maximum freedom of movement while keeping the phone steady and well-positioned enough to follow the marker.

The Final Tests

Our finished system successfully detected a marker and transmitted that information through Processing to a series of five lights on an Arduino, representing the motors. In a later demonstration, the marker detection system also drove the five haptic motors as intended.
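The hand-off from the detection code to the Arduino amounted to sending one value per update over a serial link. A minimal Processing fragment along those lines might look like this (the port index and baud rate are assumptions for illustration):

import processing.serial.*;

Serial arduino;

void setup() {
  // Open the first serial port found; adjust the index for your machine.
  arduino = new Serial(this, Serial.list()[0], 9600);
}

void draw() {
  // In the real sketch the zone comes from the marker tracker each frame;
  // here the centre zone is sent as a placeholder.
  sendZone(2);
}

// Sends one byte per update; the Arduino sketch on the other end lights the
// matching LED or drives the matching motor.
void sendZone(int motorIndex) {
  arduino.write(motorIndex);
}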

That the system could detect a moving marker and successfully translate that information into haptic feedback is highly encouraging. A relatively low-cost, untethered alternative to tethered running is achievable.

Future Avenues

If Project Running is carried forward, future iterations should expand into other use cases. Our client herself prefers kayaking.


Further refinement of the belt's physical design is, of course, still required, and a more robust software base than OpenCV in Processing would also be necessary moving forward. A system built for iPhone in Swift could be a next step.
