CASE STUDY
June 24, 2021

How Sama’s Accurate AI is Helping Blind Runners Run Independently


The Overview

Project Guideline is an early-stage research project by Google that explores how on-device machine learning can help people with reduced vision walk and run for exercise independently. The team partnered with Sama to help fuel the experimental technology, which allows people who are blind or have low vision to run without a guide, using only a mobile phone, headphones, and a yellow guideline painted on the ground.

It started with a pointed question during a Google hackathon in the fall of 2019. Thomas Panek, an avid runner and CEO of Guiding Eyes for the Blind, asked a group of designers and engineers:

“Would it be possible to help a blind runner navigate, independently?”

Panek, who is blind himself, has completed more than twenty marathons, including five Boston Marathons. While guide dogs and volunteer human guides have run with him over the years, he has always had to follow, even though his legs and lungs could carry him faster.

The idea quickly took off. By the end of the day, the team had a working prototype. Less than a year later, Panek was able to run independently for the first time in more than two decades.

The Challenge

The machine learning algorithm developed by Google requires only a line painted on a pedestrian path. Runners wear a standard Android phone in a harness around the waist, and its camera feeds imagery to the algorithm, which identifies the painted line. The algorithm must detect whether the line is to the runner’s left, right, or center, so that audio signals can guide the runner to stay on track.
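To make that guidance loop concrete, the sketch below shows one way a detected line position could be mapped to a stereo audio cue. This is a minimal illustration, not Google’s actual implementation: the normalized offset input and the louder-ear panning convention are assumptions.

    # Minimal sketch: map a detected line offset to a stereo audio cue.
    # Assumes a hypothetical detector that reports the line's horizontal
    # position as an offset in [-1.0, 1.0]: -1.0 far left, 0.0 centered,
    # 1.0 far right. Illustrative only, not Google's implementation.

    def guidance_cue(line_offset: float) -> dict:
        """Return per-ear volumes that nudge the runner back to center."""
        offset = max(-1.0, min(1.0, line_offset))  # clamp to valid range
        return {
            "left_volume": max(0.0, -offset),   # line drifting left: cue the left ear
            "right_volume": max(0.0, offset),   # line drifting right: cue the right ear
            "on_track": abs(offset) < 0.1,      # near center: no correction needed
        }

    print(guidance_cue(-0.6))   # line well to the left: strong left-ear cue
    print(guidance_cue(0.05))   # roughly centered: effectively on track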

For humans, it’s fairly straightforward to recognize a line and follow it. For a machine learning model, it isn’t that easy. Imagine the running motion: with every stride, your weight shifts from left to right, introducing camera shake that can blur the guideline. Move outdoors to Panek’s preferred running locations and you introduce even more variables: the model must handle a wide range of weather and lighting conditions, as well as objects like fallen leaves covering the guideline.
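One common way to prepare a model for this kind of variation is to augment training frames with synthetic shake and lighting shifts. The sketch below uses Pillow; the blur and brightness ranges and the file path are illustrative assumptions, not details from the project.

    # Minimal sketch: simulate camera shake and lighting variation on a
    # training frame. Parameter ranges and the file path are illustrative.
    import random
    from PIL import Image, ImageEnhance, ImageFilter

    def augment(frame: Image.Image) -> Image.Image:
        """Apply a random blur and brightness shift to one frame."""
        # Gaussian blur stands in for the shake introduced by running.
        frame = frame.filter(ImageFilter.GaussianBlur(radius=random.uniform(0.0, 2.0)))
        # A brightness factor stands in for overcast vs. bright sunlight.
        return ImageEnhance.Brightness(frame).enhance(random.uniform(0.6, 1.4))

    frame = Image.open("guideline_frame.jpg")  # hypothetical path
    augmented = augment(frame)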

“Sama was a force multiplier for us and a key success factor for our project. They delivered high-quality annotated data on time, listened to our feedback, and were very flexible in accommodating our requests.”

Xuan Yang, Computer Vision Researcher at Google

The Solution

Sama’s expert annotators draw precise polygons around the single solid yellow center line in each image. To do this, the Sama team underwent an intensive training process and continues to meet with Google’s engineers each week to review quality, discuss edge cases, and receive new instructions. With enough examples to learn from, the algorithm learns to distinguish the pixels of the yellow line from everything else.
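For a sense of how those polygons feed training, here is a minimal sketch of rasterizing a polygon annotation into the binary pixel mask a segmentation model typically learns from. The image size and coordinates are made up for illustration; this is not Sama’s annotation tooling.

    # Minimal sketch: rasterize a polygon annotation into a binary mask.
    # Pixels inside the polygon (the yellow line) become 1, the rest 0.
    # Coordinates and image size are made up for illustration.
    from PIL import Image, ImageDraw

    def polygon_to_mask(size: tuple, polygon: list) -> Image.Image:
        """Turn one annotated polygon into a 1-bit segmentation mask."""
        mask = Image.new("1", size, 0)  # 1-bit image, all background
        ImageDraw.Draw(mask).polygon(polygon, outline=1, fill=1)
        return mask

    # A thin quadrilateral roughly tracing a painted line receding ahead.
    mask = polygon_to_mask((640, 480), [(300, 479), (340, 479), (330, 0), (310, 0)])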

But quality AI starts with quality data, and quality data has to be diverse. The Project Guideline data needed to encompass every imaginable scenario a runner might encounter. In Panek’s case, we quickly noticed that we had to account for the runner’s hand blocking the guideline. Our annotators solved this by inferring the position of the line behind the hand, a great example of something that is easy for a human but difficult for a computer. By continuously adding these variations to the dataset, the model gets smarter over time.

Conclusion

Fueling the cutting-edge technology that helps a blind man run without assistance truly is the stuff that dreams are made of. Sama’s annotators, the experts giving artificial intelligence its intelligence, love being a part of Project Guideline. Bridget Nattabi, who has worked on this project since its kickoff in July 2020, shares her thoughts:

“Working on this project has allowed me to grow and master polygon annotation with high efficiency and accuracy. I also feel honored to be part of a team that is creating a life-changing navigation experience for the blind. It’s heartwarming to consider that what I do gives people a chance to navigate the world without a guide just like any sighted individual would.”

