
How Project Guideline gave me the freedom to run solo

[Image: A man runs mid-stride along a yellow guideline painted down the middle of a curving road lined with tall green trees, his back to the camera.]

Editor's Note: At Google Research, we’re interested in exploring how technology can help improve people’s daily lives and experiences. So it’s been an incredible opportunity to work with Thomas Panek, avid runner and President & CEO of Guiding Eyes for the Blind, to apply computer vision for something important in his everyday life: independent exercise. Project Guideline is an early-stage research project that leverages on-device machine learning to allow Thomas to use a phone, headphones and a guideline painted on the ground to run independently. Below, Thomas shares why he collaborated with us on this research project, and what the journey has been like for him.

I’ve always loved to run. Ever since I was a boy, running has made me feel free. But when I was eight years old, I noticed that I couldn’t see the leaves on a tree so well, and the stars in the night sky began to slowly disappear, until one day they were gone for good. By the time I was a young adult, I was diagnosed as legally blind due to a genetic condition. I had to rely on a cane or a canine to guide me. For years, I gave up running.

Then I heard about running with human guides, and I decided to give it a try. It gave me a sense of belonging, holding a tether and following the guide runner in front of me. I even qualified for the New York City and Boston Marathons five years in a row. But as grateful as I was to my human guides, I wanted more independence. So in 2019, I decided to run the first half-marathon assisted only by guide dogs.

But I know it’s not possible for everyone to have a brilliant, fast companion like my guide dog, Blaze. I run an organization called Guiding Eyes for the Blind, and we work tirelessly to help people with vision loss receive running guide dogs that can help them live more active and independent lives. The problem is that there are millions more people with vision loss than there are available guide dogs. So I started asking a question: “Would it be possible to help guide a blind runner independently?”

In the fall of 2019, I asked that question to a group of designers and technologists at a Google hackathon. I wasn’t anticipating much more than an interesting conversation, but by the end of the day they’d built a rough demo that allowed a phone to recognize a line taped to the ground and give me audio cues as I walked with Blaze. We were excited, and hopeful that we could develop it into something more.

We began by sketching out how the prototype would work, settling on a simple concept: I’d wear a phone on a waistband and a pair of bone-conducting headphones. The phone’s camera would look for a physical guideline on the ground and send audio signals depending on my position. If I drifted to the left of the line, the sound would get louder and more dissonant in my left ear. If I drifted to the right, the same thing would happen, but in my right ear. Within a few months, we were ready to test it on an indoor oval track. After a few adjustments, I was able to run eight laps. It was a short distance, and all with my Google teammates close by, but it was the first unguided mile I had run in decades.
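For technically curious readers, here’s one way to picture that left/right mapping in code. It is a minimal, hypothetical sketch: the function name, the dead zone, and the thresholds are all invented for illustration, not Project Guideline’s actual implementation.

```python
# A minimal sketch of the steering-feedback idea described above.
# Convention: offset_m < 0 means the runner has drifted LEFT of the line,
# so the cue plays in the LEFT ear; drifting right mirrors to the right ear.
# All names and numbers here are illustrative guesses.

def steering_feedback(offset_m: float,
                      dead_zone_m: float = 0.1,
                      max_offset_m: float = 1.0) -> dict:
    """Map a signed lateral offset (meters) to per-ear volume and a
    dissonance level, each in [0, 1]."""
    # Stay silent inside a small dead zone, so the audio only appears
    # when a correction is actually needed.
    magnitude = max(abs(offset_m) - dead_zone_m, 0.0)
    level = min(magnitude / (max_offset_m - dead_zone_m), 1.0)
    return {
        "left_volume": level if offset_m < 0 else 0.0,
        "right_volume": level if offset_m > 0 else 0.0,
        "dissonance": level,  # further off course -> harsher tone
    }

if __name__ == "__main__":
    for offset in (-0.6, -0.05, 0.0, 0.3, 1.2):
        print(f"{offset:+.2f} m -> {steering_feedback(offset)}")
```

The dead zone is one plausible design choice: it keeps the headphones quiet while the runner is on course, so any sound at all signals drift.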

Our next step was to see if the tech could work where I love running most: in the peace and serenity of a park. This brought a whole new batch of challenges to work through: variables in weather and lighting conditions and the need for new data to train the model, for starters. After months of building an on-device machine learning model to accurately detect the guideline in different environments, the team was finally ready to test the tech outside for the first time.
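As a companion to the sketch above, here’s a hypothetical view of the per-frame step this implies: an on-device model segments the painted line in each camera frame, and the line’s position near the bottom of the frame yields the lateral offset that drives the audio cues. The mask convention and the helper below are assumptions for illustration, not the project’s code.

```python
# A hypothetical per-frame sketch of turning a guideline detection into a
# steering signal. Assumption: an on-device segmentation model (not shown)
# produces a binary mask of the painted line for each camera frame.

import numpy as np

def line_offset_from_mask(mask: np.ndarray):
    """Estimate the runner's normalized lateral offset in [-1, 1] from a
    binary guideline mask (H x W). Negative means the runner has drifted
    left of the line (the line appears right of frame center). Returns
    None when no line pixels are visible, e.g. to trigger a 'stop' cue."""
    h, w = mask.shape
    near_rows = mask[int(0.8 * h):, :]           # the ground just ahead
    xs = np.flatnonzero(near_rows.any(axis=0))   # columns with line pixels
    if xs.size == 0:
        return None
    line_center = xs.mean()
    return (w / 2 - line_center) / (w / 2)

if __name__ == "__main__":
    # Fake a 480x640 mask with the line painted slightly right of center,
    # i.e. the runner has drifted left and should hear a left-ear cue.
    mask = np.zeros((480, 640), dtype=bool)
    mask[:, 380:400] = True
    print(line_offset_from_mask(mask))  # ~ -0.22
```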

I’d been waiting 25 years to run outdoors, on my own. I stood at the start of the guideline, hopping up and down with excitement. When the team gave me the go-ahead, I began sprinting on my toes, as fast as my legs could carry me, down the hill and around a gentle bend in the road. As I tightened my form, my stride grew longer and more confident with every step. I felt free, like I was effortlessly running through the clouds.

When I arrived at the finish line, I was completely overcome with emotion. My wife, Melissa, and my kids hugged me. My guide dog Blaze licked the salt off of my hand. They were happy for me, too. For the first time in a lifetime, I didn’t feel like a blind man. I felt free.

Today, we’re testing this technology further: I’ll be attempting to run NYRR’s Virtual Run for Thanks 5K along a line temporarily painted in Central Park in New York City. We want to see how the system works in an urban environment, just one of the many challenges to work through before it can be used more widely. I want to thank NYRR, the NYC Department of Parks & Recreation, the Central Park Conservancy, the NYPD, the NYC Department of Sanitation and the NYC Department of Transportation for helping to make today’s 5K run possible.

Collaborating on this project helped me realize a personal dream of mine. I’m so grateful to the Google team, and whoever came up with the idea of a hackathon in the first place. I hope there will be more runs with Project Guideline in my future, and for many other runners as well.

By sharing the story of how this project got started and how the tech works today, we hope to start new conversations with the larger blind and low-vision community about how, and if, this technology might be useful for them, too. As we continue our research, we hope to gather feedback from more organizations and explore painting guidelines in their communities. To learn more, please visit: goo.gle/ProjectGuideline.
