
How computers can use radar to understand nonverbal cues



Computers are disappearing into the things we use every day: thermostats, doorbells and speakers all have advanced sensing capabilities and run machine learning algorithms. However, they are far from feeling “invisible” — they often interrupt us and feel overwhelming.

Today, the connected products in our homes can help us with activities like cooking, preparing for the day or dining with friends. But these devices require explicit commands to initiate every interaction. Using a voice assistant requires saying a “wake word” — e.g. “OK Google” — to begin a conversation, and activating a smart display often requires stopping what you are doing and double-tapping a screen. The repetition of these commands is tiresome and hinders the natural flow of everyday life.

One of the goals of our design team within Google ATAP is to solve this and help build devices that understand us as intuitively as, say, a person holding a door for us when our hands are full, without the need for verbal commands. That means figuring out how to move away from traditional interactions like the mouse, keyboard and touch gestures, and developing technologies that can respond to us as naturally as another person would.

Helping computers develop “social intelligence” using radar

Over the past six years, our team has used a radar-based sensing platform called Soli to create these new interaction techniques. The sensor uses radio waves to detect presence, body language and gestures within its sensing area, and it is already embedded in several Google products, including the Pixel 4 phone, Nest displays and the Nest Thermostat. Because the radar sensor is not a camera and cannot identify individual people, it can provide a privacy-preserving solution for intelligent home environments.

Using human communication as a starting point for design, we looked at how people maintain a personal space around themselves, which they use to make decisions and take cues when interacting with others. We imagined that devices have their own personal space, too, and looked at ways for the space between people and computers to serve a similar purpose. We call these spaces Fields.

Early prototypes using radar technology that explore spatial relationships between people and devices.

Through our investigations, we’ve found that the amount of overlap between these fields is a good indicator of the level of interest between a user and a device. Fields take what is intuitive between people, such as understanding spatial relationships like proximity, orientation and pathways, and use it to teach devices a kind of “social intelligence,” allowing technology to participate in our daily life in a more harmonious and considerate way.
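To make this concrete, here is a minimal sketch of how an “interest” score could be derived from field overlap. It models each field as a circle and weights the overlap by body orientation; the function names, field radii and weighting below are illustrative assumptions, not part of Soli or any Google API.

```python
import math

# Illustrative sketch of the "Fields" idea: model the person's and the
# device's fields as circles and treat their overlap, weighted by body
# orientation, as an interest signal. Radii and names are assumptions.

def circle_overlap_area(d: float, r1: float, r2: float) -> float:
    """Area of intersection of two circles with radii r1, r2 whose
    centers are distance d apart."""
    if d >= r1 + r2:           # the fields do not touch
        return 0.0
    if d <= abs(r1 - r2):      # one field lies entirely inside the other
        return math.pi * min(r1, r2) ** 2
    # Standard circle-circle intersection (lens) area.
    a1 = r1 ** 2 * math.acos((d ** 2 + r1 ** 2 - r2 ** 2) / (2 * d * r1))
    a2 = r2 ** 2 * math.acos((d ** 2 + r2 ** 2 - r1 ** 2) / (2 * d * r2))
    a3 = 0.5 * math.sqrt((-d + r1 + r2) * (d + r1 - r2)
                         * (d - r1 + r2) * (d + r1 + r2))
    return a1 + a2 - a3

def interest_score(distance_m: float, facing_deg: float,
                   person_field_m: float = 1.2,
                   device_field_m: float = 1.0) -> float:
    """Combine field overlap with body orientation into a 0..1 score.

    distance_m: distance between the person and the device.
    facing_deg: angle between the person's facing direction and the
        device (0 = looking straight at it, 180 = turned away).
    """
    overlap = circle_overlap_area(distance_m, person_field_m, device_field_m)
    max_overlap = math.pi * min(person_field_m, device_field_m) ** 2
    proximity = overlap / max_overlap
    # Interest falls off as the person turns away from the device.
    orientation = max(0.0, math.cos(math.radians(facing_deg)))
    return proximity * orientation

# A person one meter away and facing the device scores high:
print(round(interest_score(1.0, 0.0), 2))   # ~0.53
# The same person turned 90 degrees away scores zero:
print(round(interest_score(1.0, 90.0), 2))  # 0.0
```

The two example calls show why overlap alone is not enough: the same proximity yields high or zero interest depending on where the person is facing, which matches the intuition about orientation described above.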

This idea of fields around people and computers makes interactions feel more conversational. For example, imagine you are cooking in the kitchen by following a recipe. When you turn away to grab an ingredient, the device can be considerate in that moment and pause, resuming only when you turn back toward it.

Speculative concept using orientation and turning to play or pause a recipe.
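As an illustration of the play/pause concept above, a small state machine could drive playback from a stream of body-orientation estimates. Everything here is hypothetical, including the `RecipePlayer` class, the facing threshold and the dwell time; a real system would also need to smooth noisy radar readings before acting on them.

```python
# Illustrative sketch: pause or resume a recipe video based on whether
# the cook is turned toward the screen. The orientation stream, the
# thresholds and the media interface are hypothetical, not a Soli API.

FACING_THRESHOLD_DEG = 45.0  # "turned toward the screen" if within this angle
DWELL_SECONDS = 1.0          # require the new state to hold before reacting

class RecipePlayer:
    def __init__(self, media):
        self.media = media
        self.playing = True
        self._changed_at = None  # when the desired state last flipped

    def on_orientation(self, facing_deg: float, now: float) -> None:
        wants_playing = facing_deg <= FACING_THRESHOLD_DEG
        if wants_playing == self.playing:
            self._changed_at = None  # state agrees; reset the dwell timer
        elif self._changed_at is None:
            self._changed_at = now   # start dwelling on the candidate state
        elif now - self._changed_at >= DWELL_SECONDS:
            self.playing = wants_playing
            self._changed_at = None
            (self.media.play if wants_playing else self.media.pause)()

class FakeMedia:
    def play(self):  print("resume recipe")
    def pause(self): print("pause recipe")

player = RecipePlayer(FakeMedia())
# The cook turns away at t=0 and back at t=3; each change takes effect
# only after it has persisted for DWELL_SECONDS, avoiding flicker when
# the cook glances away for a moment.
for t, angle in [(0.0, 120), (0.5, 130), (1.2, 125), (3.0, 10), (4.5, 5)]:
    player.on_orientation(angle, now=t)
```

The dwell timer is the key design choice: reacting instantly to every orientation change would make the screen pause and resume erratically, while a short confirmation window keeps the behavior feeling considerate rather than jumpy.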

While the “social intelligence” to pause until you return is a first step, we believe that understanding nonverbal cues has the potential to become a new standard for “ambient computing,” a vision of computing in which computers weave gracefully into our everyday lives. Just as the mouse and keyboard became standard inputs for desktop computing, we believe a new set of interaction patterns and standards that take implicit behaviors into account will emerge as we move computing into the physical world. It will change everything at home and at work, and could create more natural interactions by drawing on social behaviors that predate technology itself. As Xerox PARC outlined in its influential vision of ubiquitous computing, rather than sitting in front of your desktop computer giving it your full attention, computers can work alongside you in the physical world, invisible but always helpful.

This means that even in a world with more devices around us, those devices can be more respectful and graceful because they understand our intent. Read our full research paper on the project, or watch the first episode of “In the Lab,” a docu-series that highlights the people, process and thinking behind this work and other projects at ATAP.

A YouTube video explaining nonverbal interactions with Soli radar.
