Doing more to protect your privacy with the Assistant



We believe you should be able to easily understand how your data is used and why, so you can make choices that are right for you. Recently we’ve heard concerns about our process in which language experts can listen to and transcribe audio data from the Google Assistant to help improve speech technology for different languages. It's clear that we fell short of our high standards in making it easy for you to understand how your data is used, and we apologize.

When we learned about these concerns, we immediately paused this process of human transcription globally to investigate, and conducted a full review of our systems and controls. Now we want to share more about how audio recordings work, and some changes we’re making: 

Your audio data isn't stored by default on Google servers

By default, your audio recordings are not saved on Google servers. You can still use the Assistant to help you throughout the day, and have access to helpful features like Voice Match.

To store your audio data in your Google Account, you can opt in to the Voice & Audio Activity (VAA) setting when you set up your Assistant. Opting in to VAA helps improve the Assistant for everyone by allowing us to use samples of audio to understand more languages and accents. You can view your past interactions with the Assistant, and delete any of these interactions at any time.
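To make the opt-in model concrete, here’s a minimal sketch in Python of how a per-account setting like VAA could gate whether audio is retained. All names here (UserSettings, Account, handle_audio) are hypothetical illustrations, not Google’s actual code:

```python
# A minimal sketch of the opt-in storage model, assuming a simple
# per-user flag. All names are hypothetical; this is not Google's
# implementation.
from dataclasses import dataclass, field

@dataclass
class UserSettings:
    vaa_enabled: bool = False  # Voice & Audio Activity is off by default

@dataclass
class Account:
    settings: UserSettings
    stored_audio: list = field(default_factory=list)

def handle_audio(account: Account, snippet: bytes) -> None:
    """Retain a snippet in the account only if the user opted in to VAA."""
    if account.settings.vaa_enabled:
        account.stored_audio.append(snippet)  # user can later view or delete this
    # Without VAA, the snippet is used only to answer the query and not stored.
```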

Updating our audio setting 

We’re updating our settings to highlight that when you turn on VAA, human reviewers may listen to your audio snippets to help improve speech technology. If you’re an existing Assistant user, you’ll have the option to review your VAA setting and confirm your preference before any human review process resumes. We won’t include your audio in the human review process unless you’ve re-confirmed your VAA setting as on. 
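Conceptually, the re-confirmation step works like an eligibility gate: audio can enter the human review pool only if VAA is on and was confirmed after the updated disclosure. The sketch below is a hypothetical illustration; the function name and placeholder date are assumptions:

```python
# Hypothetical sketch of the re-confirmation gate: audio is eligible
# for human review only if the user confirmed their VAA setting after
# the updated disclosure went live. The date is a placeholder, not an
# announced date.
from datetime import datetime
from typing import Optional

DISCLOSURE_UPDATED_AT = datetime(2019, 9, 23)  # placeholder assumption

def eligible_for_human_review(
    vaa_enabled: bool, vaa_confirmed_at: Optional[datetime]
) -> bool:
    """True only when VAA is on and was re-confirmed after the update."""
    return (
        vaa_enabled
        and vaa_confirmed_at is not None
        and vaa_confirmed_at >= DISCLOSURE_UPDATED_AT
    )
```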

More privacy protections for our transcription process

We take a number of precautions to protect data during the human review process: audio snippets are never associated with any user accounts, and language experts listen to only a small set of queries (around 0.2 percent of all user audio snippets), drawn exclusively from users with VAA turned on. Going forward, we’re adding greater security protections to this process, including an extra layer of privacy filters.
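These constraints can be pictured as a simple sampling filter, sketched below. The data structure and field names are assumptions made for illustration; only the opt-in check, the roughly 0.2 percent rate, and the removal of account identifiers come from the description above:

```python
# Illustrative sketch of the sampling constraints described above:
# only VAA-on users, roughly 0.2 percent of snippets, and no account
# identifiers attached to what reviewers see. Structure is assumed.
import random

SAMPLE_RATE = 0.002  # around 0.2 percent of all user audio snippets

def sample_for_review(snippets):
    """Yield de-identified snippets eligible for language-expert review."""
    for s in snippets:
        if not s["vaa_enabled"]:
            continue  # never sample audio from users who haven't opted in
        if random.random() < SAMPLE_RATE:
            yield {"audio": s["audio"]}  # account ID and metadata dropped
```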

Automatically deleting more audio data

The Assistant already immediately deletes any audio data when it detects that it was activated unintentionally, e.g., by a noise that sounds like “Hey Google.” We understand it’s important to get this right, and we will continue to focus on this area, including implementing additional measures to help us better identify unintentional activations and exclude them from the human review process. Soon we’ll also add a way to adjust how sensitive your Google Assistant devices are to prompts like “Hey Google,” giving you more control to reduce unintentional activations or, if you prefer, to make it easier to get help in especially noisy environments.
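A sensitivity control of this kind can be thought of as an adjustable confidence threshold on the hotword detector. The sketch below is purely illustrative; the threshold values and function names are assumptions, not how the Assistant actually scores activations:

```python
# Hypothetical sketch of an adjustable hotword threshold and the
# delete-on-false-trigger behavior. The confidence model and the
# threshold values are assumptions, not Google's actual numbers.
SENSITIVITY_THRESHOLDS = {"low": 0.9, "default": 0.7, "high": 0.5}

def on_possible_hotword(confidence: float, audio: bytes, sensitivity: str = "default"):
    """Keep audio only for confident activations; drop likely false triggers."""
    if confidence < SENSITIVITY_THRESHOLDS[sensitivity]:
        return None  # treated as unintentional: discarded, never stored or reviewed
    return audio  # a confident "Hey Google": proceed to handle the query
```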

One of the principles we strive toward is minimizing the amount of data we store, and we’re applying this to the Google Assistant as well. We’re also updating our policy to significantly reduce the amount of audio data we keep. For those of you who have opted in to VAA, we will soon automatically delete the vast majority of audio data associated with your account that’s older than a few months. This new policy will come to VAA later this year.
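In effect, this is a rolling retention window over stored audio. Here is a minimal sketch, assuming a simple list of timestamped recordings; the exact cutoff (“a few months”) isn’t specified above, so RETENTION_DAYS is a placeholder:

```python
# Sketch of the retention policy described above. The post says "older
# than a few months" without a precise cutoff, so RETENTION_DAYS is a
# placeholder assumption.
from datetime import datetime, timedelta

RETENTION_DAYS = 90  # placeholder for "a few months"

def purge_old_audio(recordings: list) -> list:
    """Keep only recordings newer than the retention cutoff."""
    cutoff = datetime.now() - timedelta(days=RETENTION_DAYS)
    return [r for r in recordings if r["created_at"] >= cutoff]
```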

We believe in putting you in control of your data, and we always work to keep it safe. We’re committed to being transparent about how our settings work so you can decide what works best for you. To check your current settings and learn more about the controls available, visit the “Your data in the Assistant” page.
