I think we can all agree that voice commands are a pretty cool and convenient feature to have. We don’t even need to type anything to get something done; all we have to do is speak. But we’ve since learned that voice assistants actually capture recordings and send them back to the company itself to be processed. And that covers everything from the mundane, like ordering pizza or chatting with a friend, to really disturbing things like drug deals and people having sex. Recently, though, the people behind Google Assistant promised to change that.
Updated on 6 October 2024
Google Voice Data Collection
Google recently published a blog post promising to be more upfront about how it stores your voice data.
First of all, you should know that Google doesn’t store your recordings by default. Rather, the company only recommends you allow it so that the Assistant can respond to your commands more easily.
Nino Tasca, Senior Product Manager for Google Assistant, put it this way: “by default, we don’t retain your audio recordings. This has been the case and will remain unchanged. You can still use the Assistant to help you throughout the day, and have access to helpful features like Voice Match.”
The catch is that people who choose to turn this setting on may have their recordings sent to actual human reviewers at Google.
Those recordings were supposed to be anonymous, but over the summer, 1,000 voice recordings were leaked to a Belgian media outlet. That incident eventually caused Google to pause human review of its voice data.
Google Assistant Recordings
However, if you still want to use voice commands, there is a way to review your Voice & Audio Activity (VAA) setting.
“If you’re an existing Assistant user, you’ll have the option to review your VAA setting and confirm your preference before any human review process resumes,” the company added. “We won’t include your audio in the human review process unless you’ve re-confirmed your VAA setting as on.”
Just a quick tip: Google Assistant automatically activates if it hears something similar to its wake word, which means it can unintentionally eavesdrop on your private conversations.
“The Assistant already immediately deletes any audio data when it realizes it was activated unintentionally,” Google says. The company is also adding an option that lets you calibrate how sensitive Google Assistant devices are to the wake phrase “Hey Google.”
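If you’re wondering what a sensitivity dial like that amounts to in practice, here’s a minimal Python sketch of the general idea. To be clear, everything here is my own illustration, not Google’s code: the function, the confidence score, and the way sensitivity maps to a threshold are all assumptions.

def should_activate(confidence: float, sensitivity: float) -> bool:
    """Decide whether a wake-word match should wake the assistant.

    confidence  -- hypothetical 0.0-1.0 score from a wake-word model that
                   the audio actually contained "Hey Google"
    sensitivity -- user-chosen setting, 0.0 (least sensitive) to 1.0 (most)
    """
    # Higher sensitivity lowers the confidence bar, so borderline audio
    # wakes the device; lower sensitivity demands a stronger match and
    # cuts down on accidental activations. The exact mapping is made up.
    threshold = 0.9 - 0.4 * sensitivity
    return confidence >= threshold

# A borderline match (0.7) only wakes the device at a high sensitivity:
print(should_activate(confidence=0.7, sensitivity=0.2))  # False (bar: 0.82)
print(should_activate(confidence=0.7, sensitivity=0.9))  # True  (bar: 0.54)

The point of the dial is the trade-off in that final comparison: turn sensitivity up and the device wakes more readily but misfires more often; turn it down and you get fewer of the accidental recordings described above.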
And not to worry: Google also says it is taking more measures to protect the data stored for the transcription process.
“We take a number of precautions to protect data during the human review process—audio snippets are never associated with any user accounts and language experts only listen to a small set of queries (around 0.2 percent of all user audio snippets), only from users with VAA turned on. Going forward, we’re adding greater security protections to this process, including an extra layer of privacy filters,” Tasca continued.