Google is always listening

Google employees also listen to Assistant conversations

Google advertises using the Assistant in the bedroom - and that is exactly where it listens in without permission.
Image: Google

Terrifying, but hardly surprising, was the news that Amazon employees listen to Alexa voice commands and that the recordings are stored indefinitely. Law enforcement agencies are now even demanding access to this data.

The world did not have to wait long for the same practices to come to light at Google's Assistant. Now the time has come: a Belgian news outlet reports in detail on how Google employees are given even the most private voice recordings of users to listen to.

More than 1,000 recordings listened to

VRT NWS reports that Google employees systematically listen to audio files recorded by Google Home smart speakers and the Google Assistant smartphone app. Google employees around the world reportedly review these audio files in order to improve Google's search function. By its own account, VRT NWS was able to listen to more than a thousand recordings.

Most of these recordings were made deliberately, meaning the user actively addressed the Google Assistant. But Google also records conversations that should never have been recorded, some of which contain confidential information. Not every customer is aware that everything they say to their Google smart speaker or the Google Assistant is recorded and stored - although this is clearly stated in Google's terms of service. What those terms do not mention, however, is that Google employees can listen to excerpts from these recordings, so users can hardly be aware of it.

A Google employee apparently passed a large number of these audio files on to VRT NWS. In the recordings, the journalists could clearly hear addresses and other sensitive information, which made it fairly easy to track down the people involved and confront them with the recordings.

Sensitive content: sex, violence, professional secrets

To prevent excerpts from being automatically linked to a user, the recordings are separated from the user information before they are passed on to the analysts: Google deletes the username and replaces it with an anonymous serial number. Apparently, however, it was often quite easy to work out someone's identity from the recordings themselves.
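The weakness VRT NWS exposed can be sketched in a few lines. The `pseudonymise` helper below is hypothetical, not Google's actual pipeline; it only illustrates the principle described above: stripping the account name does nothing about identifying details spoken inside the recording itself.

```python
# Hypothetical sketch of the pseudonymisation described above:
# the username is replaced by an anonymous serial number before
# the clip reaches a reviewer, but the transcript is kept intact.
import itertools

_serial = itertools.count(1)

def pseudonymise(recording: dict) -> dict:
    """Strip the account identity, keep the transcript for the reviewer."""
    return {
        "serial": next(_serial),              # anonymous serial number
        "transcript": recording["transcript"],
    }

clip = pseudonymise({
    "username": "jan.peeters@example.com",
    "transcript": "Okay Google, navigate to Kerkstraat 12, Leuven",
})

assert "username" not in clip                  # the identity field is gone ...
assert "Kerkstraat 12" in clip["transcript"]   # ... but the spoken address survives
```

The second assertion is the whole problem: a reviewer who hears an address, a name or a phone number in the audio can re-identify the speaker regardless of how thoroughly the account metadata was removed.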

As soon as someone nearby utters a word that sounds even vaguely like "Okay Google", Google Home starts recording. This produces recordings from bedrooms, including intimate conversations between partners as well as conversations between parents and children. There are also said to have been cases in which Google employees heard clear instances of domestic violence.

The question also arises as to what happens when a Google Assistant eavesdrops unnoticed near someone who is bound by professional secrecy. This could have unforeseen consequences for doctors, lawyers and clergy in particular, who may have breached their duty of confidentiality without even realising it.

Google responds to the allegations

After remaining silent on the topic for a long time, Google has now for the first time given details of how the voice recordings are analysed, in a blog post.

According to the post, as part of its work to develop speech technology for more languages, Google works with language experts around the world who understand the nuances and accents of a particular language. These experts review and transcribe "a small set of queries" so that Google can "understand" those languages better - a critical part, Google says, of building speech technology and products like the Google Assistant.

Google is not changing the technology, only pursuing the "leak"

Google says it has now learned that one of these language reviewers violated its data security policies by leaking confidential Dutch audio data. The company's security and privacy response teams have been activated, are investigating the matter and will take action. Google also intends to carry out a full review of its safeguards in this area to prevent such misconduct from happening again.

Google states that it applies a "wide range of safeguards" to protect user privacy throughout the review process. Language experts review only about 0.2 percent of all audio snippets. During review, audio snippets are not associated with user accounts, and reviewers are instructed not to transcribe background conversations or other noises, but only snippets directed at Google.

In rare cases, devices with the built-in Google Assistant experience what Google calls a "false accept": sounds or words in the background that the software misinterprets as the hotword (e.g. "Ok Google"). Google says it has a number of protections in place to prevent such false accepts in the home. Customers can also turn off the storage of audio data to their Google account entirely, or choose to have the data deleted automatically every three or 18 months.