Technology
Google in damage control after leaked recordings of ‘bedroom conversations’
More than 1000 Google recordings of “violence” and “bedroom conversations” leaked in privacy breach; proof your device is always listening.
“OK Google, can you stop listening now?”
Tech giant Google is in damage control after more than 1000 Google Assistant recordings were leaked this week.
The worst part, according to Belgian broadcaster VRT NWS, which heard the recordings, is that some of the audio weren’t even triggered with an activation phrase.
VRT NWS said that of more than 1000 excerpts, 153 “were conversations that should never have been recorded and during which the command ‘OK Google’ was clearly not given”.
The subject matter of the audio was, at times, also troubling. The broadcaster allegedly heard “bedroom conversations, conversations between parents and their children, but also blazing rows and professional phone calls containing lots of private information”.
Like many tech companies, including Amazon and Facebook, Google makes these recordings so that samples of real human speech can be fed back to a team of reviewers to improve the device’s speech recognition.
Google assured its customers the audio recordings sent to humans for transcription were “not associated with user accounts as part of the review process”. But disturbingly, VRT NWS claims it was able to identify specific individuals based on the content of the recordings.
Google addressed the leak and customer privacy concerns by blaming a Dutch “language reviewer” for violating its data security policies.
“Our Security and Privacy Response teams have been activated on this issue, are investigating, and we will take action. We are conducting a full review of our safeguards in this space to prevent misconduct like this from happening again,” the company said.
Despite the personal nature of the recordings, Google insists that its Assistant “only sends audio to Google after your device detects that you’re interacting with the Assistant — for example, by saying ‘Hey Google’ or by physically triggering the Google Assistant”.
The company also said users could clearly tell when the device was recording by noting the flashing dots on top of a Google Home or an on-screen indicator on their Android device.
“Rarely, devices that have the Google Assistant built in may experience what we call a ‘false accept’,” it said in a statement, referring to noise or words that the software interprets as “wake words”.
Still, Google assures consumers: “We have a number of protections in place to prevent false accepts from occurring in your home.”
Source: news.com.au