Google employees reportedly listening in on all user conversations via Google Assistant


Google is reportedly listening in on all your conversations, including ones that were never meant to be recorded. This happens via the company's AI-powered Google Assistant on phones and Google Home speakers, Belgian news portal VRT NWS claims. Google's own terms and conditions do state that everything users say to their Google smart speakers and to Google Assistant is recorded and stored. However, they do not mention that the company's employees can listen to excerpts from these recordings.

It is true that Google does not eavesdrop directly, but VRT NWS discovered that it does listen in, or rather, that it lets people listen in. When affected users listened to their own recorded audio clips, they were surprised to find that their conversations had been recorded and accessed by employees. In those recordings, addresses and other sensitive information could clearly be heard, which made it easier for the Belgian news portal to track down the affected users.

Meanwhile, Google has acknowledged that it listens to the conversations and has provided an in-depth explanation of what it does: "As part of our work to develop speech technology for more languages, we partner with language experts around the world who understand the nuances and accents of a specific language. These language experts review and transcribe a small set of queries to help us better understand those languages. This is a critical part of the process of building speech technology and is necessary to create products like Google Assistant."

The Mountain View-based company says that Google Assistant sends audio back to Google only after a device detects the user saying "Hey Google" or physically triggering Google Assistant. When the assistant is summoned, the device is said to provide a clear indicator (flashing dots on top of a Google Home or an on-screen indicator in the case of an Android device) that a conversation has started.

However, VRT NWS reviewed more than a thousand excerpts and said that 153 of those conversations should never have been recorded, because the "Ok Google" command was clearly not given. For those unaware, users with Google Assistant on their phones and smart speakers have to say "Ok Google" (or "Hey Google") to start a conversation with the AI-powered virtual assistant.

Since users hadn't called up the virtual assistant, multiple sensitive conversations were reportedly recorded unintentionally. As per the report, these include bedroom conversations, chats between parents and their children, and phone calls containing a slew of sensitive private information. One recording is also said to capture a woman who was in apparent distress. The fiasco raises questions about whether Google respects its users' privacy.

Google clarifies that, in rare cases, devices with Google Assistant built in may experience a "false accept", meaning that some noise or words in the background were interpreted as the hotword that invokes the assistant. Google says it also has a number of protections in place to prevent false accepts.

David Monsees, Product Manager for Search at Google, said, "We apply a wide range of safeguards to protect user privacy throughout the entire review process. Language experts only review around 0.2 percent of all audio snippets. Audio snippets are not associated with user accounts as part of the review process, and reviewers are directed not to transcribe background conversations or other noises, and only to transcribe snippets that are directed to Google."

Not all is as bad as it looks, though. Google reportedly tries to ensure that the voice excerpts are not linked to a user account, making it difficult to trace someone's identity. The company is said to delete the user name and replace it with an anonymous serial number.
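As a rough illustration of what such pseudonymisation might look like (Google has not disclosed its actual pipeline; the function and field names below are hypothetical), the account identifier on a clip is dropped and an anonymous serial number is attached in its place:

```python
import uuid

def pseudonymise(excerpt: dict) -> dict:
    """Hypothetical sketch: strip account identifiers from a voice excerpt
    and tag it with an anonymous serial number before human review."""
    reviewed = dict(excerpt)
    reviewed.pop("user_name", None)               # remove the account name
    reviewed.pop("account_id", None)              # remove any account identifier
    reviewed["serial_number"] = uuid.uuid4().hex  # anonymous serial number
    return reviewed

clip = {"user_name": "jane.doe@example.com", "account_id": "12345", "audio": b"..."}
print(pseudonymise(clip))  # no user_name or account_id, only a random serial number
```

Note that this only removes the label attached to the clip; as the next paragraph points out, whatever names or addresses are spoken inside the recording itself remain audible to the reviewer.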

However, it doesn't take a rocket scientist to recover someone's identity; one simply has to listen carefully to what is being said. If reviewers don't know how a word is written, they have to look up every word, address, personal name, or company name on Google or on Facebook, and in doing so they often quickly discover the identity of the person speaking.

As far as the conversations accessed by VRT NWS are concerned, Monsees says that one of Google's language reviewers violated the company's data security policies by leaking confidential Dutch audio data.

The executive said, "Our Security and Privacy Response teams have been activated on this issue, are investigating, and we will take action. We are conducting a full review of our safeguards in this space to prevent misconduct like this from happening again."

Google also reiterated that users can turn off the storage of audio data to their Google account entirely, or choose to have the data auto-deleted every 3 or 18 months.
