Earlier this month, everyone who has gotten used to saying 'Okay Google' instead of swiping and tapping to make a call, set an alarm or reminder, or perform any of the dozens of functions Google Assistant is capable of was in for an unpleasant surprise. A major audio leak had occurred, and the leaked data contained recordings of users' interactions with the AI.
To its credit, Google promptly admitted to the leak of Dutch audio data comprising more than 1,000 private conversations recorded by its voice assistant. According to sources, the recordings were leaked to VRT NWS, a Belgian public broadcaster, by one of Google's language review partners, and the news organization was even able to identify a number of people from the voice recordings.
Concerns about the ethical and social implications of AI technology are nothing new. When Google Assistant was first launched, experts criticized the use of AI that mimics human speech, arguing that it could trick people into thinking they were conversing with another human. The biggest of these concerns related to user privacy: the AI's practice of recording user interactions and analyzing the audio samples to develop a greater understanding of user speech patterns and preferences was viewed as a potential threat.
In response, Google had assured the world that it would take adequate measures to ensure that human users could identify the AI and would be made aware when audio was being recorded. About three years after the launch, the audio data leak has shown that the company failed to live up to its promise of utmost clarity and transparency.
What shocked the world more than the leak itself was the revelation that Google workers had been listening in on people's conversations with the voice assistant. In some instances, the voice assistant had started recording even when the user had not given the 'Okay Google' command.
In other words, Google's smart speakers can trigger without being commanded, and actual human beings can listen to people's conversations with the voice assistant, along with other sounds it picked up and retained without users' knowledge or consent. Sure, the AI could access people's audio data to improve its accuracy, and most people wouldn't have an issue with that. But having human reviewers listen in didn't sit well with many.
Nino Tasca, Senior Product Manager for Google Assistant, has issued an apology for failing to be more transparent about the manner in which user audio data is utilized. In a recent blog post, Tasca clarified that the human review of audio data is aimed at improving speech technology across different languages. He also assured users that Google does not store their audio recordings by default.
Evidently, not many people knew that their interactions with the AI were accessible to human language experts, and Google didn't anticipate that the process would create such an uproar. To set things right, Google has already paused human review of users' audio data worldwide.
In addition, Google has issued guidelines to help users of its voice technology gain more control over the storage and use of their audio data. The company has also assured users that it is taking steps to put them in control of their data and better secure their privacy.
This isn't a new feature, but the data leak has revealed that a lot of users aren't aware of how to change their audio recording permissions. Users who don't wish for Google to record their interactions with the voice assistant can opt out of Voice & Audio Activity (VAA). According to Google, opting into VAA allows snippets of users' audio interactions to be recorded and reviewed by human linguists. This helps improve the AI's ability to recognize the user's voice, accent, and speech patterns, and to get better at serving them over time.
Users can also access the recordings of their past interactions with the assistant in the Voice & Audio Activity settings and choose to delete them at any time.
As per Google, the VAA settings will now clearly explain how users' choices work. Existing voice assistant users can review their VAA settings and choose to turn the feature on or off. Users who opt out will be excluded from human review of their audio samples, while those who agree to human review to improve speech technology can reconfirm their VAA settings to express that preference.
Google has assured users that it's working towards drastically reducing the amount of audio data it stores. Users opting into VAA will also have audio data older than a few months automatically erased from the archives. This move is expected to come into effect by the end of the current calendar year.
Google has also assured users that audio recorded through accidental activation, caused by sounds similar to the wake-up phrase 'Okay Google', is erased automatically. The company is working to further improve command identification as well as the sensitivity of Google Assistant to reduce the incidence of unintentional activation.
According to ComScore estimates, 50 percent of all searches will be voice-based by 2020. As one of the best virtual assistants of the day, Google Assistant will clearly remain in heavy demand. Google has shown its intent by acknowledging its missteps with the voice assistant and has already set about fixing them. Along with opening up to the general public about its methods and the intricacies of Google Assistant's controls, the company has also promised to add more security protections around audio transcription to prevent future leaks.
The new and upcoming changes to Google Assistant are steps towards a better, more transparent future for human-AI interaction. With this newfound awareness of how the AI operates and evolves, users will be able to choose when, and if, they want to rely on a virtual assistant that occasionally records their voice. And with Google's renewed focus on improving the service as a whole, users who choose to stick with it are promised a safer, more efficient Google Assistant in the coming months and beyond.