Apple and Google suspend some of their eavesdropping
Two of the world’s leading voice assistant makers pulled the plug on their respective analytics programmes for Siri and Google Assistant after private information, including confidential conversations, was leaked.
August 2, 2019
Apple decided to suspend its outsourced programme to “grade” Siri, by which it assesses the voice assistant’s response accuracy, following reports that its contractors were listening to private conversations without users’ explicit consent. The company committed to adding an opt-out option in a future update of Siri. It also promised that the programme would not be restarted until it had completed a thorough review.
“We are committed to delivering a great Siri experience while protecting user privacy. While we conduct a thorough review, we are suspending Siri grading globally,” the Cupertino-based iPhone maker told The Guardian. “Additionally, as part of a future software update, users will have the ability to choose to participate in grading.”
This was in response to the leak first reported by the British broadsheet, which received a tipoff from whistle-blowers. The paper learned that contractors regularly heard private conversations ranging from dialogues between patients and doctors to communications between drug dealers and buyers, with everything in between. These could include cases where Siri was triggered unintentionally without the users’ awareness.
The biggest problem with Apple’s analytics programme is that it does not explicitly disclose to consumers that some Siri recordings are shared with contractors in different parts of the world, who listen to the anonymised content as a means to improve Siri’s accuracy. By not being upfront, Apple does not give users the option to opt out either.
Shortly before Apple’s decision to call a halt to Siri grading, Google had also pulled the plug on its own human analysis of Google Assistant in the European Union, the Associated Press reported. The company promised the office of Johannes Caspar, Hamburg’s commissioner for data protection and Germany’s lead regulator of Google on privacy issues, that the suspension would last at least three months.
The decision was made after Google admitted that one of its partner language reviewers, whose job is to assess Google Assistant’s response accuracy, had “violated our data security policies by leaking confidential Dutch audio data.” Over 1,000 private conversations in Flemish, some of which included private data, were sent to the Belgian news outlet VRT. Though the recordings were supposed to be anonymised, staff at VRT were able to identify the users through personal details such as home addresses.
At the time, Google promised: “We will take action. We are conducting a full review of our safeguards in this space to prevent misconduct like this from happening again.”
These are not the first cases of private conversations being leaked via voice assistants. Last year an Alexa-equipped Amazon Echo recorded a conversation between a couple in Portland, Oregon, and sent it to a friend, another recent incident that sounded the alarm over private data security.
It should not surprise those in the tech world that AI-powered natural language processing software still has a long way to go before it can get all the intricacies right. Until then, it needs human input to continuously improve its accuracy. The problems that bedevil Google and Apple today, and that hit Amazon in the past (Microsoft’s Cortana has so far escaped a high-profile embarrassment), come down to three failings: a lack of stringent oversight of the role humans play, a lack of clear communication to consumers that their interactions with voice assistants may be used for data analysis, and the failure to give consumers the choice to opt out.
There is also the controversy over data sovereignty, as well as the question of whether private data should be stored in the cloud or kept on device. The location of Apple’s leak has not been specified, but Google’s case is a clear violation of GDPR. According to the AP report, Germany has already started proceedings against Google.