Apple said Thursday that it is suspending a program that “grades” Siri voice commands issued by users because of privacy concerns.
“We are committed to delivering a great Siri experience while protecting user privacy,” an Apple statement reads. “While we conduct a thorough review, we are suspending Siri grading globally. Additionally, as part of a future software update, users will have the ability to choose to participate in grading.”
News of Apple’s secret Siri grading program first came to light earlier this year when Bloomberg reported that the firm had “teams of people” manually analyzing recordings of users issuing commands to the voice assistant. (The report also noted that Amazon engages in similar activities with its Alexa voice assistant.) Apple refers to these people as “human helpers” for Siri that gauge whether the assistant is working reliably. The recordings provide the helpers with no personally identifiable information, the report noted, and they are stored on Apple servers for at least six months.
Last week, a report in The Guardian provided further details, noting that Apple contractors routinely “hear confidential medical information, drug deals, and recordings of couples having sex, as part of their job providing quality control, or ‘grading’, the company’s Siri voice assistant.”
In response to that report, Apple explained that less than one percent of Siri activations are randomly recorded for analysis. “A small portion of Siri requests are analyzed to improve Siri and dictation,” Apple said last week. “User requests are not associated with the user’s Apple ID. Siri responses are analyzed in secure facilities and all reviewers are under the obligation to adhere to Apple’s strict confidentiality requirements.”
But the whistleblower who alerted the publication to the privacy invasions said that Siri has a particular problem with accidental activations—where the user didn’t ask for Siri’s help but the assistant thought they had—and the resulting recordings “pick up extremely sensitive personal information” with great frequency.
“There have been countless instances of recordings featuring private discussions between doctors and patients, business deals, seemingly criminal dealings, sexual encounters and so on,” the whistleblower told The Guardian. “These recordings are accompanied by user data showing location, contact details, and app data.” The Apple Watch and HomePod smart speaker are the most frequent sources of mistaken recordings, the source said.
Apple’s not alone, of course. In addition to Amazon, which is known to have “thousands” of workers listening in on Alexa conversations, Google has run into issues in the EU recently thanks to its Assistant as well. Germany this week temporarily banned Google employees and contractors from transcribing Assistant voice recordings in the EU after whistleblowers described the sensitive information they hear. Google agreed to halt the work for three months to determine whether it is violating the EU’s General Data Protection Regulation (GDPR).