Apple said Thursday that it is suspending a program that “grades” Siri voice commands issued by users because of privacy concerns.
“We are committed to delivering a great Siri experience while protecting user privacy,” an Apple statement reads. “While we conduct a thorough review, we are suspending Siri grading globally. Additionally, as part of a future software update, users will have the ability to choose to participate in grading.”
News of Apple’s secret Siri grading program first came to light earlier this year when Bloomberg reported that the firm had “teams of people” manually analyzing recordings of users issuing commands to the voice assistant. (The report also noted that Amazon engages in similar activities with its Alexa voice assistant.) Apple refers to these people as “human helpers” for Siri that gauge whether the assistant is working reliably. The recordings provide the helpers with no personally identifiable information, the report noted, and they are stored on Apple servers for at least six months.
Last week, a report in The Guardian provided further details, noting that Apple contractors routinely “hear confidential medical information, drug deals, and recordings of couples having sex, as part of their job providing quality control, or ‘grading’, the company’s Siri voice assistant.”
In response to that report, Apple explained that less than one percent of Siri activations are randomly recorded for analysis. “A small portion of Siri requests are analyzed to improve Siri and dictation,” Apple said last week. “User requests are not associated with the user’s Apple ID. Siri responses are analyzed in secure facilities and all reviewers are under the obligation to adhere to Apple’s strict confidentiality requirements.”
But the whistleblower who alerted the publication to the privacy invasions said that Siri has a particular problem with accidental activations—where the user didn’t ask for Siri’s help but the assistant thought they had—and that the resulting recordings “pick up extremely sensitive personal information” with great frequency.
“There have been countless instances of recordings featuring private discussions between doctors and patients, business deals, seemingly criminal dealings, sexual encounters and so on,” the whistleblower told The Guardian. “These recordings are accompanied by user data showing location, contact details, and app data.” The Apple Watch and HomePod smart speaker are the most frequent sources of mistaken recordings, the source said.
Apple’s not alone, of course. In addition to Amazon, which is known to have “thousands” of workers listening in on Alexa conversations, Google has run into issues in the EU recently thanks to its Assistant as well. Germany this week temporarily banned Google employees and contractors from transcribing Assistant voice recordings in the EU after whistleblowers described the sensitive information they hear. Google agreed to halt the work for three months to determine whether it is violating the EU’s General Data Protection Regulation (GDPR).
dontbeevil
“What happens on iPhone, stays on your iPhone” *

*until we get caught
jedwards87
In reply to ghostrider: No they do not. And a lot of data stays and gets processed on the phone and doesn’t go to their servers. Quit spreading misinformation.
wocowboy
This is great news for all the hundreds of millions of users who have been harmed by this listening, who have been convicted of crimes, been charged with crimes, suffered embarrassment from the public playback of all these recordings, lost friends and loved ones because of the public playback, or the other multitude of ways people have been severely harassed or harmed.

Oh, wait, none of this has ever happened? Never mind.

I would be WAY more OK with this entire situation if equal scorn and outrage were being applied to Google and Amazon, who have been and are doing the very same thing in order to improve their voice assistants, but that is not happening. I realize why it is not happening: it is because Apple is the only one of the three to have an actual public stance on privacy. The others openly harvest, collect, and keep all sorts of information, both identifiable and unidentifiable, on their users. Apple does not, and they say they do not, and they are being vilified for it.
jedwards87
I just read that both Amazon and Google are changing their policies too. Good job Apple.