Apple Suspends Siri Grading Program

Posted on August 2, 2019 by Paul Thurrott in Apple, Apple Watch, Cloud, iOS, Mobile, Smart Home with 11 Comments

Apple said Thursday that it is suspending a program that “grades” Siri voice commands issued by users because of privacy concerns.

“We are committed to delivering a great Siri experience while protecting user privacy,” an Apple statement reads. “While we conduct a thorough review, we are suspending Siri grading globally. Additionally, as part of a future software update, users will have the ability to choose to participate in grading.”

News of Apple’s secret Siri grading program first came to light earlier this year when Bloomberg reported that the firm had “teams of people” manually analyzing recordings of users issuing commands to the voice assistant. (The report also noted that Amazon engages in similar activities with its Alexa voice assistant.) Apple refers to these people as “human helpers” for Siri that gauge whether the assistant is working reliably. The recordings provide the helpers with no personally identifiable information, the report noted, and they are stored on Apple servers for at least six months.

Last week, a report in The Guardian provided further details, noting that Apple contractors routinely “hear confidential medical information, drug deals, and recordings of couples having sex, as part of their job providing quality control, or ‘grading’, the company’s Siri voice assistant.”

In response to that report, Apple explained that less than one percent of Siri activations are randomly recorded for analysis. “A small portion of Siri requests are analyzed to improve Siri and dictation,” Apple said last week. “User requests are not associated with the user’s Apple ID. Siri responses are analyzed in secure facilities and all reviewers are under the obligation to adhere to Apple’s strict confidentiality requirements.”

But the whistleblower who alerted the publication to the privacy invasions said that Siri has a particular problem with accidental activations—where the user didn’t ask for Siri’s help but the assistant thought they had—and that the resulting recordings “pick up extremely sensitive personal information” with great frequency.

“There have been countless instances of recordings featuring private discussions between doctors and patients, business deals, seemingly criminal dealings, sexual encounters and so on,” the whistleblower told The Guardian. “These recordings are accompanied by user data showing location, contact details, and app data.” The Apple Watch and HomePod smart speaker are the most frequent sources of mistaken recordings, the source said.

Apple’s not alone, of course. In addition to Amazon, which is known to have “thousands” of workers listening in on Alexa conversations, Google has run into issues in the EU recently thanks to its Assistant as well. Germany this week temporarily banned Google employees and contractors from transcribing Assistant voice recordings in the EU after whistleblowers described the sensitive information they hear. Google agreed to halt the work for three months to determine whether it is violating the EU’s General Data Protection Regulation (GDPR).


11 responses to “Apple Suspends Siri Grading Program”

  1. train_wreck

    OK, what the hell is that picture? I’m gonna guess it’s from an Apple ad?

  2. dontbeevil

    "what happens on iphone, stays on your iphone" *


    *till we get caught

  3. Tony Barrett

    Apple collects and processes as much user data as all the others - they're just very good at keeping it all under wraps.

  4. wocowboy

    This is great news for all the hundreds of millions of users who have been harmed by this listening, who have been convicted of crimes, been charged with crimes, suffered embarrassment from the public playback of all these recordings, lost friends and loved ones because of the public playback, or the other multitude of ways people have been severely harassed or harmed.


    Oh, wait, none of this has ever happened? Never mind.


    I would be WAY more OK with this entire situation if equal scorn and outrage were being applied to Google and Amazon, who have been and are doing the very same thing in order to improve their voice assistants, but that is not happening. I realize why it is not happening: it is because Apple is the only one of the three to have an actual public stance on privacy. The others openly harvest, collect, and keep all sorts of information, both identifiable and unidentifiable, on their users. Apple does not, and they say they do not, and they are being vilified for it.



    • jchampeau

      In reply to wocowboy:

      You had me going there for a second. The fact that the recordings aren't tied to an Apple ID and thus can't be attributed to a person makes it sort of a non-issue to me. Not that privacy isn't important; it is. But at this point, if you live in an industrialized country and use modern technology, forms of payment, etc., all of our PII is already out there for bad actors to steal and use.

  5. ReformedCtrlZ

    To be fair, I will commend Apple for adding the option to opt out of the program, and I think the privacy concerns are somewhat overblown. But I also love that Apple is "committed to delivering a great Siri experience while protecting user privacy" yet charged ahead with the program until they got caught - just saying :)

  6. SyncMe

    So shouldn't we be asking the same question about Microsoft and Cortana?

  7. jedwards87

    I just read that both Amazon and Google are changing their policies too. Good job Apple.
