Apple Suspends Siri Grading Program

Apple said Thursday that it is suspending a program that “grades” Siri voice commands issued by users because of privacy concerns.

“We are committed to delivering a great Siri experience while protecting user privacy,” an Apple statement reads. “While we conduct a thorough review, we are suspending Siri grading globally. Additionally, as part of a future software update, users will have the ability to choose to participate in grading.”


News of Apple’s secret Siri grading program first came to light earlier this year when Bloomberg reported that the firm had “teams of people” manually analyzing recordings of users issuing commands to the voice assistant. (The report also noted that Amazon engages in similar activities with its Alexa voice assistant.) Apple refers to these people as “human helpers” for Siri who gauge whether the assistant is working reliably. The recordings provide the helpers with no personally identifiable information, the report noted, and they are stored on Apple servers for at least six months.

Last week, a report in The Guardian provided further details, noting that Apple contractors routinely “hear confidential medical information, drug deals, and recordings of couples having sex, as part of their job providing quality control, or ‘grading’, the company’s Siri voice assistant.”

In response to that report, Apple explained that less than one percent of Siri activations are randomly recorded for analysis. “A small portion of Siri requests are analyzed to improve Siri and dictation,” Apple said last week. “User requests are not associated with the user’s Apple ID. Siri responses are analyzed in secure facilities and all reviewers are under the obligation to adhere to Apple’s strict confidentiality requirements.”

But the whistleblower who alerted the publication to the privacy invasions said that Siri has a particular problem with accidental activations—where the user didn’t ask for Siri’s help but the assistant thought they had—and the resulting recordings “pick up extremely sensitive personal information” with great frequency.

“There have been countless instances of recordings featuring private discussions between doctors and patients, business deals, seemingly criminal dealings, sexual encounters and so on,” the whistleblower told The Guardian. “These recordings are accompanied by user data showing location, contact details, and app data.” The Apple Watch and HomePod smart speaker are the most frequent sources of mistaken recordings, the source said.

Apple’s not alone, of course. In addition to Amazon, which is known to have “thousands” of workers listening in on Alexa conversations, Google has run into issues in the EU recently thanks to its Assistant as well. Germany this week temporarily banned Google employees and contractors from transcribing Assistant voice recordings in the EU after whistleblowers described the sensitive information they hear. Google agreed to halt the work for three months to determine whether it is violating the EU’s General Data Protection Regulation (GDPR).


Conversation (11 comments)

  • train_wreck

    02 August, 2019 - 3:56 am

    OK, what the hell is that picture? I’m gonna guess it’s from an Apple ad?

    • Paul Thurrott

      Premium Member
      02 August, 2019 - 4:10 am

      In reply to train_wreck: Yes. "Siri, is it raining?"

  • dontbeevil

    02 August, 2019 - 4:52 am

    "what happens on iphone, stays on your iphone" *

    *till we get caught

  • Tony Barrett

    02 August, 2019 - 6:42 am

    Apple collect and process as much user data as all the others – they’re just very good at keeping it all under wraps.

    • jedwards87

      03 August, 2019 - 9:20 am

      In reply to ghostrider: No they do not. And a lot of data stays and gets processed on the phone and doesn’t go to their servers. Quit spreading misinformation.

  • wocowboy

    Premium Member
    02 August, 2019 - 7:00 am

    This is great news for all the hundreds of millions of users who have been harmed by this listening, who have been convicted of crimes, been charged with crimes, suffered embarrassment from the public playback of all these recordings, lost friends and loved ones because of the public playback, or the other multitude of ways people have been severely harassed or harmed.

    Oh, wait, none of this has ever happened? Never mind.

    I would be WAY more OK with this entire situation if equal scorn and outrage was being applied to Google and Amazon, who have been and are doing the very same thing in order to improve their voice assistants, but that is not happening. I realize why it is not happening, it is because Apple is the only one of the three to have an actual public stance on privacy. The others openly harvest, collect and keep all sorts of information, both identifiable and unidentifiable, on their users. Apple does not, and they say they do not, and they are being vilified for it.

    • jchampeau

      Premium Member
      02 August, 2019 - 9:12 am

      In reply to wocowboy: You had me going there for a second. The fact that the recordings aren’t tied to an Apple ID and thus can’t be attributed to a person makes it sort of a non-issue to me. Not that privacy isn’t important; it is. But at this point, if you live in an industrialized country and use modern technology, forms of payment, etc., all of our PII is already out there for bad actors to steal and use.

      • Skolvikings

        02 August, 2019 - 11:00 am

        In reply to jchampeau: Plus these voice assistants either aren’t enabled by default or can be disabled. People, such as myself, are voluntarily using them.

  • ReformedCtrlZ

    Premium Member
    02 August, 2019 - 11:21 am

    To be fair I will commend Apple on adding the option to opt out of the program to protect recordings but I think the privacy concerns are somewhat overblown. I also love that Apple is "committed to delivering a great Siri experience while protecting user privacy" and charged ahead with the program until they got caught – just saying :)

  • SyncMe

    02 August, 2019 - 4:12 pm

    So shouldn’t we be asking the same question about Microsoft and Cortana?

  • jedwards87

    03 August, 2019 - 9:19 am

    I just read that both Amazon and Google are changing their policies too. Good job Apple.


Thurrott © 2024 Thurrott LLC