Apple Makes Changes to Siri Data Collection

Posted on August 28, 2019 by Paul Thurrott in Apple, Mobile, iOS, Smart Home with 16 Comments

Responding to the recent uproar over contractors listening to conversations that users have with Siri, Apple is making changes.

“We know that customers have been concerned by recent reports of people listening to audio Siri recordings as part of our Siri quality evaluation process — which we call grading,” an Apple statement explains. “We heard their concerns, immediately suspended human grading of Siri requests and began a thorough review of our practices and policies. We’ve decided to make some changes to Siri as a result.”

Apple may have halted its Siri grading program in response to complaints, but it will restart the program “later this fall,” after it has made the following changes:

  • It will no longer retain audio recordings of Siri interactions. Instead, Apple will use computer-generated transcripts to help Siri improve.
  • Users will be able to opt in to help Siri improve by learning from the audio samples of their requests. “We hope that many people will choose to help Siri get better, knowing that Apple respects their data and has strong privacy controls in place,” Apple says. “Those who choose to participate will be able to opt-out at any time.”
  • When customers do opt in, only Apple employees, and not outside contractors, will be allowed to listen to audio samples of Siri interactions. The team will delete any recording that is determined to be an inadvertent trigger of Siri.

In short, Apple is doing for Siri exactly what Microsoft should be doing—but won’t—for Windows 10. Good for them.


Comments (16)


  1. yoshi

    It blows my mind that Microsoft isn't riding this privacy wave as hard as they can. It's such a clear advantage over Google at the moment, which Apple is proving.

  2. Tony Barrett

    Apple play the privacy advocate whenever they can - it makes their users feel all warm and fuzzy inside (even though they're not upgrading their iPhones as often as they used to!). What this shows, though, is that Apple collect and process as much user data as anyone else, and this recent outcry probably had Apple's PR department scrambling to work out how to capitalize on it - and, well, they're first out of the blocks with an official statement, which is really more about damage limitation than anything.

    When a large part of a company's profit comes from hardware, they can afford to do this. But as they move further into services, that data is going to become invaluable to them, so don't expect Apple to collect any less in the long run - they'll be collecting and processing far more.

    • rosyna

      In reply to ghostrider:

      The section of the document titled How Siri Protects Your Privacy shows the exact opposite of what you claimed.

    • m_p_w_84

      In reply to ghostrider:

      I completely agree.

      I mean, look at the literal panic from Tim Cook when their MASSIVE profits were going to be dented by tariffs; straight to the top to get that sorted for the shareholders.

      If they care so much about privacy, why not halve the price of the iPhone so more people can enjoy that benefit?

      • toukale

        In reply to m_p_w_84:

        The fanatic side wins again. It always amazes me whenever I see comments like yours about a publicly traded company. Those CEOs' first responsibility is not to us (the users), but to the company and the shareholders. We come in after the company's and the shareholders' needs are met. On occasion the needs of the users, the company, and the shareholders align, but that is not usually the case.

        I am not sure why people keep expecting corporations to fight their fights for them, then are disappointed when those corporations do what's best for themselves and their shareholders. I expect corporations to do what's best for them first; I do not expect otherwise. I also don't expect them to fight the fights people should fight for themselves. It is not any corporation's job to fight for social justice on behalf of the people. They can lend their support, but they should not be leading.

    • Jeffsters

      In reply to ghostrider:

      You are correct that users aren’t upgrading as often. Sadly for Apple, they provide updates and software support for years, keeping older phones viable for quite a while. Are you suggesting that, to increase sales, they should do what most Android handset manufacturers do and abandon users?

  3. chrisrut

    Actually, two things desperately need correction in the current crop of assistants.

    First, it must be abundantly clear that I - the user - own the data, and that the companies - Apple, Google, whatever - have no rights to the data whatsoever.

    Second, the assistant - the AI surfacing on my systems - must be working for me, and only for me. It must exist and operate for my benefit, not the benefit of the company that wrote it.

    Ergo: if any information is to be collected and released, it will be by MY AI bot, doing so on my behalf.

    The data, while naturally stored in the cloud, must be under my encryption keys, inaccessible to others. And the AI processing the data should be resident - instantiated - in whatever device I'm working from - not as a tiny sliver of a Big Data processing engine that serves a different master.

    Back at Epson, I wanted to market their PC as "The Epson Mind-Amp," the point being that it is an amplifier for MY mind - the user's. It is a tool to expand my capabilities - which is the pure essence of user-centric computing. It is there to serve ME, not Google or whomever.

    I, we, you, are NOT the "masters" of the current AIs. They don't work for us. They do stuff we want, and arguably need, but while useful their loyalties are clearly divided.

    I gave a presentation on the past and future of technology last May, in which I asserted that privacy issues are going to become far worse in the near future; the bi-directional mind-computer interface is just around the corner. The privacy implications are staggering! Today's privacy concerns will seem trivial by comparison. Clearly, I concluded, the "right to privacy" must be defined and added to the Constitution before that horse runs out of the barn. Alas, I have little hope such will occur soon enough. Imagine such technology in the hands of groups who would like to reprogram "wrong thinking."

  4. Bats

    YEAH.....this is the company that we were told was full of "Privacy Advocates." We were told this by certain people because they took Apple's word for it.

  5. dontbeevil

    "what happens on your iphone stays on your iphone"*

    *till we get caught

    • Jeffsters

      In reply to dontbeevil:

      That’s true! Siri requests go to Apple, and always have, just as they go to Google, MS, and Amazon for their assistants. The difference is that Apple doesn’t store them. It did, as all the others did, anonymize some recordings where Siri activated but shouldn’t have. Real people then listened, looked at the processing, and worked to reduce these incidents in follow-on releases. Meanwhile, MS continues even after being caught, which you seem to be fine with! So carry on!

      • dontbeevil

        > The difference Apple doesn’t store them.

        any proof? or just apple said that, like they said many other things

        > Meanwhile MS continues even after being caught which you seem to be fine with! So carry on!

        they stopped; maybe you can just focus on real news about MS, and stop believing everything from apple

  6. wocowboy

    I would like to read some suggestions as to exactly how these companies are supposed to be able to improve their digital assistants without having a real human being listen to some of the incorrect triggers to analyze just what needs to be done to correct these mistakes. As yet I have seen nothing but outrage articles and no suggestions on ways to improve.

    Also, I have not read of one single instance where someone was actually harmed, jailed, persecuted, publicly humiliated, or embarrassed as a result of these recordings. Not one! None of these companies associate the recordings with an actual person's name, address, ID, or anything that would allow someone listening to associate that recording with a particular user. If I am wrong, I hope someone will cite the instances where public shaming or prosecution has occurred. If there have been no such instances, then this is all a big non-event of faux outrage, which is what I suspect it has been all along.

    I don't care whether it's Apple, Microsoft, Amazon, Google, or whoever; sometimes it takes a real human being to parse what is being said or asked of these digital assistants. And if a user is shouting at that assistant right in the middle of their lovemaking, or while they are in the process of making a drug deal, then I submit that they have other, far more serious problems in their lives, and that the tech pundits have far more important things to write about.