Responding to the recent uproar over contractors listening to conversations that users have with Siri, Apple is making changes.
“We know that customers have been concerned by recent reports of people listening to audio Siri recordings as part of our Siri quality evaluation process — which we call grading,” an Apple statement explains. “We heard their concerns, immediately suspended human grading of Siri requests and began a thorough review of our practices and policies. We’ve decided to make some changes to Siri as a result.”
Apple may have halted its Siri grading program in response to complaints, but it will restart the program “later this fall” after it has made the following changes:
In short, Apple is doing for Siri exactly what Microsoft should be doing—but won’t—for Windows 10. Good for them.
m_p_w_84
<blockquote><em><a href="#452380">In reply to ghostrider:</a></em></blockquote><p>I completely agree. </p><p><br></p><p>I mean, the literal panic of Tim Cook when their MASSIVE profits were going to be dented by tariffs; straight to the top to get that sorted for the shareholders.</p><p><br></p><p>If they care so much about privacy, why not halve the price of the iPhones so more people can enjoy that benefit? </p>
Bats
<p>YEAH…..this is the company that we were told was made up of "Privacy Advocates." We were told this by certain people who took Apple's word for it.</p>
dontbeevil
<p>"what happens on your iphone stays on your iphone"*</p><p><br></p><p>*till we get caught</p>
dontbeevil
<p>> The difference is Apple doesn’t store them. </p><p><br></p><p>any proof? or did apple just say that, like they said many other things?</p><p><br></p><p>> Meanwhile MS continues even after being caught which you seem to be fine with! So carry on!</p><p><br></p><p>they stopped. maybe you can just focus on real news about MS, and stop believing everything from apple</p>
wocowboy
Premium Member<p>I would like to read some suggestions as to exactly how these companies are supposed to improve their digital assistants without having a real human being listen to some of the incorrect triggers to analyze just what needs to be done to correct these mistakes. So far I have seen nothing but outrage articles and no suggestions on ways to improve. </p><p><br></p><p>Also, I have not read of one single instance where someone was actually harmed, jailed, persecuted, publicly humiliated, or embarrassed as a result of these recordings. Not one! None of these companies associates the recordings with an actual person's name, address, ID, or anything that would allow someone listening to tie that recording to a particular user. If I am wrong, I hope someone will cite instances where public shaming or prosecution has occurred. If there have been no such instances, then this is all a big non-event of faux outrage, which is what I suspect it has been all along. I don't care whether it's Apple, Microsoft, Amazon, Google, or whoever; sometimes it takes a real human being to parse what is being said or asked of these digital assistants, and if the user is shouting at that assistant right in the middle of their lovemaking or while they are in the process of making a drug deal, then I submit that they have other, far more serious problems in their lives and that the tech pundits have far more important things to write about. </p>