Apple suspends Siri response grading program after privacy concerns

Apple Inc. said on Friday that it has suspended a global program in which it analyzed recordings of users interacting with its voice assistant Siri, after privacy concerns were raised about the program.

“While we conduct a thorough review, we are suspending Siri grading globally,” an Apple spokeswoman said in a statement, adding that in a future software update, users will be able to opt out of the program.

In an effort to perform quality checks and improve the voice assistant’s responses, contractors graded Siri’s answers to user queries, The Guardian reported. They also looked at whether the response was triggered accidentally, without a deliberate query from the user, the newspaper said.

MacDailyNews Take: Apple’s statement, verbatim: “We are committed to delivering a great Siri experience while protecting user privacy. While we conduct a thorough review, we are suspending Siri grading globally. Additionally, as part of a future software update, users will have the ability to choose to participate in grading.”

Apple’s stance on privacy should be sacrosanct, but not to the detriment of quality. Allowing users to opt in to help improve Siri is a good solution.


7 Comments


Cue all the folks who will be complaining in a few months that Siri doesn’t seem to be improving. Opting out is a good notion, but the data was already anonymized. Where was all the outrage against the other voice assistants that were doing even more listening without any privacy measures?

It only seems alarming after the coverage of Alexa and Google Assistant being accused of the same a few weeks ago. People, especially iOS users, gained a false sense of security that Siri wasn’t one of the group. This week users feel their trust was misplaced. That difference between what users ‘expect’ of Google/Amazon’s assistants vs Siri makes it ‘feel’ worse. Apple just had a lot more to live up to, partly because they keep advertising their security/privacy as being so much better. The bit about how Apple Watch automatically starts listening by default without a trigger word didn’t help matters.

Unless you are actually familiar with the process each company takes in securing and using audio files generated from their respective Assistants, you shouldn’t be accusing anyone of false equivalencies or bad tech journalism.

It is good to see Apple admitting that they too use audio recordings to improve Siri. However, keeping mum about it when others were accused of the same seems ‘dishonest’ for the face Apple displays publicly. Following Google and Amazon now with a system that gives the user a choice to ‘participate’ is a good first step. Making the choice an ‘opt-in’ vs Google/Amazon’s ‘opt-out’ may help towards rebuilding trust.