Apple has once again proved that it listens to valid criticism with the immediate global suspension of the Siri listening program that attracted so much controversy.
When it comes to privacy, Siri listens
At issue was quality control.
A small number of conversational snippets were shared with third party human contractors for quality control purposes.
That sounds innocuous enough; the problem is that some of those conversational snippets were highly personal, and many were recorded without the people involved being aware that Siri was listening.
They may not even have made a conscious request.
Another problem: Apple’s terms and conditions didn’t make it clear that snippets were shared with third parties, and users were given no control or oversight over such use.
Overall, this wasn’t a good look for a company that puts so much store in privacy.
The good news is that in the battle between commercial need (human quality control vetting) and privacy, privacy has won.
Apple is suspending the program and plans to give customers more control over it in the future, the company said.
Update: Google has also suspended a similar scheme.
Privacy: 1; Surveillance: 0
In a statement supplied to TechCrunch, Apple said:
“We are committed to delivering a great Siri experience while protecting user privacy…
“While we conduct a thorough review, we are suspending Siri grading globally. Additionally, as part of a future software update, users will have the ability to choose to participate in grading.”
Grading is Apple’s term for the quality control process under which third-party operators would listen to snippets of conversation to figure out how accurately Siri had understood what was said.
This isn’t unusual – Amazon, Google and other voice recognition developers all do this.
[Also read: How ‘Find My’ Mac works in macOS Catalina and iOS 13]
Convenience versus services
However, as recognition of the need and value of privacy grows, we are all becoming more vigilant in our attempts to protect it.
This is exposing a clear division in tech industry business models between those who offer services for a fee and others who swap convenience for our personal data.
Apple doesn’t make its business from personal data – even the music, movie and photo recommendations it offers users are developed in part on the device.
With this in mind it also makes sense for Apple to provide tools to control this Siri grading process, and to ensure customers are aware that it happens at all.
It is something of a misstep that it hadn’t recognized this need before now.
Apple does listen
Critics frequently (and incorrectly) slate Apple as being a remote, arrogant entity.
While it is certainly true the company maintains some degree of public aloofness, history shows it nearly always hears and responds to fair criticism.
Its decision around Siri quality control grading is a perfect illustration of this, as were its historical decisions around Maps, Macs and iOS batteries.
The one thing I’m not clear about in this story is where it emerged from.
The first report – which cited an insider from Apple’s third-party quality control teams – appeared in the Guardian, which isn’t really known for cutting-edge Apple coverage.
Where did the source come from? What’s interesting about this is that the tale emerged as Apple faces attacks from multiple quarters around privacy and security – attacks it must have anticipated when it itself went on the attack around privacy at CES earlier this year.
What might this mean?
I think it means Apple’s message around the need for privacy and security is getting through, and those of its competitors who cannot match this commitment can see their market shifting.
They also recognize that Apple will continue to improve its privacy and security protections in the future. This is driving them to go on the offensive.
Within this context it’s going to be interesting to see how dirty this part of the game gets. It also behoves Apple watchers and tech writers to really verify any claims they see, as it is reasonable to expect that some will be vexatious.
A great deal of money is at stake and privacy and user control of it are becoming winning cards in the game. And not every player holds those cards.