Google may have been forced by regulators (temporarily at least) to stop its subcontractors listening to audio captured by its smart speaker, but Apple clearly sees the benefit of not waiting until it receives a stern letter from the authorities.
Last week The Guardian reported that – like Amazon and Google – Apple was farming out a small percentage of the audio recordings made by its Siri digital assistant to third-party contractors so that speech recognition could be improved. And sometimes that can result in personal and potentially sensitive information being exposed:
Siri can be accidentally activated when it mistakenly hears its “wake word”, the phrase “hey Siri”. Those mistakes can be understandable – a BBC interview about Syria was interrupted by the assistant last year – or less so. “The sound of a zip, Siri often hears as a trigger,” the contractor said. The service can also be activated in other ways. For instance, if an Apple Watch detects it has been raised and then hears speech, Siri is automatically activated.
The whistleblower said: “There have been countless instances of recordings featuring private discussions between doctors and patients, business deals, seemingly criminal dealings, sexual encounters and so on. These recordings are accompanied by user data showing location, contact details, and app data.”
The Guardian now reports that Apple has decided to suspend what it calls Siri “grading” globally, while it conducts a “thorough review.”
“We are committed to delivering a great Siri experience while protecting user privacy. While we conduct a thorough review, we are suspending Siri grading globally. Additionally, as part of a future software update, users will have the ability to choose to participate in grading.”
It is to Apple’s credit that it realised it should think again about this, rather than wait for a regulator to hit it with a cricket bat.
As The Guardian reports, Amazon is “the only major provider of voice assistant technology still using humans to check recordings in the EU.” One hopes that Amazon will also swiftly review whether it is handling its customers’ privacy appropriately.
Update: Amazon now lets you opt out of having humans review your Alexa conversations.