Apple’s Siri contractors will no longer hear you having sex, making drug deals

Google may have been forced by regulators (temporarily at least) to stop its subcontractors listening to audio captured by its smart speaker, but Apple clearly sees the benefit of not waiting until it receives a stern letter from the authorities.

Last week The Guardian reported that – like Amazon and Google – Apple was farming out a small percentage of audio recordings made by its Siri digital assistant to third parties in order to improve speech recognition. And sometimes that can result in personal and potentially sensitive information being exposed:

Siri can be accidentally activated when it mistakenly hears its “wake word”, the phrase “hey Siri”. Those mistakes can be understandable – a BBC interview about Syria was interrupted by the assistant last year – or less so. “The sound of a zip, Siri often hears as a trigger,” the contractor said. The service can also be activated in other ways. For instance, if an Apple Watch detects it has been raised and then hears speech, Siri is automatically activated.

The whistleblower said: “There have been countless instances of recordings featuring private discussions between doctors and patients, business deals, seemingly criminal dealings, sexual encounters and so on. These recordings are accompanied by user data showing location, contact details, and app data.”

The Guardian now reports that Apple has decided to suspend what it calls Siri “grading” globally, while it conducts a “thorough review.”

“We are committed to delivering a great Siri experience while protecting user privacy. While we conduct a thorough review, we are suspending Siri grading globally. Additionally, as part of a future software update, users will have the ability to choose to participate in grading.”

It is to Apple’s credit that they realised they should think again over this, rather than wait for a regulator to hit them with a cricket bat.

As The Guardian reports, Amazon is “the only major provider of voice assistant technology still using humans to check recordings in the EU.” One hopes that Amazon will also swiftly review whether it is handling its customers’ privacy appropriately.

Update: Amazon now lets you opt out of having humans review your Alexa conversations.


Graham Cluley is an award-winning keynote speaker who has given presentations around the world about cybersecurity, hackers, and online privacy. A veteran of the computer security industry since the early 1990s, he wrote the first ever version of Dr Solomon's Anti-Virus Toolkit for Windows, makes regular media appearances, and is the co-host of the popular "Smashing Security" podcast. Follow him on Twitter, Mastodon, Threads, Bluesky, or drop him an email.

2 comments on “Apple’s Siri contractors will no longer hear you having sex, making drug deals”

  1. David Carroll

    It would be a truly intelligent robot, indeed, to know when not to record or listen to private affairs, since privacy means something different to every human.

  2. Jim

    Why don't the executives and managers at Apple offer to have their conversations recorded? After all, they have a vested interest in keeping their company profitable.
