What you need to know
- A report claims Apple contractors hear private information in accidental Siri recordings.
- The information can allegedly be used to identify users and track their location.
- Apple released a statement saying a small portion of Siri requests are analyzed to improve Siri and dictation.
According to a new report from The Guardian, Apple contractors often hear private user information thanks to accidental Siri recordings. These recordings are passed on to contractors, who grade whether Siri's responses were appropriate.
The Guardian, speaking to an anonymous source, claims contractors hear a wide range of recordings, from drug deals to users divulging medical information.
"There have been countless instances of recordings featuring private discussions between doctors and patients, business deals, seemingly criminal dealings, sexual encounters and so on," The Guardian's source said. "These recordings are accompanied by user data showing location, contact details, and app data."
Meanwhile, The Guardian highlights the fact that Apple does not explicitly disclose that contractors may listen to Siri recordings. AppleInsider, however, disputes that claim, saying Apple has disclosed from the beginning that some Siri recordings are reviewed by humans.
The source speaking to The Guardian says they were motivated to speak out due to fears of such information being misused.
"There's not much vetting of who works there, and the amount of data we're free to look through seems quite broad," the source said. "It wouldn't be difficult to identify the person that you're listening to, especially with accidental triggers - addresses, names and so on."
In response to the report, Apple said only about 1% of daily Siri activations are heard by contractors.
Apple also provided a statement to iMore's Rene Ritchie.
Apple isn't the only company that uses human oversight of its artificial intelligence. Both Amazon and Google employ similar practices in an effort to improve the quality of their respective voice assistants.
How about Apple finally admitting what I have been saying all along: their products are no more private than any other company's. Apple are masters of the reality distortion field.
The fact that the contractors told The Guardian what type of information was heard means they broke Apple's confidentiality rules, and you know someone's going to lose their job. I'd much rather Apple did this in a way that didn't require listening to people's recordings, but how else do they improve the service? I think all voice assistants require recordings to be listened to in order to improve them, so just be careful what you say. "Accidental Siri recordings" makes this sound like clickbait: most accidental Siri recordings come from the Apple Watch, and otherwise they come from "Hey Siri", which you're free to turn off. It's a lot different from Alexa, which has recordings from even before you've said "Alexa…"
If you read the story, it's exactly the same thing. You really are special, aren't you?
I’ve just re-read the story, and I still can’t see how it’s the same thing. Siri is only recording when it’s activated, either via the side button or by “Hey Siri”. Alexa records all the time, even before the trigger word is said.
"Alexa records all the time" Where do you get that? They both have to listen all the time or they wouldn't hear Hey Siri or Alexa. I imagine they both have buffers to capture that 'always on' speech in order to help improve trigger detection and rejection. My Echo is set to beep when it feels triggered. That does happen occasionally when nothing like Alexa was spoken, but I know when she thinks she heard it.
Does this really surprise anyone? Just curious: if criminal activity is heard, do they have an obligation to report it? Should be an interesting court case. Is any smart device really private? Apple says yes; well, apparently not.
In the UK you can't use recordings as evidence if the people involved aren't aware. "Smart device" isn't the same as "voice assistant": smart devices can be secure, but voice assistants require snippets of recordings to be analysed to see how the assistant parsed them and to improve it. You're free to turn Siri off on your iPhone, or turn off "Hey Siri" so that Siri is only active when you want it to be.