From the Editor's Desk: Siri and privacy: Is it worth it?

HomePod (Image credit: iMore)

This past week, news broke that contractors working on Siri's quality control regularly hear up to 30 seconds of our personal conversations, audio Siri was never supposed to capture. According to an unnamed source who spoke to The Guardian, some of that private audio includes conversations with doctors, drug deals, and even people engaged in sexual activity.

There's a lot to unpack here, and iMore's Apple analyst Rene Ritchie has already done some research, spoken with experts in the field of virtual assistant learning, and published a deep-dive analysis of the situation.

Apple responded to the issue with an explanation of its processes and rules regarding Siri training, which are also covered in its white paper and user agreement:

"A small portion of Siri requests are analyzed to improve Siri and Dictation. User requests are not associated with the user's Apple ID. Siri responses are analyzed in secure facilities and all reviewers are under the obligation to adhere to Apple's strict confidentiality requirements."

While it's never been a secret that Apple, like Amazon and Google, has a quality-assurance program for Siri that includes humans hearing accidental recordings of conversations, this reminder (not revelation) strikes me harder coming from Apple. I expect Amazon and Google to be less concerned with my privacy; I know that my browsing and online purchasing activities are being shared with marketers. Apple, on the other hand, stands on a very large soapbox of privacy. So, even though humans listening to recordings of our Siri queries is necessary (at least for now) to build a better experience, I wish Apple would put this information at the forefront of its devices' interfaces.

That friendly little handshake logo you see when you first install a new operating system, the one that links to all the information about Apple's privacy policies? Not clear enough for me. The fact that you can disable Siri altogether and even opt out of sharing your iCloud activity with Apple's analytics for quality-assurance purposes? Not explanatory enough.

It's not that Apple, Google, Amazon, and other companies working on virtual assistant technology are being nefarious. They do provide details on what happens when we purposefully or accidentally trigger our assistants. It's that we haven't asked these companies to explain it to us more clearly. We accept the Terms and Conditions and start asking questions about the weather or our schedules.

It's only now that we, as the consumers of these wonderful virtual assistants, are starting to ask more important questions about who's listening, why they're listening, and how we can opt out of being listened to.

This is a whole new world, technologically speaking, and we're only starting to grasp the reality of how much privacy we give away in exchange for convenience. Even when we trust a company to protect our privacy, as I do with Apple, we should consider what's being done behind the scenes and ask those holding the keys to help us understand it better.

It is, after all, our privacy, not theirs.

Lory

Lory Gil

Lory is a renaissance woman, writing news, reviews, and how-to guides for iMore. She also fancies herself a bit of a rock star in her town and spends too much time reading comic books. If she's not typing away at her keyboard, you can probably find her at Disneyland or watching Star Wars (or both).