Researchers from ANSSI, France's national agency for information systems security, have demonstrated a "hack" where, using transmitters from a short distance away, they can trigger Apple's Siri and Google Now under certain specific circumstances. Wired:
The researchers' silent voice command hack has some serious limitations: It only works on phones that have microphone-enabled headphones or earbuds plugged into them. Many Android phones don't have Google Now enabled from their lockscreen, or have it set to only respond to commands when it recognizes the user's voice. (On iPhones, however, Siri is enabled from the lockscreen by default, with no such voice identity feature.) Another limitation is that attentive victims would likely be able to see that the phone was receiving mysterious voice commands and cancel them before their mischief was complete.
Wait, why do Wired's headline and lede paragraph focus only on Apple's Siri when the demo and the rest of the article talk about Google Now?
But my iPhone asked about Siri at setup and does have Voice ID, what gives?
Mine too, and I'm not entirely sure. There seem to be several glaring errors in the article as published.
- As of iOS 9, the iPhone absolutely does have a Voice ID feature, which is part of the setup process.
- If you buy a new iPhone that comes pre-installed with iOS 9, it will ask during setup if you want to enable "Hey Siri", and then require you to go through a setup process to enable it. (And, if you do, defaults to Lock screen access because that's the whole point of hands-free.)
- If you upgrade an existing iPhone to iOS 9 and it supports "Hey Siri", the first time you enable it or toggle it off and back on, it will require you to go through the setup.
- Only iPhone 6s and iPhone 6s Plus can do persistent "Hey Siri". Older iPhones can only do "Hey Siri" when plugged into power, and if expressly enabled in Settings. While battery packs are a possibility, most iPhones connected to headphones on the go probably won't be in that state.
The article does state that, absent "Hey Siri", the "hackers" can spoof the audio signal used by the headset button to trigger Siri.
Doesn't Siri give audio replies and confirmations as well?
Indeed. You don't have to be watching the screen to notice mysterious voice commands, because Siri responds with audio replies you can hear.
While connected headphones stuffed into pockets are possible, they're probably not the most common situation.
So why does the headline say "silently" hack?
The radio signal beamed to the headset "antenna" is presumably "silent". Siri's responses and confirmations wouldn't be.
At what distances does this "hack" work?
Wired says 16 feet in its headline but later elaborates:
In its smallest form, which the researchers say could fit inside a backpack, their setup has a range of around six and a half feet. In a more powerful form that requires larger batteries and could only practically fit inside a car or van, the researchers say they could extend the attack's range to more than 16 feet.
What can be done if someone activates voice from a distance?
From earlier in the article:
Without speaking a word, a hacker could use that radio attack to tell Siri or Google Now to make calls and send texts, dial the hacker's number to turn the phone into an eavesdropping device, send the phone's browser to a malware site, or send spam and phishing messages via email, Facebook, or Twitter.
Communications are something that can be triggered directly using Siri. Other features, like using Siri to go to a website, require a passcode or Touch ID unlock first. And that's assuming you could get Siri to recognize the likely obscure, not easily rendered name of a malicious website and request it to begin with.
It's also unclear how an audio transmission could form spam or phishing messages precisely enough to be functional. (Again, try getting Siri to render a complex URL for you and see how far you get.)
But everything from podcasts to prankster friends have been triggering voice activation for years, right?
Right. More Wired:
[A]ny smartphone's voice features could represent a security liability—whether from an attacker with the phone in hand or one that's hidden in the next room.
While true, of course, it's absolutely nothing new. When Google Now debuted, especially on Google Glass, CES pranksters loved to jump into rooms filled with early adopters and yell out search requests for... various parts of the human anatomy.
It's why Apple and other vendors have added Voice ID technology.
The difference here is a clever use of transmitters by security researchers unfortunately wrapped in what seems like really poor reporting.
Can Apple and Google prevent this type of "hack"?
The researchers make some recommendations for mitigating the "hack", including the ability to set custom trigger words. Some Android devices let you do that already, and I've been wishing for it on iOS for a while as well.
To prevent spoofing of the button press, they also recommend enhanced shielding in headphone cords. Though no doubt an added expense, even if only the most popular brands implemented it, that would reduce the attack surface of the "hack".
So, should I be worried about any of this?
As usual, it's something to be aware of but not overly concerned about. Once again, we should all be more concerned about the state of security reporting at mainstream publications.
Siri and Google Now are enabling and empowering technologies that help people live better lives. We should all be informed and educated about any potential security issues, but not sensationalized or frightened.
What, if anything, should I do?
The iPhone 6s implements Voice ID, which greatly reduces the chance of third-party activations, whether accidental, prankish, or malicious. Keeping your headphones on while they're plugged in also mitigates the potential consequences of any third-party activation, because you can hear it and intervene.
Security and convenience are almost always at odds. Siri, Google Now, "Hey Siri", and "OK Google" provide increased convenience at the expense of some security. If you don't use or need voice activation or Lock screen access, by all means turn them off.
Updated 3:30pm: Further explained how "Hey Siri" Voice ID setup works.