Siri 'distance activation hack'—what you need to know!

Siri (Image credit: iMore)

Researchers from ANSSI, France's National Information System Security Agency, have demonstrated a "hack" where, using radio transmitters from a short distance away, they can trigger Apple's Siri and Google Now under specific circumstances. Wired:

The researchers' silent voice command hack has some serious limitations: It only works on phones that have microphone-enabled headphones or earbuds plugged into them. Many Android phones don't have Google Now enabled from their lockscreen, or have it set to only respond to commands when it recognizes the user's voice. (On iPhones, however, Siri is enabled from the lockscreen by default, with no such voice identity feature.) Another limitation is that attentive victims would likely be able to see that the phone was receiving mysterious voice commands and cancel them before their mischief was complete.

Wait, why do Wired's headline and lede paragraph focus only on Apple's Siri when the demo and the rest of the article also cover Google Now?

Good question.

But my iPhone asked about Siri at setup and does have Voice ID. What gives?

Mine too, and I'm not entirely sure. There seem to be several glaring errors in the article as published.

  • As of iOS 9, the iPhone absolutely does have a Voice ID feature, which is part of the setup process.
  • If you buy a new iPhone that comes pre-installed with iOS 9, it will ask during setup if you want to enable "Hey Siri", and then require you to go through a setup process to enable it. (And, if you do, it defaults to Lock screen access, because that's the whole point of hands-free.)
  • If you upgrade an existing iPhone to iOS 9 and it supports "Hey Siri", the first time you enable it or toggle it off and back on, it will require you to go through the setup.
  • Only iPhone 6s and iPhone 6s Plus can do persistent "Hey Siri". Older iPhones can only do "Hey Siri" when plugged into power, and if expressly enabled in Settings. While battery packs are a possibility, most iPhones connected to headphones on the go probably won't be in that state.

The article does state that, absent "Hey Siri", the "hackers" can spoof the electrical signal a headset's button press sends down the cord to trigger Siri.

Doesn't Siri give audio replies and confirmations as well?

Indeed. You don't have to be watching the screen to catch mysterious voice commands, because Siri answers with audio replies and confirmations you can hear.

While plugged-in headphones stuffed into a pocket are possible, that's probably not the most common situation.

So why does the headline say "silently" hack?

The radio signal beamed to the headset "antenna" is presumably "silent". Siri's responses and confirmations wouldn't be.

At what distances does this "hack" work?

Wired says 16 feet in its headline, but elaborates later:

In its smallest form, which the researchers say could fit inside a backpack, their setup has a range of around six and a half feet. In a more powerful form that requires larger batteries and could only practically fit inside a car or van, the researchers say they could extend the attack's range to more than 16 feet.

What can be done if someone activates voice from a distance?

From earlier in the article:

Without speaking a word, a hacker could use that radio attack to tell Siri or Google Now to make calls and send texts, dial the hacker's number to turn the phone into an eavesdropping device, send the phone's browser to a malware site, or send spam and phishing messages via email, Facebook, or Twitter.

Communications can be triggered directly using Siri. Other features, like using Siri to go to a website, require a passcode or Touch ID unlock first. And that's assuming you could get Siri to recognize the likely obscure, not easily rendered name of a malicious website and request it in the first place.

It's also unclear how an audio transmission could form spam or phishing messages precisely enough to be functional. (Again, try getting Siri to render a complex URL for you and see how far you get.)

But everything from podcasts to prankster friends has been triggering voice activation for years, right?

Right. More Wired:

any smartphone's voice features could represent a security liability—whether from an attacker with the phone in hand or one that's hidden in the next room.

While true, of course, it's absolutely nothing new. When Google Now debuted, especially on Google Glass, CES pranksters loved to jump into rooms filled with early adopters and yell out search requests for... various parts of the human anatomy.

It's why Apple and other vendors have added Voice ID technology.

The difference here is a clever use of transmitters by security researchers unfortunately wrapped in what seems like really poor reporting.

Can Apple and Google prevent this type of "hack"?

The researchers make some recommendations for mitigating the "hack", including the ability to set custom trigger words. Some Android devices let you do that already, and I've been wishing for it on iOS for a while as well.

To prevent spoofing of the button press, they also recommend enhanced shielding in headphone cords. Though no doubt an added expense, if even just the most popular brands implemented it, that would reduce the potential attack surface of the "hack".

So, should I be worried about any of this?

As usual, it's something to be aware of but not overly concerned about. Once again, we should all be more concerned about the state of security reporting at mainstream publications.

Siri and Google Now are enabling and empowering technologies that help people live better lives. We should all be informed and educated about any potential security issues, not sensationalized or made to feel scared.

What, if anything, should I do?

The "Hey Siri" setup in iOS 9 implements Voice ID, which greatly reduces the chances of third-party activations, whether accidental, prank, or malicious. Keeping your headphones on when they're plugged in mitigates the potential consequences of any third-party activations as well, because you can hear them and intervene.

Security and convenience are almost always at odds. Siri, Google Now, "Hey Siri", and "OK Google" provide increased convenience at the expense of some security. If you don't use or need voice activation or Lock screen access, by all means turn them off.
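On iOS 9, if memory serves (Settings layouts shift between versions, so double-check on your device), the relevant switches live here:

  • Settings > Touch ID & Passcode > Allow Access When Locked > Siri: toggle off to remove Lock screen access.
  • Settings > General > Siri > Allow "Hey Siri": toggle off to disable voice activation.
  • Settings > General > Siri: toggle off to disable Siri entirely.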

Updated 3:30pm: Further explained how "Hey Siri" Voice ID setup works.

Rene Ritchie

Rene Ritchie is one of the most respected Apple analysts in the business, reaching a combined audience of over 40 million readers a month. His YouTube channel, Vector, has over 90 thousand subscribers and 14 million views and his podcasts, including Debug, have been downloaded over 20 million times. He also regularly co-hosts MacBreak Weekly for the TWiT network and co-hosted CES Live! and Talk Mobile. Based in Montreal, Rene is a former director of product marketing, web developer, and graphic designer. He's authored several books and appeared on numerous television and radio segments to discuss Apple and the technology industry. When not working, he likes to cook, grapple, and spend time with his friends and family.

11 Comments
  • This kind of stuff is always like it's possible but highly unlikely. I'm sure your chance of getting hit by lightning is higher than the chance of having earphones plugged in, not in your ears, while someone hacks you through them. This is another in a long line of possible hacks that will never materialize. Look at the Stagefright hacks: very scary, yet no confirmed actual victims. Should this be patched? Hell yeah, but don't worry about this.
  • This is a targeted attack so the likelihood of having this happen to you is almost zero. It is important if you are a possible political or corporate espionage target though.
  • It seems crazy that both Siri and Google Now can respond to a radio signal as opposed to a voice, but I guess that is a physical limitation of using a headset.
  • As others have commented, the methods seem really obscure and thus chances of seeing this in the wild exceptionally low. As a novel security attack vector, it bears investigation, but VoiceID (and whatever Android comes up with) technology would render this much less feasible. That said, I have absolutely been one of "those pranksters" that grabbed the unattended iPhone of one of my coworkers and used Siri to text someone that worked for him. Hilarity ensued.
  • The suggestion that a “larger battery pack” is required for attacks farther away than 6', as well as the picture of a high-gain antenna, suggests the radio has to transmit a relatively high-power signal in order to get into the phone's earphone cable, and overwhelm the audio input circuits into non-linearity. (Radio signals induce positive and negative voltages at too high a rate for audio processing, so the input circuit has to treat the positive voltages differently from the negative ones, much as primitive radios used diodes that cut off one voltage entirely.) That strong a signal risks frying other parts that the radio gets into. I can't tell if it'd rise to the level of microwave heating (although the photo of the antenna suggests similar frequencies), so this attack might be physically dangerous to people, as well as detectable when your foil-lined cigarette pack next to the iPhone goes up in smoke. Fun for fantasy. Not news.
  • "Keeping your headphones on when they're plugged in mitigates the potential consequences of any third-party activations as well—because you can hear them and intervene." You can also unplug your headphones when you're done using them. That way you either hear the attack while they're plugged in or there's no vector for the attack in the first place. Seems easy enough.
  • The iPhone 6 and iPhone 5s also only respond to a user's voice when using "Hey, Siri", though only the iPhone 6s can do it without being plugged in. This was a feature added in iOS 9, rather than a new hardware feature of the iPhone 6s.
  • It's both. The reason the 6s can do it without being plugged in is the new motion coprocessor chip. Older chips would use too much battery while doing this. Sent from the iMore App
  • I'd assume, from the fact it requires headphones plugged in and the power of a backpack-sized battery, that it has nothing to do with voice activation. Instead they're inducing a signal in the headphone cable that initially reproduces the "hold the play button" activation method, NOT "hey siri". Voice ID only seems to apply to the initial "hey siri" activation - "say hey siri" from the Terminal won't activate Siri for me, but Siri will accept subsequent commands in that voice, which is nothing like mine - so once they've activated Siri by mimicking the button press, the voice they use for subsequent commands doesn't matter. Siri is quite limited while the phone is locked, but what it can do is potentially dangerous, and of course your phone may not be locked when the attack is made.
  • Rene, your VoiceID link takes readers to a search that returns no results. I followed the link because I didn't know there was a VoiceID bit to iOS 9; I thought I was just training Siri on an accent/pronunciation, not my actual voice.
  • Much more likely to be hacked by responding to links for winning a car, like someone's mother-in-law, when reading an article like this. (See what is apparently spam above.) But articles about big targets get more attention...