Apple reaffirms it has never worked with any government agency to create a backdoor in any product or service

On July 18, Jonathan Zdziarski, a former iOS jailbreaker and current iOS forensic scientist and law enforcement consultant, gave a talk at the HOPE X conference in New York City. Zdziarski's talk covered backdoors, attack points, and surveillance mechanisms in iOS. In it, he alleged that there are a number of ways for government agencies, including law enforcement, to get at the personal data you store on your iPhone, iPod touch, and/or iPad. Zdziarski posted slides from the talk, based on an earlier journal publication, on his website a couple of days ago. They've since been shared via other websites and social networks, and a lot of confusion and concern has arisen.

When reached for comment, Apple reiterated to iMore that it has never worked with any government agency to create a backdoor in any product or service:

"We have designed iOS so that its diagnostic functions do not compromise user privacy and security, but still provides needed information to enterprise IT departments, developers and Apple for troubleshooting technical issues," Apple told iMore. "A user must have unlocked their device and agreed to trust another computer before that computer is able to access this limited diagnostic data. The user must agree to share this information, and data is never transferred without their consent."

"As we have said before, Apple has never worked with any government agency from any country to create a backdoor in any of our products or services."

So, what's going on here?

When you connect your iPhone or iPad to iTunes on Mac or Windows — and choose to trust that computer — a pairing record is created that maintains that trust for future connections. Zdziarski claimed that if someone takes physical possession of that computer, they can steal those pairing records, connect to your device, and retrieve your personal information and/or enable remote logging. If they don't have your computer, Zdziarski claimed they can try and generate a pairing record by tricking you into connecting to a compromised accessory, like a dock (juice jacking), and/or by using mobile device management (MDM) tools intended for enterprise to get around safeguards like Apple's Trusted Device requestor.
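To make the pairing-record piece concrete, here is a minimal sketch, not Apple's or Zdziarski's tooling, of how someone with access to a Mac could enumerate the pairing records stored on it. It assumes the commonly reported location of /var/db/lockdown, where lockdownd keeps one property list per paired device (named by the device UDID); reading that directory typically requires admin privileges.

import Foundation

// Rough sketch only: list lockdown pairing-record plists on a Mac.
// Assumption: records live in /var/db/lockdown, one .plist per paired device.
let lockdownDir = URL(fileURLWithPath: "/var/db/lockdown")

do {
    let entries = try FileManager.default.contentsOfDirectory(
        at: lockdownDir,
        includingPropertiesForKeys: nil,
        options: [.skipsHiddenFiles]
    )
    for entry in entries where entry.pathExtension == "plist" {
        // Each plist represents a computer-to-device trust relationship.
        print("Pairing record: \(entry.lastPathComponent)")
    }
} catch {
    print("Could not read \(lockdownDir.path): \(error)")
}

A device you've never plugged into that Mac and trusted won't have a record there, which is why the trust prompt, and physical possession of a trusted computer, matter so much in Zdziarski's scenario.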

Because the NSA surveillance controversy is still fresh in many people's minds, Zdziarski added a "don't panic" statement on his blog, emphasizing that he wasn't accusing Apple of working with the NSA, though he does suspect the NSA might be using the techniques he outlined to collect data.

Zdziarski also praised iOS 7 security, saying that Apple has hardened its devices against typical attacks, including making changes that have shut down a "number of privately used spyware apps." However, he'd like to see Apple strengthen security further with asymmetric encryption of incoming messages and media, the file system equivalent of "session keys," a boot password, and a backup password.

Apple is rolling out new security and privacy protections as part of its upcoming iOS 8 software update, scheduled for release this fall. These improved features include MAC address randomization to prevent stores from tracking you as you walk around to shop, "while-in-use" rather than "always-on" location permissions to prevent apps from tracking you when they don't need to, better privacy controls for your contacts, always-on VPN to secure your connections, and more.
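As a rough illustration of the "while-in-use" change, here's a hedged Swift sketch of what an app is expected to do under iOS 8's CoreLocation changes. The class name is made up for the example, but requestWhenInUseAuthorization and the NSLocationWhenInUseUsageDescription Info.plist key are the documented pieces.

import CoreLocation

// Sketch of requesting the narrower "while-in-use" location permission in iOS 8.
// Requires an NSLocationWhenInUseUsageDescription string in the app's Info.plist.
final class LocationPermissionDemo: NSObject, CLLocationManagerDelegate {
    private let manager = CLLocationManager()

    func start() {
        manager.delegate = self
        // Ask only for foreground ("while-in-use") access, not "always".
        manager.requestWhenInUseAuthorization()
    }

    func locationManager(_ manager: CLLocationManager,
                         didChangeAuthorization status: CLAuthorizationStatus) {
        if status == .authorizedWhenInUse {
            manager.startUpdatingLocation()
        }
    }
}

The point of the narrower permission is that an app which only needs your location while it's on screen no longer has to ask for, or receive, always-on tracking.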

Bottom line, security is constant vigilance, and companies are only ever as good as the speed and efficacy of their last patch. Following Zdziarski's presentation, there'll be a lot more attention paid to just these kinds of data leaks, and that's good for all of us. Until then, if you're concerned about privacy and security, Apple provides several tools and features you can use to further lock down your iPhone, iPod touch, and/or iPad.

Rene Ritchie

Editor-in-Chief of iMore, co-host of Iterate, Debug, Review, Vector, and MacBreak Weekly podcasts. Cook, grappler, photon wrangler. Follow him on Twitter and Google+.

Reader comments

You can turn it off though. If you don't agree to the pop-up to share data with Apple, the data is only shared with your own computer. Every device you work with also has to be explicitly approved. If some idiot lets his iPhone be docked with some unknown accessory at an airport or a party or whatever, then that's not Apple's problem. As far as I understand Apple's answer, none of this data ever leaves your own devices without your explicit permission.

Also, no system will ever be secure enough for a security expert, and if it was, no one would EVER want to use it except them.

Case in point ... "boot password" and "backup password." Those would completely fuck up the user experience and seriously inconvenience, or possibly even destroy all the data of, most users. The gain would be some theoretical extra fractional percentile of "safety."

I'm hoping Apple plugs these data leaks, and soon. At the same time, in order not to panic people, it should be noted that carrying out these attacks doesn't seem trivial.

If the NSA is looking into you and has that level of access to your systems, we're now arguing about the size of the bonfire in the forest fire.

Any time someone with skill and resources takes physical possession of your computer and devices, let alone an organization of the caliber of the NSA, security is going to be tested to the extreme.

One extremely useful thing Apple could do would be to add a place on the iOS device to see the list of trusted external computers, and to manage or remove them.

Sent from the iMore App

Yup! Absolutely! I've paired one of my phones to install iOS betas. The other one has only ever known the gentle touch of iCloud. But I'd like that list in Settings.

I think when you read Zdziarski's full posting (not just the PowerPoint slides) you will see he is talking about way more than what was documented on that page.

Further, Apple's denial isn't a denial of putting these back doors into your device; it's only a denial of working with any government agency when it put the back doors into your device. They went ten miles beyond what any law enforcement would be authorized to collect.

Further, they didn't provide any method of turning these features off.
They could have limited it to exactly what law enforcement would ever be allowed to take.
They could have not made these features available at all. Diagnostics? Come on. Even you don't believe that.

Finally, having read through all the slides, it seems a misnomer to call these "leaks," as that implies something accidental; he makes a pretty compelling case that the services that can be used to extract data, bypassing normal measures, are running by design. This does not mean we should don the tinfoil hats and panic, but I for one think Apple owes us answers to the "Questions for Apple" slide.

(and not just because question #4 was what I asked in my other comment :) )

Sent from the iMore App

All this talk by people about backdoors and spying is getting old fast. I know we all love our privacy but really even those that claim to be super secretive and claim to have a tiny footprint when it comes to their personal information are deluding themselves.

I know that the 'town crier' living next door to me retransmits every phone conversation they hear me have, and I am sure I am not the only person who knows it doesn't take any special science for the world to know what is going on in their lives.

So do I think it makes a difference if there is or isn't a secret back door that Apple has made? No! Do I care? NO!

I am coming back to Apple with the launch of the iPhone 6 (or whatever they choose to call it) because I tried Android and it's not for me. It has nothing to do with it being a gateway to my inner secrets; it's because I miss iOS. I miss the fact that if I go to the CNET app it works every time. I miss that if I go to the Google Maps app it works every time. I miss that everything works so much smoother on iOS. (Yes, all of these regularly give that '...has stopped working' message on Android, and that's on a Nexus 5 -- heaven forbid anything like a Samsung that has been tweaked and messed with.)

I do not have a concern about privacy, maybe I am not paranoid enough. I am sure that if the NSA, GCHQ, FBI, CIA or Scooby and the other occupants of the Mystery Machine are determined enough they will get the information off my phone anyway!

No foil on my head - especially not here in the desert - I would end up with a roasted head.

Apparently messages, photos, videos, contacts, audio recordings, and call history are "limited diagnostic data" according to Apple. Go figure.

Apple created undocumented services that bypass security, can't explain why they are there, and continues to parrot the "we don't work with governments" line. I hope you guys wrote back challenging this dumb statement by Apple.

The service com.apple.mobile.house_arrest is perhaps my favorite as it pulls data (those pesky things like username and password needed for "diagnostics") from third party apps and it has a cool name.

'I don’t buy for a minute that these services are intended solely for diagnostics. The data they leak is of an extreme personal nature. There is no notification to the user. A real diagnostic tool would have been engineered to respect the user, prompt them like applications do for access to data, and respect backup encryption. Tell me, what is the point in promising the user encryption if there is a back door to bypass it?' - Zdziarski

The evidence is pretty damning as a matter of fact. I don't know how anyone can make excuses for this blatant security flaw. I'll be paying attention to Steve Gibson's security podcast today.

When does the tin foil hat crowd get compromised to the point they go away? I'm waiting for that day. In the meantime, everyone is free to go somewhere else if they think Apple is intentionally allowing easy access to their data.