On iOS, 'backdoors', and the eternal war between security and convenience

Earlier this week, forensic researcher Jonathan Zdziarski's work on security exploits in iOS pairing records and potential data leaks in diagnostic services went viral. Unfortunately, it was his slides, which used more provocative language and lacked the context of his talk, and not his paywalled yet far more understandable journal article, that made the rounds. Tragically, many in the media pounced on the attention-getting potential, posting alarming articles that did nothing but spread fear, uncertainty, and doubt (FUD) to a mainstream customer base that deserves much, much better. Apple responded with comments to iMore on Monday, and with a Knowledge Base (KB) article on Tuesday. However, there's been no word yet on whether the exploits and potential data leaks will be closed and, if they will be, how soon. So, what does it all mean?

In order to understand all of this, it's crucial to remember that security and convenience are eternally at war with each other. What makes something like an iPhone or iPad convenient for mainstream use can also compromise the absoluteness of its security. While some people might dearly wish every single action on their device required a long pseudorandom password, retina scan, and Lightning dongle, other people might wish to never see a single passcode or confirmation dialog cross their screens ever again in their lives. Apple's job is to find the best balance for the most people, most of the time.

Locking down the Lock screen

The Lock screen is a classic example. It has two jobs that exist in direct opposition. One job is to protect your iPhone from unintentional or unauthorized access. It does this via swipe-to-unlock, passcode, Touch ID, or password.

The other is to pass through communications and functionality. That includes legally mandated services like emergency dialing; desirable features like toggles and controls, interactive notifications and widgets, the voice assistant and passes, and media controls; and important functions like calls and messages.

You want to be able to answer the phone quickly, but not for anyone to be able to get into your contacts. You want to be able to toggle the flashlight on immediately, but not have anyone else toggle airplane mode to take your device offline. You want to be able to see who's messaging you, but you don't want people you don't trust seeing what they messaged. These conflicts of interest create incredible complexity, and that complexity is why we see Lock screen bypasses in the news, and in the patch lists of iOS updates.

To quote Steve Gibson from this week's Security Now! podcast (starting roughly 1:10:00 in), the Lock screen as it currently exists is, after the first passcode entry following a reboot, an interface lock, not a data lock. That can and should always be improved, but those improvements will always be a compromise in furtherance of both of its jobs.
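
For developers, the closest thing to a true data lock is iOS's Data Protection system, which ties file encryption keys to the device's lock state. Here's a minimal Swift sketch, with a hypothetical function name, of opting a file into the strongest protection class:

```swift
import Foundation

// A minimal sketch, assuming an iOS app that opts a sensitive file into
// NSFileProtectionComplete, the strongest Data Protection class: its
// decryption key is evicted shortly after the device locks, so the file
// is genuinely data-locked, not just interface-locked.
func saveSecret(_ data: Data, to url: URL) throws {
    try data.write(to: url, options: [.atomic, .completeFileProtection])
}
```

Files in the weaker "complete until first user authentication" class, by contrast, stay readable from first unlock until the next reboot, which is exactly the distinction Gibson is drawing.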

Pairing the records

Pairing records are another example of the tension that exists between security and convenience. They were designed to bring the pre-post-PC era iTunes sync mechanism some manner of initial security, followed by persistent convenience. The upside is, after tapping the "Trust this Computer" dialog, you never have to do it again. The downside is, after tapping the "Trust this Computer" dialog, you never have to do it again.

This too can and should always be improved unless and until we fully leave the PC for the cloud. Pairing records should be secured against pilferage, expired after going unused for some period of time, and exposed in some interface so they can be visually audited and manually deleted when and as needed.
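
To illustrate, pairing records are reportedly stored as property lists on the trusted computer, on OS X under /var/db/lockdown. Here's a hedged Swift sketch of the kind of audit tool described above; the path, file naming, and 30-day threshold are all assumptions, not Apple policy:

```swift
import Foundation

// Flags pairing records that haven't been touched in a while, assuming
// one plist per paired device under /var/db/lockdown (needs root).
let lockdownDir = URL(fileURLWithPath: "/var/db/lockdown")
let maxAge: TimeInterval = 30 * 24 * 60 * 60 // 30 days, an arbitrary cutoff

let records = (try? FileManager.default.contentsOfDirectory(
    at: lockdownDir,
    includingPropertiesForKeys: [.contentModificationDateKey])) ?? []

for record in records where record.pathExtension == "plist" {
    guard let modified = (try? record.resourceValues(
        forKeys: [.contentModificationDateKey]))?.contentModificationDate
    else { continue }
    let age = Date().timeIntervalSince(modified)
    print("\(record.lastPathComponent): last touched \(Int(age / 86400)) days ago")
    if age > maxAge {
        // A real tool would ask before deleting; this one only flags.
        print("  -> candidate for manual expiry")
    }
}
```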

What "undocumented" means

Likewise, providing features intended to allow enterprise mobile device management (MDM) systems to reclaim a device from a deceased or fired employee, or an employee under investigation, opens the device up to those features being abused. Having a device wiped instead of re-provisioned unless an explicit consent action is performed to prove physical possession might strike a better balance.
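
As a deliberately simplified sketch of that balance, in Swift, with every name invented for illustration rather than taken from any real MDM API:

```swift
// Hypothetical reclaim policy: only hand the device over intact if someone
// proves physical possession; otherwise err on the side of wiping it.
enum ReclaimAction {
    case reprovision // re-assign the device with data and enrollment intact
    case wipe        // erase everything first
}

// "consentProvenOnDevice" stands in for an explicit consent action,
// such as entering the passcode on the device itself.
func handleReclaim(consentProvenOnDevice: Bool) -> ReclaimAction {
    return consentProvenOnDevice ? .reprovision : .wipe
}
```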

Packet sniffers intended for developers can also be found on any device by those who know where to look. File relays intended to provide diagnostic information can be built far too broadly when they should be scoped narrowly.

Of course, if someone has physical access to your device and has broken through the security, they could activate or install sniffers, dump data, and do pretty much anything their skills and resources allow. However, making it harder is always better. Keeping dev tools restricted to dev modes and eliminating data from diagnostics that isn't absolutely necessary to those diagnostics is always better.
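
As a sketch of that principle, consider a hypothetical diagnostic relay, in Swift, that serves only an explicit allowlist of data sources, and nothing at all unless the device is in a developer mode. The service and source names are invented for illustration:

```swift
// "Narrow, not broad": deny by default, allow only what diagnostics
// absolutely need, and only when a developer mode is switched on.
struct DiagnosticRelay {
    let developerModeEnabled: Bool // off on customer devices
    let allowedSources: Set<String> = ["CrashLogs", "NetworkDiagnostics"]

    func canServe(_ source: String) -> Bool {
        guard developerModeEnabled else { return false } // dev tools stay in dev mode
        return allowedSources.contains(source)           // no blanket access
    }
}

let relay = DiagnosticRelay(developerModeEnabled: false)
print(relay.canServe("Photos")) // false: personal data was never on the list
```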

That some of the services in question are "undocumented", however, is a red herring. With some notable exceptions like WebKit, iOS isn't an open source operating system. Apple keeps private interfaces and services private. Data leaks are absolutely a problem, but being "undocumented" isn't. That part is perfectly normal unless and until it becomes packaged for developers or customers and made public.

Bottom line

The war between security and convenience even plays out across the field of jailbreak. Many people anxiously await a new jailbreak. They can't wait to try it out and to spread the word. Yet jailbreaks also trade the security Apple has built into iOS for the convenience of custom tweaks and the ability to run unsigned code. That some exploits are celebrated and others used to induce panic shows just how poorly security is often communicated.

To quote Gibson again, to stand up is to be at constant risk of falling down. To provide convenience is to be at constant risk of compromising security.

Zdziarski, much to his credit, pleaded with journalists not to overreact to his findings. Not to scare people. He pointed out the advancements in iOS 7 security and how Apple has hardened the iPhone and iPad against typical attacks. Some of that got mentioned alongside the cries of "NSA backdoors" and "undocumented services" and "packet sniffers on zillions of phones". Yet those cries ultimately stole readers' attention from the real issue: that iOS security has to get better. Always. As hard as it is. As tricky as it is. As sum-of-all-compromises as it is. It has to get better, always, because that's Apple's job.

Phenomenal new features like Extensibility and Continuity have to be added while preserving or improving security because that's Apple's job.

It's the job of people reporting on security to read the material, even the paywalled stuff, numerous times; to understand it as best as possible; to consult with experts as needed; to digest it; and ultimately to explain it as sanely and succinctly as possible to people who shouldn't be made scared for the sake of sensationalism, but informed and empowered.

Hopefully the FUD surrounding Zdziarski's work will burn itself out, and the work and the attention will go where they belong: toward fixing exploits and making things better. And hopefully Zdziarski and others like him keep finding and reporting anything that gets missed so even more can be fixed and made better.

Because, in the eternal war between security and convenience, we the customers are the ones who need to win.

Rene Ritchie

Editor-in-Chief of iMore, co-host of Iterate, Debug, Review, Vector, and MacBreak Weekly podcasts. Cook, grappler, photon wrangler. Follow him on Twitter and Google+.


Reader comments

I don't think the security-versus-convenience rubric applies to Zdziarski's work, unless there is some convenience to extraction of user data like messages and photos.

It applies to the pairing records, which allow entry to get to the diagnostics. It applies to MDM, which can be used to force pairing. It applies to the passcode decryption.

I'll edit to make it more clear, step by step.

...but it does not apply to the types of data exposed in Zdziarski's paper. If there is a diagnostic routine that involves access to the contents of my messages or photos, I would *love* to hear precisely what that is. After they explain that, they can tell me what convenience I am enjoying in return for the trade-off of having those compromisable services running, with access to that data. Yes, the chance may be remote, but, as long as it is nonzero, there has to be a reason, otherwise why would Apple actively maintain that code? I don't think it is unfair to demand to know what those reasons are, especially of a company that portrays itself as more protective of user privacy than its competitors.

Apple's press release does not even attempt addressing these issues, making it hard not to conclude Apple is trading some of *my* security for *their* convenience.

Yes, that bottom part is what I meant. The convenience for the diagnostician. "Just open up everything, it's all under the covers anyway" is a compromise that gets made in large companies with limited resources.

That's a design decision that, given the publicity now, will hopefully be reconsidered, fixed, and soon.

He published a video showing the process:

https://m.youtube.com/watch?v=z5ymf0UsEuw&feature=youtu.be

The amount of diagnostic data available from this is absolutely trivial in comparison to the amount of personal data - the contents of contact list, camera roll, text messages, and even the audio files of voice mail are exposed wirelessly once the trusted pairing is established, with no way to stop it short of wiping the phone completely. I can see why he thought the NSA might be using this when documents suggested they were intercepting devices before delivery - this would be absolutely *perfect* for their purposes.

That is not to say Apple is necessarily complicit in NSA efforts, of course, but if you think all of this is available purely through an inadvertent choice when designing diagnostics, you must think Apple holds a breathtakingly cavalier attitude about user privacy.

I agree with your overall sentiment, but your explanations appear to directly contradict what Zdziarski said. I'm curious where you came up with your data.

I recommend you read his latest entry on this topic: http://www.zdziarski.com/blog/?p=3466

Here is a relevant quote that you seem to be parroting:

"They appear to be misleading about its capabilities, however, in downplaying them, and this concerns me. I wonder if the higher ups at Apple really are aware of how much non-diagnostic personal information it copies out, wirelessly, bypassing backup encryption. "

I don't think I'm contradicting anything he said. Apple made PR and KB responses; that's it so far.

Did you watch the Security Now! segment?

I didn't watch the Security Now segment because Gibson has specious credentials as a security authority and has a history of being tragically wrong. What's ironic is that Gibson basically did the same thing in 2006 (https://www.grc.com/wmf/wmf.htm , warning, that site will make your eyes bleed) that you're accusing others of doing now. Seems like an appeal to an odd authority to me. (Some more info on him: http://radsoft.net/news/roundups/grc/ )

To me it's scary that Apple not only acknowledges these exploits but has also justified their existence.

Otherwise it's good to know that an Apple "Genius" can grab your Facebook credentials when you bring an iPhone/iPad in for repair, all in the name of diagnostic data.

I'm not appealing to his authority; I'm saying he did a good job breaking down a complex subject.

And yes, the data leaks absolutely need closing, ASAP.

I think this kind of thing happens from time to time, knocking fanboys for a particular company back to reality. My example was that tweet after WWDC saying Apple has security and Android has (I forget what), so now Apple has both security and whatever it was. Would you say that now? I wouldn't.

Basically, no OS is perfect. There is only so much time to get this stuff out the door. To make the deadline, Apple in this case got the customer-facing things finished and took shortcuts in the things customers wouldn't find out about (iOS being closed source), things only internal Apple employees would have access to. Outing this will get Apple to fix it. They want to make great products; I have no doubt about this. Same with Google: they may do something dumb, but call them on it and they'll fix it. Maybe iOS 9 will be missing something because they have to rework this, but in the long run it's good this came out so it can get fixed. Between Android, iOS, Windows Phone, and BB10 you're pretty safe. People need to want something from you to exploit these systems. Use the lock code and the encryption feature and the vast majority of people are safe enough.

Sadly, I'm less worried about the average layperson getting access to my phone than I am about my own government. I have nothing to hide, but with the issues we have faced, like the IRS targeting specific political parties, I wonder when "leaked" phone messages will be next. I am big on the individual and their right to privacy.

Anything that is computerized carries the possibility of its data being accessed somehow. Unless you buy something like the Blackphone (a security-minded Android phone with encrypted texts and calls) or have a BlackBerry running through BES, you're only so secure. Even running through BES, the government can request keys to get at your messages and data (I believe).

https://www.blackphone.ch/

IMHO: Anyone consuming an editorial must remember who the writer is, what they do, and what their passions are, as that will color their content. While I enjoy and support Mr. Gibson, he loves the technology of cryptography, not the content. What Mr. Zdziarski is talking about is the content, not the technology. These two are not on the same page.