Earlier this week, forensic researcher Jonathan Zdziarski's work on security exploits in iOS pairing records and potential data leaks in diagnostic services went viral. Unfortunately, it was his slides, which used more provocative language and lacked the context of his talk, and not his paywalled yet far more understandable journal article, that made the rounds. Tragically, many in the media pounced on the attention-getting potential, posting alarming articles that did nothing but spread fear, uncertainty, and doubt (FUD) to a mainstream customer base that deserves much, much better. Apple responded with comments to iMore on Monday, and with a Knowledge Base (KB) article on Tuesday. However, there's been no word yet on whether the exploits and potential data leaks will be closed and, if so, how soon. So, what does it all mean?
In order to understand all of this, it's crucial to remember that security and convenience are eternally at war with each other. What makes something like an iPhone or iPad convenient for mainstream use can also compromise the absoluteness of its security. While some people might dearly wish every single action on their device required a long pseudorandom password, retina scan, and Lightning dongle, other people might wish to never see a single passcode or confirmation dialog cross their screens ever again in their lives. Apple's job is to find the best balance for the most people, most of the time.
Locking down the Lock screen
The Lock screen is a classic example. It has two jobs that exist in direct opposition. One job is to protect your iPhone from unintentional or unauthorized access. It does this via slide-to-unlock, passcode, Touch ID, or password.
The other is to pass through communications and functionality. That includes legally mandated services, like emergency dialing, desirable features like toggles and controls, interactive notifications and widgets, voice assistant and passes, media controls and more, and important functions like calls and messages.
You want to be able to answer the phone quickly, but not for anyone to be able to get into your contacts. You want to be able to toggle the flashlight on immediately, but not have anyone else toggle airplane mode to take your device offline. You want to be able to see who's messaging you, but you don't want people you don't trust seeing what they messaged. These conflicts of interest create incredible complexity, and that complexity is why we see Lock screen bypasses in the news, and in the patch lists of iOS updates.
To quote Steve Gibson from this week's Security Now! podcast (starting roughly 1:10:00 in), the Lock screen as it currently exists is, after the first passcode entry following a reboot, an interface lock, not a data lock. That can and should always be improved, but those improvements will always be a compromise in service of both its jobs.
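Gibson's distinction can be made concrete with a toy model. This is a hypothetical sketch, not Apple's actual implementation: the `Device` type, its fields, and its behavior are all illustrative. The point it demonstrates is that re-locking the screen gates the interface without evicting the data key that the first post-boot unlock made resident.

```swift
import Foundation

// Toy model (NOT Apple's implementation) of an interface lock vs. a data lock:
// the interface lock gates the UI, while the data lock keeps content encrypted
// until a key derived from the passcode is available in memory.
struct Device {
    var interfaceLocked = true    // what the Lock screen enforces
    var dataKeyInMemory = false   // in this model, cleared only on reboot

    // The first unlock after boot derives the data key and keeps it resident.
    mutating func unlock(passcodeCorrect: Bool) {
        guard passcodeCorrect else { return }
        interfaceLocked = false
        dataKeyInMemory = true
    }

    // Re-locking the screen does NOT evict the data key in this model,
    // which is why a locked-but-booted device is "interface locked" only.
    mutating func lockScreen() {
        interfaceLocked = true
    }

    var dataReadable: Bool { dataKeyInMemory }
}

var phone = Device()
phone.unlock(passcodeCorrect: true)   // first unlock after reboot
phone.lockScreen()                    // screen locks again...
print(phone.interfaceLocked, phone.dataReadable) // ...but data stays readable
```

In this simplified model, only a reboot would clear `dataKeyInMemory`, which is why the pre-first-unlock state is the strongest one.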
Pairing the records
Pairing records are another example of the tension that exists between security and convenience. They were designed to bring the iTunes sync mechanism of the pre-post-PC era some measure of initial security, followed by persistent convenience. The upside is, after tapping the "Trust this Computer" dialog, you never have to do it again. The downside is, after tapping the "Trust this Computer" dialog, you never have to do it again.
This too can and should always be improved unless and until we fully leave the PC for the cloud. Pairing records should be secured against pilferage, expired after going unused for some period of time, and exposed in some interface so they can be visually audited and manually deleted when and as needed.
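The expiry-and-audit policy suggested above could look something like the following sketch. Everything here is hypothetical: the `PairingRecord` shape, the 90-day idle window, and the store API are illustrative assumptions, not Apple's design.

```swift
import Foundation

// Hypothetical sketch of the policy proposed above: pairing records that
// expire after going unused, plus a listing that could back an audit UI.
// The types and the 90-day window are illustrative, not Apple's design.
struct PairingRecord {
    let hostName: String
    var lastUsed: Date
}

struct PairingStore {
    var records: [PairingRecord] = []
    let maxIdle: TimeInterval = 90 * 24 * 60 * 60  // expire after ~90 idle days

    // Drop records that haven't been used within the idle window.
    mutating func pruneExpired(now: Date = Date()) {
        records.removeAll { now.timeIntervalSince($0.lastUsed) > maxIdle }
    }

    // Expose the records so a settings screen could list and revoke them.
    func audit() -> [String] {
        records.map { "\($0.hostName) — last used \($0.lastUsed)" }
    }
}

var store = PairingStore()
let now = Date()
store.records = [
    PairingRecord(hostName: "work-imac",  lastUsed: now.addingTimeInterval(-10 * 86_400)),
    PairingRecord(hostName: "old-laptop", lastUsed: now.addingTimeInterval(-400 * 86_400)),
]
store.pruneExpired(now: now)
print(store.audit()) // only the recently used pairing survives
```

The `audit()` listing is the piece that matters most for users: pairings you can see are pairings you can revoke.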
What "undocumented" means
Likewise, providing features intended to allow enterprise mobile device management (MDM) systems to reclaim a device from a deceased or fired employee, or an employee under investigation, opens the device up to those features being abused. Having a device wiped instead of re-provisioned unless an explicit consent action is performed to prove physical possession might strike a better balance.
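The "better balance" described above amounts to a simple rule: destructive actions should require local proof of possession. A sketch of that rule, with hypothetical names (no real MDM protocol command is implied):

```swift
// Sketch of the balance suggested above: an MDM reclaim request results in a
// wipe only when an explicit on-device consent action proves physical
// possession; otherwise the device is merely re-provisioned.
// Function and case names are hypothetical, not an actual MDM API.
enum ReclaimAction {
    case wipe         // destructive, irreversible
    case reprovision  // reversible fallback
}

func reclaim(remoteWipeRequested: Bool, physicalConsentGiven: Bool) -> ReclaimAction {
    // A wipe requires BOTH the remote request and local proof of possession.
    if remoteWipeRequested && physicalConsentGiven {
        return .wipe
    }
    // Anything less falls back to the reversible path.
    return .reprovision
}

print(reclaim(remoteWipeRequested: true, physicalConsentGiven: false)) // reprovision
print(reclaim(remoteWipeRequested: true, physicalConsentGiven: true))  // wipe
```

Requiring the conjunction, rather than letting the remote request alone suffice, is what keeps an abused MDM credential from becoming a remote destruction tool.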
Packet sniffers intended for developers can also be found on any device by those who know where to look. File relays intended to provide diagnostic information can end up scoped far too broadly rather than too narrowly.
Of course, if someone has physical access to your device and has broken through the security, they could activate or install sniffers, dump data, and do pretty much anything their skills and resources allow. However, making it harder is always better. Keeping dev tools restricted to dev modes and eliminating data from diagnostics that isn't absolutely necessary to those diagnostics is always better.
That some of the services in question are "undocumented", however, is a red herring. With some notable exceptions like WebKit, iOS isn't an open source operating system. Apple keeps private interfaces and services private. Data leaks are absolutely a problem, but being "undocumented" isn't. That part is perfectly normal unless and until it becomes packaged for developers or customers and made public.
The war between security and convenience even plays out across the field of jailbreak. Many people anxiously await a new jailbreak. They can't wait to try it out and to spread the word. Yet jailbreaks also trade the security Apple has built into iOS for the convenience of custom tweaks and the ability to run unsigned code. That some exploits are celebrated and others used to induce panic shows just how poorly security is often communicated.
To quote Gibson again, to stand up is to be at constant risk of falling down. To provide convenience is to be at constant risk of compromising security.
Zdziarski, much to his credit, pleaded with journalists not to overreact to his findings. Not to scare people. He pointed out the advancements in iOS 7 security and how Apple has hardened the iPhone and iPad against typical attacks. Some of that got mentioned alongside the cries of "NSA backdoors" and "undocumented services" and "packet sniffers on zillions of phones". Yet those cries ultimately stole readers' attention away from the real issue: that iOS security has to get better. Always. As hard as it is. As tricky as it is. As sum-of-all-compromises as it is. It has to get better, always, because that's Apple's job.
It's the job of people reporting on security to read the material, even the paywalled stuff, numerous times, to understand it as best as possible, to consult with experts as needed, to digest it, and ultimately to explain it as sanely and succinctly as possible to people who shouldn't be made scared for the sake of sensationalism, but informed and empowered.
Hopefully the FUD surrounding Zdziarski's work will burn itself out, and the attention will go where it belongs: the work itself, fixing exploits, and making things better. And hopefully Zdziarski and others like him keep finding and reporting anything that gets missed so even more can be fixed and made better.
Because, in the eternal war between security and convenience, we the customers are the ones who need to win.