Reuters detonated a huge controversy in the Apple and security spaces yesterday with this headline — Apple dropped plan for encrypting backups after FBI complained - sources.
First, what Reuters is referring to as "backups" here is specifically iCloud backups. There are still two ways to back up Apple devices — to iCloud and locally to a Mac or PC. For a very long time now, and still to this day, you can locally back up to your Mac or PC and opt in to having those backups password encrypted. It's not a plan, it's a checkbox. And none of that has changed.
Second, based on reactions to the headline, many people, including very tech-savvy people, didn't seem to realize or remember that iCloud backups weren't quote-unquote encrypted.
I say quote-unquote because the backups actually are encrypted. But, Apple has their own set of keys and can access them.
Which… is not unusual. There are a couple of reasons for that: Features and fail-safes.
Features and fail-safes
I'll get to fail-safes in a bit, but, online storage not being end-to-end encrypted at the container level allows for additional features that can be highly convenient for the customer. Web-based access to single files within a backup, for example, including and especially photos, calendars, and contacts, like you get at iCloud.com.
There are a few end-to-end encrypted by design storage services, and services that provide options or local tools for end-to-end encryption, and you can always upload locally encrypted files.
But iCloud is by no means unique in how it treats online backups… which is probably why some people reacted so strongly to the headline — given how much Apple talks about privacy and security, some just assume it applies to everything.
Now, Apple does treat some data differently. For example, health data and keychain password data are end-to-end encrypted and Apple locks out everyone but you, including Apple themselves.
To explain the difference and maybe the dissonance, I'm going to flashback for a bit.
Two Steps Forward
Many years ago, there was a scandal involving celebrities and their private photos being leaked online. Many but not all of them came from iCloud backups. iCloud was never hacked, but if someone famous re-used the same password as another service that was hacked, or used security questions that could be looked up on Wikipedia, for example, attackers could and did get into them.
And Apple had to put a stop to it and fast. No one but you should ever be gaining access to your iCloud account.
So, Apple implemented two-step authentication. And the way they implemented it meant anyone using it had to write down or print out a long pseudo-random recovery key and keep it safe in case they ever forgot their iCloud passwords or couldn't supply the second step for authentication. Otherwise, they'd get locked out of their own accounts.
And, of course… people being people… they promptly turned on 2SA, lost their recovery key, and got locked out of their own iCloud accounts. Including and especially the irreplaceable data stored in those accounts like baby and wedding pictures.
Apple was flooded with requests to help people get back into their accounts, but without the recovery key there was nothing Apple could do. The data was lost. For all intents and purposes, destroyed.
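The recovery-key mechanism at the heart of that story is simple to picture. Here's a minimal sketch of generating a pseudo-random recovery key; the grouping and length are illustrative assumptions, not Apple's actual format:

```python
import secrets
import string


def generate_recovery_key(groups: int = 4, group_len: int = 4) -> str:
    """Generate a pseudo-random recovery key like 'XK3P-9RWQ-08JV-M2TD'.

    Uses the cryptographically secure `secrets` module. The 4x4 grouping
    here is a hypothetical format for illustration only.
    """
    alphabet = string.ascii_uppercase + string.digits
    return "-".join(
        "".join(secrets.choice(alphabet) for _ in range(group_len))
        for _ in range(groups)
    )


key = generate_recovery_key()
print(key)  # e.g. 'QH7F-2MZX-08RV-KJ4T'
```

The key exists only on the printout or scrap of paper the user keeps; the service stores nothing that can regenerate it. That's exactly why losing it meant losing the account's data.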
I've done a few videos about this already but it's worth repeating again: For most people, most of the time, encrypted backups are a bad idea precisely because if anything goes wrong — and things often go wrong — they can't be recovered.
Even a physical hard drive, if it's encrypted, and it gets damaged, there's no amount of recovery in the world that can get your data back.
For Infosec — information security — people, that's the whole point. And they will always say to encrypt everything, because it's better to lose than to leak.
Data retention experts, though, will tell you never to encrypt those backups, precisely because they've seen so many people lose so much of their own data.
Unfortunately, given the times we live in, we've heard far more from the safe-everything than the save-everything crowd, and really it's critically important to always consider both sides.
Apple totally deserves some of the blame for this as well, given how much publicity they, like everyone else, give to security and how little they, like everyone else, give to education around what should and shouldn't be kept secure.
Internally, though, Apple learned that being extremist about anything, including security, is not only inconsiderate but also harmful.
Apple had to put a stop to people losing access to their own data as well, and just as fast.
To do that, Apple deprecated the old two-step authentication system and rolled out a new two-factor authentication system that was not only easier for most people to manage but would also allow Apple to recover iCloud backups for people if and when they locked themselves out but could still prove ownership.
The downside to this was that, because Apple could access the backups, they were then legally obligated to hand over those backups in the event of a subpoena.
Why would Apple make that tradeoff? For the exact reasons I just explained.
For the vast majority of people, the risk of data loss is significantly — significantly — higher than the risk of data theft or subpoena.
In other words, for most people, most of the time, the biggest danger isn't someone else, including a law enforcement agency, getting access to your data — it's you losing access to it.
Apple, the company that would rather just lock up absolutely everything, decided it would actually be in the better interests of their customers to be a little less extremist in this specific case.
This is why, for the last five years or so, iCloud Backups have been encrypted but not end-to-end encrypted — unlike almost everything else on iCloud, Apple doesn't lock themselves out so they can help you if you happen to lock yourself out.
And, of course, for anyone not comfortable with that there was and continues to be the option for fully encrypted backups available via Mac or PC.
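To make the distinction between "encrypted" and "end-to-end encrypted" concrete, here's a toy sketch of the escrowed-key model. The XOR "cipher" and the key names are stand-ins for illustration only, not Apple's actual cryptography:

```python
import secrets


def xor_bytes(data: bytes, key: bytes) -> bytes:
    # Toy XOR "cipher" for illustration only -- not real cryptography.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))


# Envelope/escrow model: the backup is encrypted with a random data key,
# and that data key is stored wrapped under BOTH the user's key and the
# provider's key. Either party can unwrap it -- that's the escrow.
user_key = secrets.token_bytes(16)
provider_key = secrets.token_bytes(16)  # the key the provider holds
data_key = secrets.token_bytes(16)

backup = xor_bytes(b"baby and wedding photos", data_key)
wrapped_for_user = xor_bytes(data_key, user_key)
wrapped_for_provider = xor_bytes(data_key, provider_key)

# User loses their key: the provider can still recover the data key
# and hand the readable backup back to them (or to a subpoena).
recovered_key = xor_bytes(wrapped_for_provider, provider_key)
assert xor_bytes(backup, recovered_key) == b"baby and wedding photos"

# In a true end-to-end design, wrapped_for_provider simply doesn't exist:
# lose user_key and the backup is gone for good.
```

The whole trade-off in this article lives in that one `wrapped_for_provider` value: keep it and the provider can rescue locked-out users (and answer subpoenas); delete it and nobody but the user can ever read the backup.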
Encrypting the iCloud
Apple dropped plans to let iPhone users fully encrypt backups of their devices in the company's iCloud service after the FBI complained that the move would harm investigations. This according to six sources familiar with the matter.
Reuters probably means Apple devices rather than just the iPhone, because there's very little chance the iPhone would be treated differently than, say, the iPad when it comes to iCloud backups.
Now, yes, local backups to a Mac or PC can still be fully encrypted, but those aren't as convenient or consistent as iCloud backups.
That's why, after Apple made the change from 2SA to 2FA, several people, myself included, understanding the risk of data loss, still asked for the option to turn on end-to-end encryption for iCloud backups as well.
The question even came up in a Spiegel interview with Apple's CEO, Tim Cook, translated by Google:
Our users have a key there, and we have one. We do this because some users lose or forget their key and then expect help from us to get their data back. It is difficult to estimate when we will change this practice. But I think that in the future it will be regulated like the devices. We will therefore no longer have a key for this in the future.
Regulated as in handled, not as in mandated by law.
Now, Reuters isn't citing that interview as the source, but, they are saying…
The tech giant's reversal, about two years ago, has not previously been reported.
And that makes the Reuters report problematic. Regardless of the source, it hasn't been "about two years" since Tim Cook's interview where he clearly says Apple is working on end-to-end encrypted iCloud backups. It's been barely more than a year. Not even 15 months.
If something as simple, as checkable, as the timeline is wrong, what else may be wrong?
Apple and Law Enforcement
It shows how much Apple has been willing to help U.S. law enforcement and intelligence agencies, despite taking a harder line in high-profile legal disputes with the government and casting itself as a defender of its customers' information.
But does it though?
Apple's willingness to help law enforcement has never been in question. Through words and actions, Apple has repeatedly demonstrated a commitment to obeying local laws, including a willingness to help law enforcement as required by those laws. This doesn't show how much, because how much has always been shown to be… much.
The hard-line Apple took with San Bernardino and is taking with Pensacola is completely different in kind — because data on servers and on devices is different in kind, and legal requests and extra-legal requests are different in kind.
Apple's argument about devices has been and continues to be that they are far more likely to be lost or stolen and therefore require much stronger protection: silicon-level encryption. And that Apple can't grant access to law enforcement because that would also grant access to anyone who finds, steals, or otherwise gains possession of someone else's device.
In other words, Apple isn't locking down iPhones to keep law enforcement organizations with valid warrants out, they're doing it to keep criminals out. The downside, for law enforcement, is that they get locked out as well.
It's fail secure instead of fail safe, the exact inverse of the iCloud backup situation, and the exact opposite trade-off.
Now, I personally believe, and have made a couple of videos advocating, that our devices contain so much deeply personal data they're de facto extensions of our persons and, as such, deserve 5th Amendment-like protections and exemptions under the law. But that's me. As far as I know, Apple hasn't advocated for anything that extreme.
What Apple has done, though, is state that they shouldn't be compelled to provide access to data beyond the scope of existing laws. That there should be no extra-legal requests.
That's why, when Attorneys General and FBI Directors demanded Apple create back doors and break encryption on the iPhones in San Bernardino and Pensacola, Apple said no.
But, even before it got to those pay grades, when Apple was legally asked for the iCloud backups, Apple provided them. Both of those facts were widely and jointly reported. There's no disconnect there, no surprise, no gotcha, not unless someone deliberately fabricates one.
More than two years ago, Apple told the FBI that it planned to offer users end-to-end encryption when storing their phone data on iCloud, according to one current and three former FBI officials and one current and one former Apple employee.
OK, so, this is where the Reuters report becomes highly problematic. Again, Apple is on record saying they will obey local laws. But encryption is not illegal, and the idea that Apple would involve the FBI in that type of process would be seen as an incredible violation of trust given the relationship Apple has crafted with their customers. It would be, to many, a dealbreaker.
Apple's entire reputation is based on the commitment to product and customer, and whether that helps or hurts the FBI's or anyone else's extra-legal agenda shouldn't and can't matter. Product and customer have to come first.
Now, there's nothing on the record that I'm aware of that I can point to that either proves or disproves this allegation.
It does seem to go against everything we know about how and why Apple operates to the point where, if I were in a Dungeons and Dragons campaign right now, I'd be leaping up and yelling "disbelieve!".
But that's a subjective opinion, not an objective set of facts.
I have had numerous off-the-record conversations over the years since this iCloud Backup system was implemented and I've only ever heard that it was done this way, 100%, to help customers who had previously been locking themselves out of their accounts. Any benefit to law enforcement was unintended but also unavoidable — the cost of prioritizing and preserving customer access.
John Gruber of Daring Fireball, who has as good sources inside Apple as anyone, wrote:
my own private conversations over the last several years, with rank-and-file Apple sources who've been directly involved with the company's security engineering, is that Menn's sources for the "Apple told the FBI that it planned to offer users end-to-end encryption when storing their phone data on iCloud" bit were the FBI sources, not the Apple sources, and that it is not accurate.
Since anyone who wants to keep everyone, including Apple, locked out of their backups simply has to toggle iCloud Backup off or, like I said before, use a Mac or PC to create fully encrypted backups, the damage to Apple's reputation from backroom dealing like that wouldn't really be worth it.
Especially since there's no indication Apple has done anything to prevent iCloud Backups from being turned off, or to remove encrypted backup capabilities from the Mac or PC. Something that could easily have been done under the guise of dropping support for a legacy system.
Who's thwarting who?
Under that plan, primarily designed to thwart hackers, Apple would no longer have a key to unlock the encrypted data, meaning it would not be able to turn material over to authorities in a readable form even under court order.
This also rings untrue to me. I can't think of a single case where hackers have successfully gained access to Apple's iCloud backup keys.
In every case, access to data has been achieved by gaining physical access to a device that has the keys, or socially engineering or otherwise gaining credentials to access and restore the iCloud backup from another remote device.
An Apple plan to enable end-to-end encrypted backups would only really thwart two groups of people: The users who lose access to their own accounts, as has happened in the past, and law enforcement agencies who want to subpoena the iCloud backups.
In private talks with Apple soon after, representatives of the FBI's cyber crime agents and its operational technology division objected to the plan, arguing it would deny them the most effective means for gaining evidence against iPhone-using suspects, the government sources said.
I have no way to verify whether or not these private talks really happened — I suspect but cannot prove that there's a ton of broken telephone going on here — but if this sounds like something the FBI would argue it's because it's something the FBI does in fact argue.
It's also not accurate. Governments now have access to unprecedented amounts of data about all of us, almost all the time. In some cases that includes cameras and other forms of physical surveillance. In almost all cases, metadata about who we contact, when, where, and how.
When Apple spoke privately to the FBI about its work on phone security the following year, the end-to-end encryption plan had been dropped, according to the six sources.
So, two years ago, Apple had this plan and spoke to the FBI about it. A year ago they spoke to the FBI about it again and said it was dropped. But, that's also about when Tim Cook first mentions Apple was working on exactly this plan. This means, again, the timeline doesn't really make sense.
Reuters could not determine why exactly Apple dropped the plan.
Which is a really interesting thing to say right after quoting six sources about the plan being dropped?
"Legal killed it, for reasons you can imagine," another former Apple employee said he was told, without any specific mention of why the plan was dropped or if the FBI was a factor in the decision.
I can imagine many things, including Apple legal being worried about lawsuits from customers locked out of their data, even when it's their own fault.
That person told Reuters the company did not want to risk being attacked by public officials for protecting criminals, sued for moving previously accessible data out of reach of government agencies or used as an excuse for new legislation against encryption.
Why not? Apple already gets attacked by public officials, the highest of officials and very publicly on Twitter, for exactly those reasons. It's not a risk if it's already happening.
End-to-end encrypting backups is also currently legal, and Apple already does it for PC backups. They can't be sued for that, at least not successfully.
"They decided they weren't going to poke the bear anymore," the person said, referring to Apple's court battle with the FBI in 2016 over access to an iPhone used by one of the suspects in a mass shooting in San Bernardino, California.
Also why not? Pressuring Apple over encryption isn't just a risk for Apple, it's a risk for the government as well because, like we saw when they withdrew from the San Bernardino case, they're actually scared precedent wouldn't go their way.
As to new legislation against encryption, Apple has said they believe it should be a legislative decision. They'll fight it, of course, because it's in the best interests of their customers to fight it, but as we discussed before, Apple will ultimately follow the law. And, there's also no telling whether that law can or would pass. Overreaching information laws have been successfully defeated in the past.
Reuters then quotes two former FBI officials who weren't present for the talks, which is just the opposite of informative.
However, a former Apple employee said it was possible the encryption project was dropped for other reasons, such as concern that more customers would find themselves locked out of their data more often.
Which, like I said before, has been only and exactly the rationale I've heard from people at Apple over the last few years.
Once the decision was made, the 10 or so experts on the Apple encryption project - variously code-named Plesio and KeyDrop - were told to stop working on the effort, three people familiar with the matter told Reuters.
This I really wonder about. It's possible those specific projects were canned, but my understanding is that this is a discussion that is still ongoing at Apple.
And, it's not atypical for several similar projects to be canceled in favor of better projects that ultimately achieve the same thing. That happens all the time.
Implementing optional encryption
Implementing a new architecture that keeps out bad actors, doesn't lock out account owners, and still allows for end-to-end encryption that's as considerate and forgiving as possible is the definition of non-trivial, and it absolutely has to be done right.
Look, all the arguments we have on Twitter and in comments about what Apple can and should do, those same arguments happen inside Apple. They're not a monoculture or hive-mind, they're a diverse group of passionate, over-achieving, type-A personalities with a lot of strong opinions about what should and shouldn't be done and how. Up to and including the highest levels of the company.
And everything from the articles that get written to the videos that get made to the radars that get filed to the off-the-record conversations that take place help to inform and empower those arguments. Because everyone wants their opinion to win out and will take the best and brightest backup they can get to help make sure it wins and stays won.
Like with how health data and keychain password data are end-to-end encrypted, even when backed up.
That's ultimately why I'm really happy Reuters published this.
Not because it set off some needless panic, especially from people sharing it without doing even basic due diligence or critical thinking before panicking people with ginned-up controversies, manufactured outrage, and conspiracy theories.
But because it's an incredibly important topic and it just may help propel it once again to the top of Apple's iCloud roadmap. Yes, even as they're still struggling to fix up and finish everything from Messages on iCloud to the last round of iOS 13 server-side changes.
Which, by the way, if you turn on and then turn off iCloud Backup, will still let you sync messages between your devices but will move the key from the iCloud Backup to your local device. This stuff is complicated.
Personally, I think it's critically important for Apple to provide opt-in end-to-end encryption for iCloud backups. Moreover, on a dataset by dataset basis.
Because, contrary to the hype, end-to-end encryption isn't always for the best. In many cases, it can be for the worst. Maybe I want my messages backup totally secure but still want to access my Photos on iCloud.com? I should be able to do that.
Basically, anything that would be more damaging and harmful to you if it were leaked than lost, you should be able to encrypt it. Again, Apple already does that by default for things like passwords and health data, but you should get to choose other types of data, any types of data that concern you.
And, anything that would be more damaging and harmful to you if it were lost than leaked, you should absolutely not encrypt even if you have the option. That's the way iCloud backup works now and should still be the default, because it's in the best interests of 99% of people 99% of the time.
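One way to picture that per-dataset choice is as a defaults-plus-overrides policy. The dataset names and mode labels below are hypothetical, purely to illustrate the leak-versus-loss logic argued for above:

```python
# Hypothetical per-dataset policy: each category defaults to the mode whose
# failure hurts least -- "escrowed" when loss is the bigger risk,
# "end-to-end" when a leak is. Names and categories are illustrative.
DEFAULTS = {
    "photos": "escrowed",       # losing these forever is the bigger risk
    "messages": "escrowed",
    "health": "end-to-end",     # Apple already locks itself out of these
    "passwords": "end-to-end",
}


def backup_mode(dataset: str, user_overrides: dict) -> str:
    """Return the encryption mode for a dataset, honoring user opt-ins."""
    return user_overrides.get(dataset, DEFAULTS.get(dataset, "escrowed"))


# A user who opts their messages into end-to-end encryption while
# keeping photos recoverable (and viewable on iCloud.com):
assert backup_mode("messages", {"messages": "end-to-end"}) == "end-to-end"
assert backup_mode("photos", {}) == "escrowed"
```

The safe-by-default direction matters: anything not explicitly opted in stays recoverable, which matches the "99% of people 99% of the time" argument above.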
Totally not an easy system to architect in a way that isn't unnecessarily burdensome or error-prone for end-users, but totally Apple's job to figure out.
And I hope Apple figures out and ships it, and soon, even if I personally would never turn it on, for exactly the reasons I've repeated here… repeatedly.
But for the benefit of every dissident, whistleblower, journalist, oppressed minority, person at risk, or privacy advocate who would.