Law enforcement agencies across the globe make requests for data from Apple and other tech companies all the time. And because they happen all the time, there are processes in place to handle them. They are… routine.
But, every once in a while, specific cases still show up in the papers. They're the most sensational, horrible, heart-breaking, flag-waving cases, and the papers lap them up, few questions asked, and the people who read the stories get all riled up.
Which, I think, is the entire point of getting those stories into the papers to begin with.
The biggest and most public fight over encryption in the U.S., so far, was the San Bernardino case.
This channel didn't exist back then but I covered the story extensively online, including sitting in on innumerable calls and reading endless reams of legal statements and filings, and the TL;DR is that the United States Federal Bureau of Investigation (FBI) wanted Apple to not just hand over whatever data they might have on the suspects. No, the FBI wanted Apple to create a version of iOS that would allow authorities to circumvent the hardware encryption on any iPhone, at any time.
Which showed, even back then, either a staggering ignorance about how encryption works or a staggering willingness to manipulate the public in an attempt to stop encryption from working.
Apple believed the request itself was extra-legal, in conflict with existing laws, and in violation of the First and Fifth Amendment of the United States Constitution.
The FBI tried to justify the request by using the All Writs Act — an arcane, two-hundred-year-old piece of legislation that, and I'm just guessing here, probably never had digital encryption in mind when it was codified.
But Apple said "no".
More specifically, Apple's CEO, Tim Cook, said — and I'm going to read it verbatim because it's so on point:
In the San Bernardino case, the FBI and Justice Department eventually gave up and, reportedly, paid a third-party vendor to hack the iPhone for them.
That removed the immediate pressure from Apple, but it also removed the chance of the FBI's action being ruled undue or illegal by the courts, and of that precedent being set.
Flash forward to this week, and now the papers are being fed a similar story, this time about the FBI's need to get into the phones used in the recent attack in Pensacola.
From The Washington Post, reporting on a letter sent to Apple by the FBI's general counsel:
NBC, reporting on the same letter:
In response to the letter, Apple said:
Which, of course.
The Justice Department, not content with an answer based on how the technology of encryption actually works, escalated. Via the New York Times:
Apple has since followed up with a complete statement:
The politics of encryption
Take out the politics. Take out the attempts to manipulate the press and the people, and the simple truth remains: Apple has no way to break into modern iPhones. They're not like nation-states and gray market vendors. They don't stockpile 0day exploits to use on their own customers. Any time they find any, they push out patches for them as fast as possible, because any of them could be used or discovered or disclosed by other people at any time, and then — yeah — there are the worst kinds of headlines.
And the FBI knows this. They know it. Which is where the papers come in. Because, again, they don't want to get into one phone. They want the ability to get into any phone, and for that, the courts of public opinion can be a much better vehicle than the courts of law.
Because the papers can be used to make it look like Apple is standing up for the rights of criminals, rather than standing up for our rights. "What would you want them to do if it was your family?" is the question that inevitably gets asked, each and every time. As though the answer would ever be anything other than everything, even things that would absolutely be crimes in their own right.
So, what's critical is to step back and really look at what's being asked for here. No more secrets. The ability to get into not just a single criminal's phone, but everybody's phone. Yours and mine. And the ability for not just the FBI to get into it, but everybody. Foreign agencies and criminals.
Substitute, say, Russian or Chinese intelligence for the FBI, or one of the myriad countries where dissidents, journalists, and everyday citizens have nowhere near the rights, freedoms, or protections under the law.
Hell, imagine any border crossing or even traffic stop, any place in the world, where the contents of every private photo and message, every medical and financial record, are suddenly at risk.
Right to remain private
Tim Cook said in a recent interview that China had never asked Apple to compromise iOS security but the U.S. had. Luckily, in the U.S., a system still exists to push back against those types of requests. But what happens when China does ask? Especially if they're emboldened by America and the FBI? Based on recent history, it'll be nowhere near as easy for Apple to push back.
Worse, what happens when the backdoor falls into the hands of organized crime and terrorists and lone hackers and criminals?
Government agencies have proven woefully incapable of containing dangerous technologies. Information abhors a vacuum and from the NSA spy programs to the worms created to wage cyber warfare on other countries, we're all still dealing with the devastating consequences of the government's repeated failure to keep exactly these kinds of secrets.
A skeleton key into every one of the billions of iOS devices in the world? Who'd ever pick one up again?
It's the nature of law enforcement to overreach. To want our every fingerprint on file, all our DNA on record, from conception, and one day to want trackers and monitors implanted into all of our bodies. And they have a clear and understandable point of view for doing so — their goal isn't your privacy; it's prosecution and safety.
But we have to be able and willing to push back against that overreach. It's the duty of all of us to say, clearly and with unyielding certainty: "No."
Because the precedents we set now will echo throughout the next few decades.
I've already done a column on the right to remain private, but I'll TL;DR it now: Our phones extend our memories, they store our most private data, they sense everything about us and our surroundings.
Not all countries and laws are the same, of course, but many have the concept of a right to remain silent, of a right against self-incrimination. Even spousal privilege.
I've argued before, and I'll keep arguing, that the same should be extended to our phones, because they're becoming closer to us than even spouses. They're becoming part of us.
They've already become external cybernetics. And the way we treat them will, in part, determine the way we will treat internal cybernetics and neural connections one day.
If the thought of a backdoor into your phone doesn't creep you out, the idea of a backdoor into your mind and thoughts surely should.
And if it sounds like that's a bunch of crazy talk, again, I point you back to the coverage. Asking Apple or any tech company for assistance is routine. The only time it shows up in the papers is when they want to make it a spectacle. And because the papers want a spectacle as well, they seldom if ever stop to consider why they're being handed one. But it is absolutely to keep stirring up sentiment against the right to privacy, to chip away at it from oblique angles in the courts of public opinion and then the courts of law.
And it's much better and easier to be hyperbolic about it now than it will be when we lose it, and every agency and attacker is swimming in our personal data.
Rene Ritchie is one of the most respected Apple analysts in the business, reaching a combined audience of over 40 million readers a month. His YouTube channel, Vector, has over 90 thousand subscribers and 14 million views and his podcasts, including Debug, have been downloaded over 20 million times. He also regularly co-hosts MacBreak Weekly for the TWiT network and co-hosted CES Live! and Talk Mobile. Based in Montreal, Rene is a former director of product marketing, web developer, and graphic designer. He's authored several books and appeared on numerous television and radio segments to discuss Apple and the technology industry. When not working, he likes to cook, grapple, and spend time with his friends and family.