Apple, the FBI, and your privacy under siege
The United States Federal Bureau of Investigation (FBI) wants Apple to create a version of iOS that would allow authorities to breach the strong encryption on the iPhone and iPad and access the personal data contained within. The demand comes as part of the investigation into the San Bernardino terrorism case, but the implications and ramifications go far beyond any one case, no matter how heinous.
Earlier today, Apple's CEO Tim Cook wrote a rare letter to the company's customers on Apple.com. Here's the crux:
Make no mistake, what is being asked of Apple should horrify not just those in the U.S. but people around the world. Nothing made can be unmade. Nothing used once will only ever be used once. The moment an easy way to brute-force passcodes exists, none of us will be safe. A few criminals may be more easily investigated, but catastrophically more people will be subject to unlawful searches, hacks, theft, blackmail, and other crimes. Everywhere.
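To put rough numbers on "brute-force": a minimal sketch, assuming about 80 milliseconds of hardware key-derivation work per passcode attempt (a figure Apple has published in its iOS security documentation) and none of the escalating delays or erase-after-ten-tries protections the demanded software would strip away. The figures are illustrative, not measurements of any specific device.

```python
# Illustrative back-of-the-envelope math only, not any real cracking tool.
# Assumption: ~80 ms of key-derivation work per attempt (per Apple's iOS
# security documentation), with all software rate limits removed.

def brute_force_seconds(digits: int, seconds_per_try: float = 0.08) -> float:
    """Worst-case time to try every numeric passcode of the given length."""
    return (10 ** digits) * seconds_per_try

for n in (4, 6):
    secs = brute_force_seconds(n)
    print(f"{n}-digit passcode: about {secs / 3600:.1f} hours worst case")
```

A four-digit passcode falls in minutes under these assumptions, and even a six-digit one within a day, which is why the on-device delay and wipe limits, not the passcode itself, carry most of the security.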
Read Cook's letter again, but substitute Chinese intelligence for the FBI. Imagine China, soon to be a bigger market for Apple than even the U.S., making this demand so it can more easily track and prosecute those it claims are criminals. Then imagine the same tool being used by governments at war with their own citizens. Now do it again, but this time with Russia's FSB. Or once more with the NSA.
Imagine when it falls into the hands of everyone from organized crime and terrorists to lone hackers and criminals. Imagine falling asleep while the person you just met sneaks into the other room, replaces the software on your phone, and slips out with your every picture, password, message, and location. And if caught, they're just fine — they used the same back door to replace the software with an underground version that eliminates the back door.
It's the nature of law enforcement to overreach. To want our every fingerprint on file, all our DNA on record, and one day to want trackers and monitors implanted into all of our bodies. And they have a clear and understandable point of view for doing so — their goal isn't your privacy; it's prosecution and safety. But we have to be able and willing to push back against that overreach.
It's the duty of all of us to say, clearly and with unyielding certainty: "No."
Tim Cook, by publicly standing up and voicing his concerns, is doing just that. He's laying another brick on the sunlit path toward justice. We, all of us, need to lay these bricks too, and as quickly as we can.
February 16, 2016
A Message to Our Customers
The United States government has demanded that Apple take an unprecedented step which threatens the security of our customers. We oppose this order, which has implications far beyond the legal case at hand.
This moment calls for public discussion, and we want our customers and people around the country to understand what is at stake.
The Need for Encryption
Smartphones, led by iPhone, have become an essential part of our lives. People use them to store an incredible amount of personal information, from our private conversations to our photos, our music, our notes, our calendars and contacts, our financial information and health data, even where we have been and where we are going.
All that information needs to be protected from hackers and criminals who want to access it, steal it, and use it without our knowledge or permission. Customers expect Apple and other technology companies to do everything in our power to protect their personal information, and at Apple we are deeply committed to safeguarding their data.
Compromising the security of our personal information can ultimately put our personal safety at risk. That is why encryption has become so important to all of us.
For many years, we have used encryption to protect our customers' personal data because we believe it's the only way to keep their information safe. We have even put that data out of our own reach, because we believe the contents of your iPhone are none of our business.
The San Bernardino Case
We were shocked and outraged by the deadly act of terrorism in San Bernardino last December. We mourn the loss of life and want justice for all those whose lives were affected. The FBI asked us for help in the days following the attack, and we have worked hard to support the government's efforts to solve this horrible crime. We have no sympathy for terrorists.
When the FBI has requested data that's in our possession, we have provided it. Apple complies with valid subpoenas and search warrants, as we have in the San Bernardino case. We have also made Apple engineers available to advise the FBI, and we've offered our best ideas on a number of investigative options at their disposal.
We have great respect for the professionals at the FBI, and we believe their intentions are good. Up to this point, we have done everything that is both within our power and within the law to help them. But now the U.S. government has asked us for something we simply do not have, and something we consider too dangerous to create. They have asked us to build a backdoor to the iPhone.
Specifically, the FBI wants us to make a new version of the iPhone operating system, circumventing several important security features, and install it on an iPhone recovered during the investigation. In the wrong hands, this software — which does not exist today — would have the potential to unlock any iPhone in someone's physical possession.
The FBI may use different words to describe this tool, but make no mistake: Building a version of iOS that bypasses security in this way would undeniably create a backdoor. And while the government may argue that its use would be limited to this case, there is no way to guarantee such control.
The Threat to Data Security
Some would argue that building a backdoor for just one iPhone is a simple, clean-cut solution. But it ignores both the basics of digital security and the significance of what the government is demanding in this case.
In today's digital world, the "key" to an encrypted system is a piece of information that unlocks the data, and it is only as secure as the protections around it. Once the information is known, or a way to bypass the code is revealed, the encryption can be defeated by anyone with that knowledge.
The government suggests this tool could only be used once, on one phone. But that's simply not true. Once created, the technique could be used over and over again, on any number of devices. In the physical world, it would be the equivalent of a master key, capable of opening hundreds of millions of locks — from restaurants and banks to stores and homes. No reasonable person would find that acceptable.
The government is asking Apple to hack our own users and undermine decades of security advancements that protect our customers — including tens of millions of American citizens — from sophisticated hackers and cybercriminals. The same engineers who built strong encryption into the iPhone to protect our users would, ironically, be ordered to weaken those protections and make our users less safe.
We can find no precedent for an American company being forced to expose its customers to a greater risk of attack. For years, cryptologists and national security experts have been warning against weakening encryption. Doing so would hurt only the well-meaning and law-abiding citizens who rely on companies like Apple to protect their data. Criminals and bad actors will still encrypt, using tools that are readily available to them.
A Dangerous Precedent
Rather than asking for legislative action through Congress, the FBI is proposing an unprecedented use of the All Writs Act of 1789 to justify an expansion of its authority.
The government would have us remove security features and add new capabilities to the operating system, allowing a passcode to be input electronically. This would make it easier to unlock an iPhone by "brute force," trying thousands or millions of combinations with the speed of a modern computer.
The implications of the government's demands are chilling. If the government can use the All Writs Act to make it easier to unlock your iPhone, it would have the power to reach into anyone's device to capture their data. The government could extend this breach of privacy and demand that Apple build surveillance software to intercept your messages, access your health records or financial data, track your location, or even access your phone's microphone or camera without your knowledge.
Opposing this order is not something we take lightly. We feel we must speak up in the face of what we see as an overreach by the U.S. government.
We are challenging the FBI's demands with the deepest respect for American democracy and a love of our country. We believe it would be in the best interest of everyone to step back and consider the implications.
While we believe the FBI's intentions are good, it would be wrong for the government to force us to build a backdoor into our products. And ultimately, we fear that this demand would undermine the very freedoms and liberty our government is meant to protect.
Rene Ritchie is one of the most respected Apple analysts in the business, reaching a combined audience of over 40 million readers a month. His YouTube channel, Vector, has over 90 thousand subscribers and 14 million views and his podcasts, including Debug, have been downloaded over 20 million times. He also regularly co-hosts MacBreak Weekly for the TWiT network and co-hosted CES Live! and Talk Mobile. Based in Montreal, Rene is a former director of product marketing, web developer, and graphic designer. He's authored several books and appeared on numerous television and radio segments to discuss Apple and the technology industry. When not working, he likes to cook, grapple, and spend time with his friends and family.
Protecting both parties with the same tools!
Do you really want to allow anybody to enter your phone, the phones of your kids, of your friends?
A backdoor is a backdoor, and this one will have no lock. Does your house have such a door?
Seriously, how on earth do you think Apple gets market research data? I promise you it isn't JUST from watching their own customers' habits.
As no doubt with Google et al, they use all sorts of holding/phantom companies to distribute money and manipulate information amongst and between themselves.
Oh, so Company A doesn't allow you to do B, but guess what: it owns Company X and that does it, or it buys the service/item from Company Y instead, and this makes the circle back to A by going through Company Z in the Cayman Islands or somewhere ridiculous.
The corporate sharks are ethical when it suits them. Back to the topic at hand, it's a very, very difficult balance. You need to put yourself in the shoes of someone who has been saved by the use of such tactics and guess what… suddenly you think different, if you'll pardon the pun.
I don't know where I stand, but the only part of Tim Cook's recent boring and contrived novel that is correct is where he says that this needs to be discussed. In depth. All companies are interested in protecting their customers; it doesn't necessarily mean that they care about them.
Don't care. I don't do anything illegal. Snoop away.
FBI. That way they don't have to worry about their software crack getting into the hands of hackers...who would still need your phone to get into it anyhow.