FAQ: Everything you need to know about Apple, encryption, and the FBI

Apple's privacy, security and encryption policies are getting quite a bit of air time and column space this week after company CEO Tim Cook's open letter on privacy. If you're not sure what's going on, why Apple's refusing the FBI, or what's at stake, we've got you covered: Read on, and we'll tell you everything you need to know about the situation.

So Tim Cook wrote a letter about privacy?

He did. You can find it at apple.com.

Why did he do this? And on Apple's website?

Over the past year, some officials inside U.S. law enforcement have been advocating for a "back door" into the encryption systems that Apple (and other tech companies) use to secure your data. This exploit would allow the government to more easily gather evidence that may exist on iPhones — messages, photos, location data, and more — and theoretically use it to take down criminals.

Unfortunately, once an exploit exists — even if it's supposed to exist solely for the use of law enforcement — it's almost impossible to keep it from falling into the wrong hands. As such, Tim Cook feels very strongly about protecting the overall integrity of Apple's encryption systems.

The reason we saw this letter this week, however, has to do with the FBI's San Bernardino terrorism case: One of the shooters owned an iPhone 5c — an iPhone 5c that the FBI has so far been unable to crack.

As such, the FBI is using an archaic law from 1789 called the "All Writs Act" to demand Apple's help: The bureau wants Apple's engineers to create a new version of iOS, specific to this iPhone 5c, that would circumvent security and allow the device to be more easily broken into using what's called a "brute force attack" — trying passcodes over and over again until the right one is found.

To clarify, the FBI isn't currently asking Apple to break its own encryption — the bureau just wants Apple to make it easier to find the passcode to the phone.
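
To make "brute force" concrete, here's a minimal Swift sketch of the idea: a four-digit passcode has only 10,000 combinations, so if nothing slows an attacker down, trying them all is trivial. The `unlocks(_:)` function is a hypothetical stand-in for the device's passcode check; none of this reflects Apple's actual code.

```swift
import Foundation

// Hypothetical stand-in for the device's passcode check.
// (On a real device the check happens inside iOS, with delays and limits.)
func unlocks(_ guess: String) -> Bool {
    return guess == "4951" // pretend this is the real passcode
}

// A four-digit passcode has only 10,000 possibilities (0000 through 9999),
// so an unthrottled brute-force attack simply walks through all of them.
func bruteForce() -> String? {
    for candidate in 0...9999 {
        let guess = String(format: "%04d", candidate)
        if unlocks(guess) {
            return guess
        }
    }
    return nil
}

if let passcode = bruteForce() {
    print("Found passcode: \(passcode)")
}
```

This is exactly why iOS throttles and limits attempts, as we'll get to below: the math only favors the attacker when guesses are free.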

Can Apple even do this? I thought iPhones had hardware encryption to prevent software side-loading.

While the iPhone 5s and later models offer the Secure Enclave, which entangles a hardware-based key with the device's passcode when deriving encryption keys, all iPhone models can potentially be exploited with a software side-load from Apple, and Apple alone.
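
To illustrate what "entangling" a hardware key with the passcode buys you, here's a conceptual Swift sketch using CryptoKit's HKDF. The real Secure Enclave does its derivation inside dedicated silicon with a fused, unreadable UID key; the names and the use of HKDF here are illustrative assumptions, not Apple's actual scheme.

```swift
import CryptoKit
import Foundation

// Stand-in for the per-device hardware key (the UID), which on a real
// device is fused into the silicon and can never be read out.
let hardwareUID = SymmetricKey(size: .bits256)

// The user's passcode, as entered on the lock screen.
let passcode = Data("1234".utf8)

// Mixing both inputs means the resulting key can't be derived off-device:
// an attacker needs the passcode *and* the physical chip.
let derivedKey = HKDF<SHA256>.deriveKey(
    inputKeyMaterial: hardwareUID,
    salt: passcode,
    info: Data("illustration-only".utf8),
    outputByteCount: 32
)
print("Derived a \(derivedKey.bitCount)-bit data-protection key")
```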

That's because Apple has a private key that allows the company to deliver software updates to your iPhone. It's how you can securely download normal software updates online; when Apple pushes out an update, it signs that update with its private key, which tells iOS that the software is legitimate.
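
As a rough illustration of that signing scheme, here's a Swift sketch using CryptoKit's Curve25519 signatures. Apple's real update-signing infrastructure is far more involved; the key names and the choice of Curve25519 are assumptions made purely to show the concept.

```swift
import CryptoKit
import Foundation

// The vendor's private key never leaves the vendor; only the matching
// public key ships on every device.
let applePrivateKey = Curve25519.Signing.PrivateKey()
let applePublicKey = applePrivateKey.publicKey

// Signing an update (force-try for brevity in this sketch).
let update = Data("pretend this is an iOS update".utf8)
let signature = try! applePrivateKey.signature(for: update)

// The device installs only what the public key verifies as genuine.
if applePublicKey.isValidSignature(signature, for: update) {
    print("Signature checks out: safe to install")
} else {
    print("Rejected: not signed by the vendor")
}
```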

For more on what the FBI is asking, read Christina Warren's excellent Mashable piece and Matthew Panzarino's TechCrunch explainer on Apple and side-loading software.

In short: While Apple does have the technical ability to create such an exploit, the company and Cook are resisting on the grounds that once security is violated, no matter what the reason, it will be infinitely easier to violate it again in the future by governments and criminals alike.

Why is the FBI coming to Apple? Don't they have super hackers who can crack the company's software?

While it appears that government agencies have been putting extensive work into finding exploits, Apple's hardware and software encryption is exceedingly difficult to crack. The company made it that way for a reason: If hackers employed by the FBI could crack iOS's encryption without help, chances are there's a computer expert in China or Russia who could do the same.

As such, instead of trying to crack Apple's unbreakable encryption, the FBI wants Apple's cooperation in creating a version of iOS that makes it possible to brute force the iPhone's passcode without erasing the device. (Currently, iOS devices can be set to auto-erase after ten failed passcode attempts.) The bureau also wants Apple to add a way to submit passcode guesses electronically, so a computer can churn through combinations instead of an agent typing each one by hand.
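
Here's a small Swift sketch of the attempt-limit behavior described above: the very safeguard the FBI wants removed. It models the policy with a simple counter and a wipe flag, purely as an assumption for illustration, not as Apple's actual implementation.

```swift
import Foundation

// Illustrative model of the ten-attempt auto-erase policy.
struct PasscodeGuard {
    private let correctPasscode = "4951" // hypothetical
    private(set) var failedAttempts = 0
    private(set) var wiped = false
    let maxAttempts = 10

    mutating func attempt(_ guess: String) -> Bool {
        guard !wiped else { return false }
        if guess == correctPasscode {
            failedAttempts = 0
            return true
        }
        failedAttempts += 1
        if failedAttempts >= maxAttempts {
            wiped = true // auto-erase: destroy the data-protection keys
        }
        return false
    }
}

// The brute-force loop from earlier now fails almost immediately:
var device = PasscodeGuard()
for candidate in 0...9999 where !device.wiped {
    if device.attempt(String(format: "%04d", candidate)) { break }
}
print(device.wiped ? "Wiped after \(device.maxAttempts) misses; brute force defeated"
                   : "Unlocked")
```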

Getting Apple's help would set a precedent for future cases involving tech companies and accessing encrypted data. If the FBI gets its way, it could give government agencies a path to compel any private company to put back doors in its systems in the interest of national security — and we've seen what the government has done with such exploits in the past.

Why didn't the FBI just ask Apple for the phone's iCloud backup?

The FBI did, in fact. From the Guardian:

With a search warrant, Apple provided the FBI data from weekly backups Farook made with Apple's iCloud service. But those backups stopped on 19 October, according to a federal search warrant request.

FBI investigators believed there was more data about Farook's motives in the phone but couldn't get to it without unlocking the device.

In order to further investigate, the bureau wants Apple to help it unlock the iPhone so that investigators can potentially access roughly two more months of locally stored data from before the attack.

How does the All Writs Act come into this?

The All Writs Act, part of the Judiciary Act of 1789, is an ancient law that, in recent years, has been used by federal and state law enforcement to compel companies to help them break into smartphones.

The text of the act is as follows:

(a) The Supreme Court and all courts established by Act of Congress may issue all writs necessary or appropriate in aid of their respective jurisdictions and agreeable to the usages and principles of law.

(b) An alternative writ or rule nisi may be issued by a justice or judge of a court which has jurisdiction.

In this case, the All Writs Act is being invoked because no specific legislation has been passed to govern how the government should handle strongly encrypted digital devices. Apple, in turn, argues that the FBI is overstepping the law's bounds with its interpretation.

Is encrypting my data even legal?

Yes, encrypting your data is currently legal. There are bills floating in certain state legislatures at present that would make encryption illegal, but as Wired explains, those proposed laws would be nigh-impossible to enforce without a national ban. And cryptographers and security researchers all over the world oppose a national ban enacted to make law enforcement's job easier, arguing that it would greatly weaken the average individual's privacy and security in exchange for data on a small number of criminals.

Why is encryption important to me?

Encryption keeps your data safe. That includes your pictures, messages, locations, financial information, health data, and more. If this data isn't safe, it can be stolen and potentially used to access your money, to blackmail or humiliate you, to build a case against you for crimes of which you've never even been suspected, and more.

What if you were the victim? Wouldn't you want Apple to help the FBI get any and all data possible?

More than anything. That said, we've heard this argument before: It's been used to rationalize other invasions of privacy, torture, and other extreme measures. It's also part of the reason victims aren't allowed to direct the criminal justice process.

The FBI's San Bernardino case is an emotional one, and I don't think anyone at Apple has any great desire to prevent the bureau from doing its job. The company has always complied with legal warrants and subpoenas, and even has a page documenting such requests and what kind of information can be released.

But when making a decision that has the potential to impact millions of customers — most of them law-abiding — Apple has to think beyond the case at hand to the larger impact. The FBI may be pitching this as a one-time, specific-to-this-iPhone-5c unlock, but it would set precedent for decades to come.

As with all legal matters — even emotionally-charged ones — it's important to take in the big picture. Once a process exists for one phone, that process could be replicated against others, legitimately or otherwise.

I don't get it. If it's only a one-time code and only Apple and the FBI have it, how could it affect my devices?

Two reasons: Precedent, and leaks.

We've talked about precedent above, but the one thing we haven't covered is international precedent: namely, what happens if the U.S. government demands information from Samsung or another company headquartered outside the U.S.? Say such a company agrees to give the U.S. a back door into its software. Once that happens, chances are pretty great that the company's home country will want that back door, too, as will other international entities. You may feel comfortable letting the FBI look at the phone of a domestic terrorist, but what happens when it's Chinese intelligence deciding to seize an American tourist's phone and hack it on suspicion of espionage?

As we wrote here on iMore:

Make no mistake, what is being asked of Apple should horrify not just those in the U.S. but around the world. Nothing made can be unmade. Nothing used once will only ever be used once. The moment after an easy way to brute-force passcodes exists we, none of us, will be safe. A few criminals may be more easily investigated, but catastrophically more people will be subject to unlawful searches, hacks, theft, blackmail, and other crimes. Everywhere.

Read Cook's letter again, but substitute Chinese intelligence for the FBI. Imagine China, soon to be a bigger market for Apple than even the U.S., making this demand so it can more easily track and prosecute those it claims to be criminals. Then imagine it being used by governments at war with their own citizens. Now do it again, but this time with Russia's FSB. Or once more with the NSA.

Leaks are the other dangerous factor in this equation: If Apple is forced to comply with the FBI's order, chances are the company will be forced to do this again for other devices in the future, and for other agencies. It's easy for the FBI to say "task your software engineers on a custom version of iOS for this very high-profile terrorist case," but Apple doesn't have the resources to do this for, say, an NYPD case involving a missing person with the criminal's phone left at the scene of the crime. Chances are, government agencies would ask Apple and other tech companies to create a more free-form back door — one that could be used on more iPhone models.

Once that back door exists, every law enforcement agency in the U.S. is going to want it. And once every agency has access, the circle of people with direct control over the code widens dramatically. From there, all it takes is one corrupt or angry official to put that information online.

We're a long way from that scenario becoming reality right now, but that's the big reason why Apple's fighting — to keep it firmly in the "what if" column.

If you want more info, Macworld contributor Rich Mogull wrote an incredible piece on the long-term implications of the FBI's request.

What's happening now?

Apple has very publicly rejected the FBI's court order and will likely appeal; from there, the case will probably bounce around the court system. In addition, the House of Representatives' Judiciary Committee is expected to hold a hearing on March 1 regarding the issue and has invited Apple, according to the Guardian.

There's also the question of whether Congress will pass a law that compels Apple — and any other tech company — to do what the FBI is demanding. In an interview with PBS, U.S. Senator Dianne Feinstein of California implied that the Senate would examine such legislation if Apple refused the court order.

What can I do to help?

First and foremost: Educate people who don't understand the issue. Security issues can be murky and hard to make sense of; by walking your friends and family through the conflict, you can help clear up what's going on.

You can also write to your U.S. senators, or to California Senator Dianne Feinstein of the Senate Intelligence Committee, who's calling for Apple to give the FBI access.

In addition, both the Electronic Frontier Foundation and the ACLU are throwing their support behind Apple.

I still have questions about this whole thing.

Pop them in the comments and we'll try our best to help.
