FAQ: Everything you need to know about Apple, encryption, and the FBI

Apple's privacy, security and encryption policies are getting quite a bit of air time and column space this week after company CEO Tim Cook's open letter on privacy. If you're not sure what's going on, why Apple's refusing the FBI, or what's at stake, we've got you covered: Read on, and we'll tell you everything you need to know about the situation.

So Tim Cook wrote a letter about privacy?

He did. You can find it at apple.com.

Why did he do this? And on Apple's website?

Over the past year, some officials inside U.S. law enforcement have been advocating for a "back door" into the encryption systems that Apple (and other tech companies) use to secure your data. This exploit would allow the government to more easily gather evidence that may exist on iPhones — messages, photos, location data, and more — and theoretically use it to take down criminals.

Unfortunately, once an exploit exists — even if it's supposed to exist solely for the use of law enforcement — it's almost impossible to keep it from falling into the wrong hands. As such, Tim Cook feels very strongly about protecting the overall integrity of Apple's encryption systems.

The reason we saw this letter this week, however, has to do with the FBI's San Bernardino terrorism case: One of the shooters used an iPhone 5c — an iPhone 5c that the FBI has so far been unable to crack.

As such, the FBI is using an archaic law from 1789 called the "All Writs Act" to demand Apple's help: The bureau wants Apple's engineers to create a new version of iOS, specific to this iPhone 5c, that would circumvent security and allow the iPhone to be more easily broken into using what's called a "brute-force attack" — trying passcode after passcode until the right one is found.

To clarify, the FBI isn't currently asking Apple to break its own encryption; it just wants Apple to make it easier for the bureau to find the phone's passcode.
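
To make that concrete, here's a minimal sketch (in Swift, and decidedly not Apple's code) of what a brute-force attack amounts to once the safeguards are stripped away. The `tryPasscode` function is a hypothetical stand-in for the device's real passcode check; the point is simply that a four-digit passcode has only 10,000 possibilities.

```swift
import Foundation

// Hypothetical sketch: with no delays and no erase-after-ten-tries,
// a four-digit passcode space is tiny. `tryPasscode` stands in for
// the device's real check, which actually happens inside iOS.
func tryPasscode(_ guess: String) -> Bool {
    guess == "7345"  // placeholder secret, for illustration only
}

func bruteForce() -> String? {
    for candidate in 0...9999 {
        let guess = String(format: "%04d", candidate)
        if tryPasscode(guess) {
            return guess  // found in at most 10,000 attempts
        }
    }
    return nil
}

print(bruteForce() ?? "not found")
```

On a computer that can submit guesses electronically, exhausting that space takes seconds; the delays and auto-erase discussed below are what make it impractical on a real iPhone.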

Can Apple even do this? I thought iPhones had hardware encryption to prevent software side-loading.

While the iPhone 5s and later models offer the Secure Enclave, which entangles a hardware-based encryption key with the device's passcode and iOS encryption key, every iPhone model can potentially be exploited with a software side-load that only Apple can provide.
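
To give a rough sense of why that hardware key matters, here's an illustrative sketch. Apple's actual derivation runs inside the Secure Enclave using a per-device UID that never leaves the chip; CryptoKit's HKDF is used here purely to model the idea that the data-protection key depends on both a hardware secret and your passcode.

```swift
import CryptoKit
import Foundation

// Illustrative model, not Apple's implementation: the key protecting
// your data is derived from a per-device hardware secret *and* your
// passcode, so guesses made off the device can't reproduce the key.
let hardwareUID = SymmetricKey(size: .bits256)  // stand-in for the device UID
let passcode = Data("7345".utf8)                // the user's passcode

let dataProtectionKey = HKDF<SHA256>.deriveKey(
    inputKeyMaterial: hardwareUID,
    salt: passcode,
    info: Data("data-protection".utf8),
    outputByteCount: 32
)
print(dataProtectionKey.bitCount)  // 256: a key bound to this device alone
```

That entanglement is why investigators can't simply copy the phone's storage and brute-force it on a supercomputer: the guessing has to happen on the device itself.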

That's because Apple has a private key that allows the company to deliver software updates to your iPhone. It's how you can securely download normal software updates online; when Apple pushes out an update, it signs that update with its private key, which tells iOS that the software is legitimate.
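
As a rough illustration of that model (Apple's real update pipeline isn't public, and this sketch uses CryptoKit's Curve25519 signing API purely as an example), the flow looks something like this:

```swift
import CryptoKit
import Foundation

// Simplified model of signed updates: Apple signs with a private key
// only it holds; the device verifies with the matching public key
// before accepting the software. Illustrative, not Apple's scheme.
let applePrivateKey = Curve25519.Signing.PrivateKey()  // never leaves Apple
let devicePublicKey = applePrivateKey.publicKey        // shipped on devices

let update = Data("iOS update payload".utf8)
let signature = try! applePrivateKey.signature(for: update)  // try! for brevity

if devicePublicKey.isValidSignature(signature, for: update) {
    print("Signed by Apple: install")
} else {
    print("Unsigned or tampered: reject")
}
```

This is also why only Apple could build the tool the FBI wants: without a signature from Apple's private key, an iPhone will refuse to load the modified iOS.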

For more on what the FBI is asking, read Christina Warren's excellent Mashable piece and Matthew Panzarino's TechCrunch explainer on Apple and side-loading software.

In short: While Apple does have the technical ability to create such an exploit, the company and Cook are resisting on the grounds that once security is violated, no matter the reason, it becomes far easier for governments and criminals alike to violate it again in the future.

Why is the FBI coming to Apple? Don't they have super hackers who can crack the company's software?

While it appears that government agencies have been putting extensive work into finding exploits, Apple's hardware and software encryption is exceedingly difficult to crack. The company made it that way for a reason: If hackers employed by the FBI could crack iOS's encryption without help, chances are there's a computer expert in China or Russia who could do the same.

As such, instead of trying to crack Apple's effectively unbreakable encryption, the FBI wants Apple's cooperation in creating a version of iOS that makes it possible to brute force the iPhone's passcode without erasing the device. (Currently, iOS devices can be set to auto-erase after ten failed passcode attempts.) The bureau also wants Apple to add code that lets passcodes be submitted electronically, so raw computing power can do the guessing instead of an agent tapping in each passcode by hand.
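
Here's a sketch of those two protections, with delay values approximated from Apple's public iOS security documentation; it models the behavior and is not actual iOS code:

```swift
import Foundation

// Models the safeguards at issue: escalating delays after failed
// passcode attempts, plus an optional erase after ten failures.
// Delay values approximate Apple's public security guide.
struct PasscodeGate {
    var failedAttempts = 0
    var eraseAfterTenFailures = true  // the user-configurable setting

    // Delay imposed before the next attempt, in seconds.
    var delayBeforeNextAttempt: TimeInterval {
        switch failedAttempts {
        case 0..<5: return 0        // first few tries are free
        case 5:     return 60       // 1 minute
        case 6:     return 5 * 60   // 5 minutes
        case 7, 8:  return 15 * 60  // 15 minutes
        default:    return 60 * 60  // 1 hour
        }
    }

    // Returns true if the device would erase itself on this failure.
    mutating func recordFailedAttempt() -> Bool {
        failedAttempts += 1
        return eraseAfterTenFailures && failedAttempts >= 10
    }
}

var gate = PasscodeGate()
for _ in 1...10 {
    if gate.recordFailedAttempt() {
        print("Device erased; the data is gone for good.")
    }
}
```

The FBI's requested version of iOS would, in effect, set those delays to zero and turn off the erase, which is exactly what turns a ten-guess lockout into the trivial 10,000-guess loop sketched earlier.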

Getting Apple's help would set a precedent for future cases involving tech companies and accessing encrypted data. If the FBI gets its way, it could give government agencies a path to compel any private company to put back doors in its systems in the interest of national security — and we've seen what the government has done with such exploits in the past.

Why didn't the FBI just ask Apple for the phone's iCloud backup?

The FBI did, in fact. From the Guardian:

With a search warrant, Apple provided the FBI data from weekly backups Farook made with Apple's iCloud service. But those backups stopped on 19 October, according to a federal search warrant request.

FBI investigators believed there was more data about Farook's motives in the phone but couldn't get to it without unlocking the device.

In order to further investigate, the bureau wants Apple to help it unlock the iPhone so that agents can potentially access roughly two more months of locally stored data from before the attack.

How does the All Writs Act come into this?

The All Writs Act, part of the Judiciary Act of 1789, is an ancient law that, in recent years, has been used by federal and state law enforcement to compel companies to help them break into smartphones.

The text of the act is as follows:

(a) The Supreme Court and all courts established by Act of Congress may issue all writs necessary or appropriate in aid of their respective jurisdictions and agreeable to the usages and principles of law.

(b) An alternative writ or rule nisi may be issued by a justice or judge of a court which has jurisdiction.

In regard to this case, the All Writs Act is being invoked because no specific legislation has been passed to govern how the government should handle strongly encrypted digital devices. Apple is, in turn, arguing that the FBI is overstepping its bounds with its interpretation of the law.

Is encrypting my data even legal?

Yes, encrypting your data is currently legal. There are bills floating in certain state legislatures at present that would make encryption illegal, but as Wired explains, those proposed laws would be nigh-impossible to enforce without a national ban. And cryptographers and security researchers all over the world oppose a national ban for the sake of easier law enforcement, arguing that it would greatly weaken the average individual's privacy and security in exchange for data on a small number of criminals.

Why is encryption important to me?

Encryption keeps your data safe. That includes your pictures, messages, locations, financial information, health data, and more. If this data isn't safe, it can be stolen and potentially used to access your money, to blackmail or humiliate you, to find evidence against you for crimes of which you haven't even been suspected, and more.
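
If it helps to see what "encrypted" actually means, here's a minimal sketch using CryptoKit's standard AES-GCM cipher (a generic demonstration, not a description of iOS internals): data sealed with a key is unreadable gibberish to anyone without that key.

```swift
import CryptoKit
import Foundation

// Generic demonstration of symmetric encryption, not iOS internals:
// without `key`, the sealed bytes are unintelligible ciphertext.
let key = SymmetricKey(size: .bits256)
let secret = Data("messages, photos, health data".utf8)

let sealed = try! AES.GCM.seal(secret, using: key)    // ciphertext + auth tag
let restored = try! AES.GCM.open(sealed, using: key)  // only `key` unlocks it

print(String(data: restored, encoding: .utf8)!)  // the original plaintext
```

Lose control of the key, and the protection evaporates; that's the whole controversy in miniature.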

What if you were the victim? Wouldn't you want Apple to help the FBI get any and all data possible?

More than anything. That said, we've heard this argument before: It's been used to rationalize other invasions of privacy, torture, and other extreme measures. It's also part of why victims don't get to direct the criminal justice process.

The FBI's San Bernardino case is an emotional one, and I don't think anyone at Apple has any great desire to prevent the bureau from doing its job. The company has always complied with legal warrants and subpoenas, and even has a page documenting government requests and what kind of information can be released.

But when making a decision that has the potential to impact millions of customers — most of them law-abiding — Apple has to think beyond the case at hand to the larger impact. The FBI may be pitching this as a one-time, specific-to-this-iPhone-5c unlock, but it would set precedent for decades to come.

As with all legal matters — even emotionally charged ones — it's important to take in the big picture. Once a process exists for one phone, that process could be replicated against others, legitimately or otherwise.

I don't get it. If it's only a one-time code and only Apple and the FBI have it, how could it affect my devices?

Two reasons: Precedent, and leaks.

We've talked about precedent above, but the one thing we haven't covered is international precedent: namely, what happens if the U.S. government wants to demand information from Samsung or another internationally-held company? Say an international company agrees to let the U.S. have a back door into their software. Once this happens, the chances are pretty great that the company's home country will want that back door, too, as will other international entities. You may feel comfortable letting the FBI look at the phone of a domestic terrorist, but what happens when it's Chinese intelligence deciding to take an American tourist's phone and hack it because they suspect spycraft?

As we wrote here on iMore:

Make no mistake, what is being asked of Apple should horrify not just those in the U.S. but around the world. Nothing made can be unmade. Nothing used once will only ever be used once. The moment after an easy way to brute-force passcodes exists we, none of us, will be safe. A few criminals may be more easily investigated, but catastrophically more people will be subject to unlawful searches, hacks, theft, blackmail, and other crimes. Everywhere.

Read Cook's letter again, but substitute the FBI for Chinese Intelligence. Imagine China, soon to be a bigger market for Apple than even the U.S., making this demand so they can more easily track and prosecute those they claim to be criminals. Then imagine it being used by governments at war with their own citizens. Now do it again, but this time with Russia's FSB. Or once more with the NSA.

Leaks are the other dangerous factor in this equation: If Apple is forced to comply with the FBI's order, chances are the company will be forced to do this again for other devices in the future, and for other agencies. It's easy for the FBI to say "task your software engineers on a custom version of iOS for this very high-profile terrorist case," but Apple doesn't have the resources to do this for, say, an NYPD case involving a missing person with the criminal's phone left at the scene of the crime. Chances are, government agencies would ask Apple and other tech companies to create a more free-form back door — one that could be used on more iPhone models.

Once that back door exists, every law enforcement agency in the U.S. is going to want it. And once every agency has access, the number of people with direct control over the code widens dramatically. From there, all it takes is one corrupt or angry official to put that information online.

We're a long way from that scenario becoming reality right now, but that's the big reason why Apple's fighting — to keep it firmly in the "what if" column.

If you want more info, Macworld contributor Rich Mogull wrote an incredible piece on the long-term implications of the FBI's request.

What's happening now?

Apple has very publicly rejected the FBI's court order and will likely appeal; from there, the case will likely bounce around the court system. In addition, the House of Representatives' judiciary committee is expected to hold a hearing on March 1 regarding the issue and has invited Apple, according to the Guardian.

There's also the question of whether Congress will act to pass a law that compels Apple — and any other tech company — to do what the FBI is demanding of them. In an interview with PBS, U.S. Senator Dianne Feinstein of California implied that the Senate would examine passing such legislation if Apple refused the court order.

What can I do to help?

First and foremost: Educate people who don't understand the issue. Security issues can be murky and hard to make sense of; by walking your friends and family through the conflict, you can help clear up what's going on.

You can also write to your U.S. senators, or to California Senator Dianne Feinstein of the Senate Intelligence Committee, who's calling for Apple to allow the FBI access.

In addition, both the Electronic Frontier Foundation and the ACLU are throwing their support behind Apple.

I still have questions about this whole thing.

Pop them in the comments and we'll try our best to help.

Rene Ritchie
Contributor

Rene Ritchie is one of the most respected Apple analysts in the business, reaching a combined audience of over 40 million readers a month. His YouTube channel, Vector, has over 90,000 subscribers and 14 million views and his podcasts, including Debug, have been downloaded over 20 million times. He also regularly co-hosts MacBreak Weekly for the TWiT network and co-hosted CES Live! and Talk Mobile. Based in Montreal, Rene is a former director of product marketing, web developer, and graphic designer. He's authored several books and appeared on numerous television and radio segments to discuss Apple and the technology industry. When not working, he likes to cook, grapple, and spend time with his friends and family.

28 Comments
  • And here I was under the impression that our iCloud back ups were encrypted. Guess I'll be turning that off now and deleting my backups.
  • So were the employees at my local Apple Store. I didn't argue, but to my knowledge they are wrong. iCloud backups are password protected, but not password encrypted, which was the employee's understanding. (I could be wrong)
    What's not clear in this article is exactly what data the FBI got, and still wants, from the iCloud backup; and whether any part of it is in fact encrypted.
    I know we can set our local iTunes backups of iOS devices to be encrypted, which requires a new, separate password for the encryption.
    Obviously iCloud backup doesn't require this, but I have no way of verifying whether it is encrypted with our iCloud password, as the Apple employees stated. Even if it is, if someone has your password to buy apps, then they have access to all of your data. Sent from the iMore App
  • Doesn't this page indicate iCloud data is encrypted? https://support.apple.com/en-us/HT202303 Or does this mean it's encrypted, but Apple has the master key?
  • Yeah I read that and thought the backups were encrypted also.
  • I've always understood that your data is fully encrypted, even the backups however Apple does have the ability to decrypt icloud backups. If you don't want the government to get access to your data then turn off icloud backup but if you lose phone or it breaks ALL your data is toast. The fact Apple is taking a firm stand gives me the confidence that my data is fully protected to the best of their ability & that ability is clearly superior to the federal gov't. While Apple could decrypt my data by court order I really have nothing to hide. I just don't want criminals to decrypt my phone if stolen.
  • Your backups are encrypted, but Apple (and Apple alone) can decrypt certain information when presented with a warrant. Here's more info about that: https://www.apple.com/government-information-requests/
  • So no different than all of the other explanations against the judge's request. Everyone says they shouldn't do it because it sets precedent; meanwhile, it still goes on with segments of Apple data.
  • Slightly different in that there's no way for anyone to decrypt those backups outside Apple Corporate; with a back door in place, anyone with the knowledge of the back door or access to the exploit code could hypothetically get at your information. A back door could also open up the data stored on the Secure Enclave (Touch ID fingerprints, Apple Pay, Health data, etc) that's not backed up to iCloud.
  • Backdoor shmack door. They already decrypted his personal stuff with a MASTER KEY. The FBI wants them to one-time disable the security-wipe feature. Also doable with their MASTER KEY. Zero difference. Apple has the ability to do both already and is trying to use a term like "back door" to simplify a very complex situation. I get the fear of precedent. But it is not so simple, people. They want it this once. Yeah, they may ask again. But most people want to keep privacy a high priority. No law would pass giving blanket access at will without catastrophic consequences to the whole system. It would never fly. Saying no to it (the FBI) in this situation, or ones like it, builds up an unnecessary argument for the opposing side of that privacy. Don't give the other side fuel for laws that attack that right for law-abiding citizens. Sent from the iMore App
  • RE: "No law would pass giving blanket access at will without catastrophic consequences to the whole system. It would never fly." The FBI (and NSA, CIA, HIV, etc) want to get a law passed, hence the current brouhaha. If everyone were sane, it would never fly, but let's not forget that there's currently a presidential candidate making hate statements - some of which are at levels we haven't heard since Hitler - against all sorts of religious, ethnic and rights groups and has millions of followers, I'd say sanity is at an all time low amongst the American people.
  • They are encrypted. The phone didn't belong to the terrorist, it was owned by the County, as was the iCloud account they reset the password for.
  • I truly don't have anything to hide so I'm not really worried about anyone wanting to go through my phone. The main thing that I'd prefer to avoid is criminals being able to obtain sensitive data from my phone. But according to Apple none of my bank information is stored on my phone so that shouldn't be an issue. What are the other items that could cause problems?
  • Your bank info may not be stored on your phone, but if your Social Security number is, you're at risk. If your username or password is in Notes or Messages, you'd also be at risk. Phishing has become stupidly sophisticated in the last few years, and hackers can get access to your bank records with a smattering of personal information and lots of lies. If Apple were asked to compromise the Secure Enclave, it would put your fingerprint, Apple Pay, and credit card info at risk locally (if someone physically stole or took your device).
  • Everybody has something to hide.... you don't want someone getting access to your account, writing your emails, sending your texts and reading the answers, taking your phone calls, and using all of this to steal your identity. Maybe it's not the mail stored on your phone, or your music, but your identity? And if you still don't know what I am talking about: how do you request new passwords, or new access to accounts and to the data you administer? Today you often do this via SMS or email verification. Your phone will give it to me.
  • One thing they are asking for is to eliminate the time delay between passcode attempts.
    Somewhere else I read that this lives in the Secure Enclave but can be changed by flashing it.
    We can expect new phones soon. Want to bet the time delay becomes non-flashable?
    Making that change and advertising it should sell a *LOT* of phones.
  • So can someone explain the difference between this situation and the iCloud celebrity picture leak last year? Seems like if a hacker can steal those photos, it wouldn't be that difficult for the FBI to hack into that phone, but maybe these are completely different.
  • I think the celebrity thing was a case of people guessing their iCloud passwords correctly based on stuff they knew about them.
  • "We the People" petition has started over at https://petitions.whitehouse.gov/petition/apple-privacy-petition
  • The petition is growing slightly faster than the petition on legalizing cockfighting which you will also find on the same site.
  • Who cares.. It's good if the FBI and other agencies can access phones and other tech items in order to maybe prevent future terrorist acts. Don't see what the fuss is about, only criminals are arguing against this. It's not like they will take my phone and just search through it, they will only do so if they need to. Sent from the iMore App
  • It's not as simple as that.
    Before I upgraded my iPad, my older tablet had my Social Security number and other important information stored on it; information that would be extremely problematic if it were lost or obtained by anyone.
    If Apple creates a backdoor, yes, it will be able to unlock the terrorist's phone, allowing the FBI to search for information. But if Apple does that, then other countries will strive to gain this software. China, Korea, Russia, all of them will demand decryption from Apple or pay millions or billions for someone to provide that backdoor software. It will not be a 'one time' use as the FBI says. The FBI has proved many times that it cannot be trusted in situations such as this. They will keep using it until the day arrives when even a local deputy will be able to unlock your iPhone on demand.
    This backdoor will be available for hackers, as Cook said, to use and exploit. If I was a hacker nothing would please me more than to get that 'master key' and use it. If a hacker hacks through the backdoor, identity thefts would rise, email accounts, bank accounts would be hacked, it would lead to havoc.
    Also, this can be seen as a moral debate: human individual rights vs. a government order. So to answer your question "who cares?" Millions of people care. No matter what people may believe, and despite that it was a terrorist attack, albeit a really small one, national security is not worth giving up our human right to privacy, especially on our smartphones. Too much risk is involved, as this affects millions of people. It's like being forced to develop a key to open a safe that may or may not have information to solve a small crime, when that safe also holds millions of people's private, important information, information that some people's jobs or even lives actually depend on, that could get loose. So it's much, much more than just 'so what if the government wants to spy, I've got nothing to hide?' There is a much bigger picture. As Ben Franklin said, "They who can give up essential liberty to obtain a little temporary safety deserve neither liberty nor safety." And in these past two or three days it has been sad to see how many Americans were willing to give up their human right, their American constitutional right and freedom, all to unlock a phone that may or may not be helpful, for the illusion of national security, because two shooters attacking San Bernardino got people too emotional to see beyond this one case and quick to say that Apple is helping the terrorists.
  • ...and this is why you guys are some of the best tech iJournalists today.... Thank you.
  • Just for privacy we've developed MultiPasswords for iOS, so you can keep passwords, photos, and videos secure on your iPhone or iPad. The USA and Mexico are the countries with the most downloads. Sent from the iMore App
  • "Then imagine it being used by governments at war with their own citizens." That's easy to imagine because that's what's happening here. Make no mistake, "bad guys" can use their own encryption. This is about control.
  • So they're trying to stuff the genie back in the bottle: "Apple unlocked at least 70 iPhones before refusing to hack into terrorist’s device"
    http://www.nydailynews.com/news/national/apple-unlocked-70-iphones-refus...
  • Yes, iCloud backups are encrypted. How else could health data from a device that can track health data be backed up? Think about existing laws and regulations regarding storing private personal financial and health data. The other issue that a lot of people are missing is that the FBI could easily enlist the NSA's help to access the data. The problem is that the NSA is foreign intelligence and the FBI is domestic intelligence; the FBI can't use the NSA's help as a result. As for the All Writs Act, the reason law enforcement is reaching back so far is because of existing security and privacy laws and regulations that companies have to comply with in order to secure, store, and protect private information. The All Writs Act precedes all of these and would negate them. It's all law enforcement would need for precedent. Sent from the iMore App
  • I said this in another article, but I wanted to say that I'm pleased to see that iMore is taking a hard line stance on privacy protection.
  • Just repeating Apple's public statements about this isn't news; it is acting like an unpaid employee of Apple, kissing their feet, and bowing to Cook for a scrap of his approval. I worked at Apple for 10+ years (enjoyed every minute), and you may be assured that Apple doesn't do ANYTHING that doesn't contribute to the bottom line. In this case, Apple is afraid sales in China (a HUGE market) will tank if it unlocks the phone. That's all there is to this, folks: Apple is trying to protect sales in China. Chinese consumers will abandon the iPhone if they think Apple will unlock it after a government request.