You might have read the Apple CEO's open response to the U.S. government earlier today. The government is demanding access to an iPhone that was involved in the San Bernardino attacks. Here's the original court order Apple is facing. In short, they want a "backdoor", which Tim Cook strongly opposes. But what exactly is the FBI demanding here, how does all this actually work, and why is it such a big deal?
The iPhone uses "Full Device Encryption". It does what it says: it encrypts the full disk, everything, from top to bottom, with a key. But do you remember ever entering that key? Me neither. And where is it stored? Here's where Apple did some clever things. Bear with me.
For any encryption to be secure, the key would have to be entered every time the user wants to access the encrypted data. And for the key to be secure against someone who just rips out the chips and tries all the possibilities, it has to be long. And it is: it's a 256-bit AES key.
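To get a feel for why attacking the key itself is hopeless, here's a minimal Python sketch. It's illustrative only: the real device key comes from dedicated hardware, not from software like this.

```python
import secrets

# A 256-bit key is just 32 random bytes. Illustrative only --
# the real device key is generated by dedicated hardware.
device_key = secrets.token_bytes(32)

print(len(device_key) * 8)  # 256 bits
print(2 ** 256)             # number of possible keys
```

2^256 is a 78-digit number. Trying keys one by one is simply not an option, which is why the attacks discussed below go after the passcode instead.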
Actually, this device key is generated on the device and stored in a part of memory that can be securely erased -- which is not a given with today's memory technology. The key is also entangled with the device's unique ID, so if the chips are transferred to another device, the data on them is unusable. But if the encrypted data is on the device and the key is too, how can that possibly be secure? The answer is: on its own, it's not.
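A sketch of that entanglement idea, with hypothetical names and HMAC standing in for the hardware function Apple actually uses:

```python
import hashlib
import hmac

def storage_key(device_uid: bytes, raw_key: bytes) -> bytes:
    # Entangle the raw key material with the device's unique ID, so the
    # same stored bytes yield a different working key on any other device.
    # (HMAC is an illustrative stand-in for the real hardware operation.)
    return hmac.new(device_uid, raw_key, hashlib.sha256).digest()

raw = bytes(32)  # identical stored key material on both devices
key_a = storage_key(b"device-A-unique-id", raw)
key_b = storage_key(b"device-B-unique-id", raw)
assert key_a != key_b  # moving the chips gets you nothing usable
```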
"In case something goes wrong, the key is erased -- and immediately, everything is unreadable. This is one thing that the FBI wants to disable."
The device key's purpose is not to protect your personal data. It's there so the data cannot be read outside of the device, and it acts as a kill switch for quickly erasing everything if necessary. At Berlin's Teufelsberg surveillance station, an NSA outpost during the Cold War, they are said to have kept some document stashes above a pool of acid: if the Cold War ever got hot, the documents could be dropped into the acid in seconds. Think of device encryption as a mechanism like that. If something goes wrong, the key is erased -- and immediately, everything is unreadable. This is one thing the FBI wants to disable.
But what protects your personal data then? It's a second, very sophisticated mechanism of interdependent keys, "protection classes" and your passcode.
Each file in the "User Partition", which is basically everything that isn't part of the operating system, is encrypted with a separate individual key. Which key? Glad you asked.
An iOS device comes with a security module that holds a key which was set during the device's production and which cannot be read even by the most low-level software on the phone. It can, however, be used to generate or sign other keys. Your file keys have to go through this module in order to be used, so any decryption has to happen on your device. This way the system, which still has to be active during an attack, can limit the attempts to get to the keys. This is important.
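Here's a toy model of that idea, with hypothetical names. The point is not the cryptography (HMAC stands in for the real key operation) but the fact that every decryption passes through one place, which is what makes throttling possible:

```python
import hashlib
import hmac
import secrets

class SecurityModuleSketch:
    """Toy model: the module holds a key that is fused in at production
    and never leaves it. Every key operation has to pass through here,
    which is what lets the OS count -- and throttle -- attempts."""

    def __init__(self) -> None:
        self._fused_key = secrets.token_bytes(32)  # unreadable from outside
        self.attempts = 0

    def derive(self, wrapped_file_key: bytes) -> bytes:
        self.attempts += 1  # every use is observable, so it can be limited
        return hmac.new(self._fused_key, wrapped_file_key,
                        hashlib.sha256).digest()

module = SecurityModuleSketch()
file_key = module.derive(b"some wrapped file key")
print(module.attempts)  # 1 -- the OS sees, and can slow down, every try
```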
"If Snowden proved anything, then it's very likely that something like this would be misused on a big scale."
All these things (and others) make sure the OS is always in control and not replaced by something else, but they don't add anything that the phone does not "know". This is where your passcode comes in. The individual file keys can be wrapped with your passcode. The longer and more complicated it is, the better. The standard used to be 4 digits, iOS 9 defaults to 6, but you can use any number of digits or even a "normal" password with whatever you find on a keyboard.
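The wrapping can be sketched like this -- PBKDF2 is a stand-in here, since iOS uses its own tangling function, but the idea is the same: stretch the passcode and mix in the device-unique ID, so guessing has to happen on the device itself.

```python
import hashlib

def passcode_key(passcode: str, device_uid: bytes) -> bytes:
    # Stretch the passcode and tangle it with the device-unique ID.
    # (PBKDF2 is an illustrative stand-in for Apple's own function.)
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(),
                               device_uid, 100_000)

wrap_a = passcode_key("123456", b"device-uid")
wrap_b = passcode_key("123457", b"device-uid")
assert wrap_a != wrap_b  # one digit off yields a completely different key
```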
And some ingredients to the file keys are things that make them individual to a file, so finding a "master" that unlocks them all once you've managed to unlock one is much harder. Essentially, to access a file you need the device's file system key, the individual file key, and the passcode. Quite a hard nut to crack!
But say you have full access to the OS on the original device and can manipulate everything in it, even the lowest-level firmware that controls the system. This is the second thing the FBI wants: a "special version" of the OS, which Apple has to sign, because nobody else can. And Apple refuses to do so.
Even with this special version they still don't have the passcode; they still have to guess it, and all of this has to happen on the device.
Currently the OS requires you to enter the passcode by hand to unlock your phone. After the 4th wrong attempt you have to wait 15 seconds, and from the 10th, you have to wait an hour for your next try. There are a million possible 6-digit passcodes (the new standard) to try, and longer alphanumeric passwords are even crazier to guess. And if the user set it up that way, after the 10th failed attempt the device key is dropped into the acid bath and there's no point trying after that.
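Some back-of-the-envelope arithmetic, assuming the roughly 80 ms per key derivation that Apple's whitepaper cites, and ignoring the escalating delays (which make things far worse for an attacker):

```python
SECONDS_PER_TRY = 0.08  # ~80 ms per attempt, per Apple's whitepaper

# All-digit passcodes
for digits in (4, 6):
    combos = 10 ** digits
    hours = combos * SECONDS_PER_TRY / 3600
    print(f"{digits} digits: {combos:,} codes, ~{hours:,.1f} hours to try all")

# A 6-character passcode from lowercase letters and digits
combos = 36 ** 6
years = combos * SECONDS_PER_TRY / (3600 * 24 * 365)
print(f"6-char alphanumeric: ~{years:.1f} years")
```

Even a 6-digit code takes close to a full day of nonstop on-device guessing, and a short alphanumeric passcode pushes the worst case into years -- before the forced delays and the 10-attempt wipe are taken into account.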
"It's literally Pandora's box: Once you open it, even a bit, there's no way back."
A special version like the one the FBI demands could disable all that and provide an automated way to try a LOT of passcodes per second. This is the backdoor they are asking for.
There's a lot of fundamental criticism of such an approach. First, anything that Apple provides to the U.S. government, other governments will want too. And Apple can't do without the Chinese market anymore. (The UK? Maybe. They could try...) And if Snowden proved anything, it's that something like this would very likely be misused on a big scale. The complicated encryption mechanisms that iOS puts in place are what keeps the lid on users' privacy. And it's literally Pandora's box: once you open it, even a bit, there's no way back.
Here's how to make your iPhone more secure
Use a long passcode. If you don't use a passcode, you have NONE of this protection. Formerly, 4 digits were required; now it's 6, and you can use as many as you want. And if you stick to digits, no matter how many, you still get the convenient number keypad that you can operate while walking, instead of the tiny keyboard.
Use Touch ID. Laziness is the biggest enemy of security, and we're all lazy. Touch ID is a convenient way to store your passcode on top of the system I just described, and it makes long, secure passcodes convenient. It saves and enters the passcode for you, but if you use the wrong finger 5 times, it forgets the passcode completely and you have to re-enter it. If you're in a place where you worry about security, use the wrong finger 5 times, and Touch ID is completely deactivated until you enter the passcode manually again.
Use encrypted backups. Files have different protection classes -- actually another wrapper for the individual file keys. It's beyond the scope of this article to describe protection classes in detail, but it's worth knowing that they have implications for security. You can have files that are only accessible to the system while the phone is unlocked. Or you can have files that are excluded from unencrypted backups, or from backups entirely. So if you back up over iTunes, there's actually more getting backed up if you check the "encryption" checkbox. And if you use iCloud Backup, be aware that it can't be protected in a way that keeps Apple from handing it to the authorities if they ask for it.
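For quick reference, these are the four protection classes from Apple's whitepaper; the names are Apple's, and the one-line summaries are my simplifications:

```python
# The four Data Protection classes and, roughly, when each file's key
# is available. Summaries are simplified.
protection_classes = {
    "NSFileProtectionComplete":
        "readable only while the device is unlocked",
    "NSFileProtectionCompleteUnlessOpen":
        "files already open keep working after the device locks",
    "NSFileProtectionCompleteUntilFirstUserAuthentication":
        "readable any time after the first unlock since boot",
    "NSFileProtectionNone":
        "protected by the device key only, not by the passcode",
}

for name, summary in protection_classes.items():
    print(f"{name}: {summary}")
```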
There are a lot more things to say about how the security architecture of the iPhone works, and Apple did a very good job in explaining them in their iOS Security Whitepaper. If you're interested in these things, I'd recommend you read it.
This post originally appeared on Medium.