So recently, the FBI obtained a court order to have an iPhone compromised for an investigation.
The issue is thus:
- The iPhone is locked with a 4 digit passcode and the FBI doesn’t know what that passcode is.
- The iPhone’s data is encrypted – so they can’t just yank out the flash memory and attempt to read the contents. The passcode is required, via the operating system on the iPhone, to decrypt that data.
- Because 4 digit codes aren’t really very secure (only 10,000 possible combinations), iOS will gradually force longer and longer delays between failed attempts to unlock the phone. (Edit: As Kieran points out below, passcodes in recent versions of iOS can be up to 6 digits, or 1,000,000 combinations)
- As an added layer of security, a user can set their iPhone to wipe its data after 10 consecutive failed attempts.
The FBI wants the data on that phone. But the process of brute-forcing an unlock might wipe that data, and even if it doesn’t, it will still take a long time with the lockout delays and manual passcode entry. So a US federal magistrate has ordered Apple to do whatever is necessary to work around these safeguards so the FBI can access the data quickly and safely.
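To get a feel for why the delays matter so much, here’s a rough sketch of the worst-case time to try every 4 digit code by hand. The delay schedule below is an assumption for illustration (roughly the shape of escalation iOS uses), not Apple’s documented values:

```python
# Illustrative estimate of brute-force time against a 4-digit passcode
# with iOS-style escalating lockout delays. The schedule below is an
# assumed approximation, not Apple's documented values.

def total_delay_minutes(attempts: int) -> float:
    """Sum the lockout delay (in minutes) incurred over `attempts` tries."""
    total = 0.0
    for n in range(1, attempts + 1):
        if n <= 4:
            total += 0      # first few attempts: no delay
        elif n == 5:
            total += 1
        elif n == 6:
            total += 5
        elif n <= 8:
            total += 15
        else:
            total += 60     # one hour per attempt thereafter
    return total

# Worst case for a 4-digit code: all 10,000 combinations.
minutes = total_delay_minutes(10_000)
print(f"~{minutes / 60 / 24:.0f} days of lockout delay alone")  # ~416 days
```

Over a year of delay in the worst case – and that’s before the intern has typed a single digit, and assuming the phone hasn’t wiped itself at attempt ten.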
They’re refusing on the grounds that (among other things) this will create a “backdoor” that will compromise all iPhones ever.
This is not true. iPhones are already all compromised. The backdoor is Apple.
Apple has an unsettling fixation on keeping control of all of their devices (and you thought the $1,100 you spent meant it was your device) – as a result, they go to great lengths to prevent and discourage jailbreaking their devices. They are also (apparently) able to force a device to update its system files without being unlocked or wiped.
And that’s the backdoor.
Somewhere, Apple has the keys necessary to create an iOS update that iPhones will cheerily install without being unlocked or factory reset. And that update can be… anything. In this case, the FBI wants it to remove the delay on failed passcode attempts and circumvent the auto-wipe when 10 consecutive attempts fail. They’d also really like to be able to enter passcodes electronically (so the intern doesn’t have to sit there typing 0-0-0-0, 0-0-0-1, 0-0-0-2… ummm what was that last one again?).
That can all be done – provided you have Apple’s keys (presumably for digitally signing such an update).
The fact that they’ve allowed this possibility is irresponsible and reckless, regardless of how much they resist the order to roll over for the authorities. It should not be possible for a locked, encrypted phone to install a system update.
So what would be better?
If you’re in the market for a personal electronic device that isn’t vulnerable to this… well… too bad. At some level you’ll need to trust the manufacturer to be responsible about your security.
Apple isn’t trustworthy.
Edit:
To clarify, this isn’t an “Android is better than iOS” post – it’s not. There isn’t a similar device I can think of which would definitely be any harder to break. But Apple have left themselves open for these kinds of orders by insisting on allowing for their own updates to be installable on locked, encrypted devices.
Once encryption is enabled, it should not be possible to make changes to the OS without unlocking or wiping it. It’s not like this backdoor would have been a surprise to the engineers involved; yet the fact that Apple is capable of compromising their devices would likely surprise their users (and has). Apple’s “customer letter” is disingenuous – the FBI aren’t overreaching; Apple failed to account for the backdoor they deliberately left on their devices.
The crux of the argument is that Apple can install system files on a locked phone without user authorisation. You got a citation for this? Also, recent iOS devices on iOS 9 have a six digit code.
DFU mode allows for the installation of Apple-signed firmware on an otherwise locked iPhone.
The clincher is that Apple haven’t simply stated, “this is not technically possible” – they are instead fighting it through the courts.
6 digits or 4 digits doesn’t make a significant difference; should Apple be forced to comply, a software method of entering codes won’t care whether it has to brute-force 10,000 or 1,000,000 combinations – provided the delay and the data-wipe failsafe are disabled. But fair call – I’ll update the post.