First, some background: An iPhone 5C used by a gunman, Syed Farook, in the San Bernardino attack was recovered by the FBI. The phone — which was running iOS 9 — was locked with a passcode. The FBI believes the phone contains information that could help the ongoing investigation. Although Apple has turned over digital data in its possession related to the gunman’s account, the bigger problem is that authorities cannot unlock the phone.
Moreover, the security provisions built into iOS 9 mean the FBI can’t simply guess the passcode. If someone enters incorrect passcodes again and again in the hope of eventually finding the right one, the phone may erase all of its data after too many attempts.
Putting aside the broader legal and ethical questions about what the FBI is asking Apple to do — and Apple’s pushback on this goal — is the request the FBI is making to Apple even technically feasible?
For this particular phone and this particular circumstance, security and iPhone experts seem to think it is.
The technical challenge
As with virtually all practical technical questions, both the hardware and software matter. In this case we’re talking about an iPhone 5C and iOS 9. The latter is actually the bigger problem here, but more on that in a minute.
As stated, iOS security features lock someone out of an iPhone if an incorrect passcode is entered 10 times in a row. By default, the phone locks itself for increasing amounts of time before it will accept passcodes again, but if the owner has enabled the auto-wipe feature, the phone will erase all of its data.
On older iPhones and versions of iOS, it was possible to bypass the auto-wipe feature. There were also forensic tools available — and frequently used by law enforcement — that could allow someone with possession of the phone to enter passcodes in rapid succession to speed up the unlock process, basically what’s called a “brute force” attack.
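The lockout behavior described above can be sketched as a simple state check. This is an illustrative model, not Apple’s actual code; the thresholds roughly follow Apple’s published iOS lockout schedule, and all names are ours.

```python
# Illustrative sketch of the iOS-style lockout logic -- not Apple's
# implementation. Delay values roughly match Apple's documented schedule.
LOCKOUT_DELAYS = {5: 60, 7: 300, 8: 900, 9: 3600}  # failed attempts -> delay (seconds)
WIPE_THRESHOLD = 10

def delay_after(failed_attempts: int, auto_wipe: bool) -> str:
    """Return what the device does after this many consecutive failures."""
    if failed_attempts >= WIPE_THRESHOLD:
        # With auto-wipe on, the 10th failure erases the phone's data.
        return "wipe" if auto_wipe else "lock"
    delay = LOCKOUT_DELAYS.get(failed_attempts, 0)
    return f"delay {delay}s" if delay else "allow retry"
```

Those escalating delays, and the wipe at attempt 10, are exactly what makes a naive brute-force attempt impractical on a stock device.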
Since iOS 7 came out in 2013, Apple has made changes to the way its unlock system works. Although some forensic tools can reportedly still help unlock iOS devices running iOS 8.4 or lower, they won’t work on devices running iOS 9. Moreover, even though the iPhone belonged to the San Bernardino County Department of Health (it was a work-issued phone), it doesn’t appear there was any mobile device management (MDM) software running on the phone that would have allowed IT to override a user-set passcode.
From the government’s perspective, the only way to unlock this iPhone without the passcode is for Apple to craft special software that makes a brute-force attack possible without fear of a security wipe, without delays, and with an input mechanism other than physically tapping in combinations.
What the FBI is asking Apple to do
Dan Guido, the CEO of Trail of Bits, an information security startup in New York City, wrote a blog post explaining exactly what the FBI is directing Apple to do, and why only Apple can do it.
He writes:
In plain English, the FBI wants to ensure that it can make an unlimited number of PIN guesses, that it can make them as fast as the hardware will allow, and that they won’t have to pay an intern to hunch over the phone and type PIN codes one at a time for the next 20 years — they want to guess passcodes from an external device like a laptop or other peripheral.
Guido continues:
the FBI wants Apple to create a special version of iOS that only works on the one iPhone they have recovered. This customized version of iOS (ahem FBiOS) will ignore passcode entry delays, will not erase the device after any number of incorrect attempts, and will allow the FBI to hook up an external device to facilitate guessing the passcode. The FBI will send Apple the recovered iPhone so that this customized version of iOS never physically leaves the Apple campus.
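To put rough numbers on “as fast as the hardware will allow”: Apple’s iOS security documentation calibrates the passcode key derivation to take roughly 80 milliseconds per attempt on the device, a hardware-bound cost that no firmware change removes. With the software delays gone, the worst-case math is easy to sketch (figures approximate):

```python
# Back-of-the-envelope timing for the brute-force scenario Guido describes.
# ~80 ms is Apple's documented per-guess key-derivation cost; totals assume
# the software-imposed delays have been removed, as the FBI requests.
PER_GUESS_SECONDS = 0.08

def worst_case_hours(pin_digits: int) -> float:
    """Hours to try every possible PIN of the given length."""
    return 10 ** pin_digits * PER_GUESS_SECONDS / 3600

# A 4-digit PIN space (10,000 codes) falls in well under an hour;
# a 6-digit space (1,000,000 codes) takes on the order of a day.
```

That gap between “20 years of an intern typing” and “under a day by machine” is the entire point of the FBI’s request.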
As many jailbreakers are familiar, firmware can be loaded via Device Firmware Upgrade (DFU) Mode. Once an iPhone enters DFU mode, it will accept a new firmware image over a USB cable. Before any firmware image is loaded by an iPhone, the device first checks whether the firmware has a valid signature from Apple. This signature check is why the FBI cannot load new software onto an iPhone on their own — the FBI does not have the secret keys that Apple uses to sign firmware.
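The verify-before-boot flow Guido describes can be sketched like this. Apple actually uses an asymmetric (public/private key) signature scheme; the HMAC below is a symmetric stand-in so the example runs with just the standard library, and every name and value in it is invented.

```python
# Conceptual sketch of DFU-mode firmware verification. HMAC stands in for
# Apple's real asymmetric signatures; keys and images here are made up.
import hashlib
import hmac

APPLE_SIGNING_KEY = b"secret-known-only-to-apple"  # stand-in for Apple's private key

def sign_firmware(image: bytes) -> bytes:
    """What Apple's build process does: produce a signature over the image."""
    return hmac.new(APPLE_SIGNING_KEY, image, hashlib.sha256).digest()

def device_will_boot(image: bytes, signature: bytes) -> bool:
    """What the device does in DFU mode: refuse images without a valid signature."""
    expected = hmac.new(APPLE_SIGNING_KEY, image, hashlib.sha256).digest()
    return hmac.compare_digest(expected, signature)

official = b"official firmware image"
assert device_will_boot(official, sign_firmware(official))
# A third party without the key cannot forge a signature the device accepts:
assert not device_will_boot(b"modified image", sign_firmware(official))
```

Because only Apple holds the signing key, only Apple can produce a firmware image the phone will accept, which is why the court order targets Apple rather than a forensics vendor.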
Is this possible?
So — can Apple do it?
According to Jonathan Zdziarski, a forensics expert who literally wrote the book on iPhone forensics, yes, this is absolutely possible.
On his blog, Zdziarski lays out exactly what Apple can do — in technical terms — to comply with the court order.
He says, “Apple can, on a technical level, comply with the court’s order to brute force the PIN on an iPhone 5C.”
Part of what the FBI is requesting is custom iPhone firmware that will allow it to bypass the auto-wipe functionality and the delays between trying new PIN entries.
According to Zdziarski:
Apple has firmware signing capabilities for all of their devices, and are the only ones in the world that can boot custom software without exploiting a device.
Firmware updates run as a RAM disk on iOS devices, which is similar to booting off of a USB stick.
Apple CAN write a custom RAM disk (as a “SIF”), sign it, and boot it on any iOS device from restore or DFU mode to run from memory.
Which iPhones this affects
It’s important to note that this particular vector — or backdoor — would likely only apply to the iPhone 5C and below. That’s because the iPhone 5S and higher come equipped with Touch ID. Part of Touch ID includes a Secure Enclave, a part of the hardware that’s designed to resist tampering, and that hardware affects what can be done to the phone.
As Guido writes, “since the iPhone 5C lacks a Secure Enclave, nearly all of the passcode protections are implemented in software by the iOS operating system and, therefore, replaceable by a firmware update.”
How would this be different with an iPhone 6?
I talked to Jonathan Zdziarski about whether the FBI’s approach to breaking into the iPhone 5C would work on an iPhone 6 or higher device.
The short answer is that although this specific method probably wouldn’t work on newer devices, that doesn’t mean it’s impossible to design one that would. It’s conceivable Apple could craft firmware that disables parts of the system, bypassing some or all of the security measures built into the Secure Enclave.
Zdziarski told me that on newer devices, “the PIN delay is built into the Secure Enclave.” In other words, for the iPhone 5S or later, it wouldn’t be as simple as writing custom software to bypass the passcode delay, because the delay is enforced by the Secure Enclave itself.
As we’ve seen with Error 53, the Touch ID sensor is paired to specific hardware. That would make creating specific software for just that phone much more difficult — exactly what Guido also asserts.
But Zdziarski says the PIN detection “may be dependent on the system clock, which can potentially be changed by software.” He also says that “if the microcode inside the Secure Enclave can be updated to remove those delays (without wiping the enclave) then that may also work.”
In other words, even if the process for gaining access to an iPhone 6 is different, it may still be possible.
Is this really restricted to just one phone?
The official word from the FBI and the court is that this is a request specific to this phone and this circumstance. That’s partially because the custom firmware Apple is being asked to create could only work with the software keys on this particular iPhone 5C.
But security experts point out that creating this sort of firmware would be a repeatable activity. As Jonathan Zdziarski pointed out on Twitter, the only technical aspect limiting the custom firmware to one specific iPhone 5C would be the universally unique identifier (UUID) of the phone, which Apple would hard-code into the firmware. That, as he points out, could be easily changed with a warrant.
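Zdziarski’s point about the one-device restriction can be sketched in a few lines. The identifiers here are invented; the point is that the restriction lives in a single hard-coded value set at build time.

```python
# Sketch of a firmware build locked to one device, per Zdziarski's point.
# Identifier values are made up for illustration.
HARDCODED_UUID = "target-5c-device-uuid"  # baked in when Apple builds the firmware

def firmware_runs_on(device_uuid: str) -> bool:
    """The custom firmware refuses to run on any device but the target."""
    return device_uuid == HARDCODED_UUID
```

Retargeting the firmware at another phone means changing that one constant and re-signing, which is why experts describe the work as repeatable rather than truly one-off.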
Apple’s previous compliance
Although this particular case is the most high-profile example of law enforcement reaching out to Apple about gaining access to a device, it is not the first time the company has been faced with this question.
The All Writs Act (AWA) is one of the laws the federal court has cited in order to compel Apple to create custom firmware for its phones. The AWA has been used before to compel phone makers — including Apple — to unlock phones.
The Daily Beast obtained court documents pertaining to a separate phone unlock case in New York. In that case, prosecutors are also asking Apple to unlock a phone, and as in San Bernardino, Apple is refusing.
Tellingly, however, prosecutors in that New York case say Apple has agreed to unlock phones 70 times for authorities since 2008.
This matches what Mashable has heard from security professionals, who say they know of instances where law enforcement has successfully unlocked phones with a court order.
The New York case differs from the San Bernardino case in one key way: that phone was running iOS 7, so Apple’s help wasn’t strictly necessary; plenty of forensic tools on the market could already unlock it.
As Apple has steadily increased security on iPhones, that has limited law enforcement’s technical options for unlocking them. Now they’re turning back to Apple to give them access again. Looking purely at the technology, it appears possible. How such a precedent will affect iPhone security in the long term, however, is harder to picture.