Some have called it the battle for our future: the clash between the world's biggest tech company and the world's biggest government.
But first, some background: on December 2nd, 2015, gunman Syed Farook and his wife, Tashfeen Malik, opened fire in a terrorist attack in San Bernardino, California, killing 14 people and injuring 22. After the shooting, the couple fled in an SUV, only to be found hours later and killed in a shootout with the police. The FBI seized an iPhone 5c running iOS 9 and locked with a passcode. The bureau believes that the phone holds information vital to the investigation, and it is pushing Apple to take unprecedented measures to crack the device.
A federal judge has issued a court order requiring Apple to build a backdoor that would allow the FBI to hack the iPhone of the San Bernardino shooter. Apple says there is no guarantee that such a backdoor - which does not currently exist - would be used for this case alone, and warns that it would allow the government to spy on anyone with an iPhone. The company will appeal. While the legal process will likely take months, it's worth understanding why this is important not just for the personal data of everyone with an iPhone, but for the personal data on any phone, period.
"If the government can use the All Writs Act to make it easier to unlock your iPhone, it would have the power to reach into anyone’s device to capture their data. The government could extend this breach of privacy and demand that Apple build surveillance software to intercept your messages, access your health records or financial data, track your location, or even access your phone’s microphone or camera without your knowledge,"
Tim Cook said in an open letter to Apple customers explaining the situation.
What Apple is opposing here is Big Brother, in a very real, modern reincarnation.
First, though, let's try to understand why the all-powerful FBI finds it impossible to break into the San Bernardino shooter's iPhone on its own and has gone to the trouble to require Apple's assistance.
iPhone security 101
It's important to know that iPhone security can be roughly divided into two eras: pre-iPhone 5s (aka pre-Touch ID) and post-iPhone 5s.
With the introduction of its Touch ID fingerprint scanner, Apple overhauled iOS system security, making its platform much more secure. Before we dive into the details, we should clarify that the San Bernardino shooter used an iPhone 5c, the one the FBI now has. It is an old phone from the first, pre-iPhone 5s era of security. Yet the FBI finds it impossible to crack even this phone within a reasonable amount of time.
This brings us to the core features of iPhone security.
There are three key protections on iOS that prevent the FBI from breaking into the San Bernardino shooter's iPhone:
- iOS may completely wipe the user’s data after too many incorrect PIN entries
- PINs must be entered by hand on the physical device, one at a time
- iOS introduces a delay after every incorrect PIN entry
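To make the interplay of these protections concrete, here is a minimal sketch in Python. The delay schedule and the 10-attempt wipe threshold are assumptions based on public reports of iOS 9's behavior, not Apple's actual code:

```python
# Simplified model of iOS 9's software passcode protections on a
# pre-Touch ID iPhone. Delay values and the wipe threshold are
# assumptions drawn from public reports, not Apple's implementation.

ESCALATING_DELAYS = {5: 60, 6: 300, 7: 3600, 8: 3600, 9: 3600}  # seconds
WIPE_THRESHOLD = 10  # with "Erase Data" on, wipe after the 10th failure

def handle_failed_attempt(previous_failures: int) -> str:
    """Return the action taken after one more incorrect PIN entry."""
    failures = previous_failures + 1
    if failures >= WIPE_THRESHOLD:
        return "wipe"                      # all user data is erased
    delay = ESCALATING_DELAYS.get(failures, 0)
    return f"delay {delay}s" if delay else "allow retry"
```

Because these checks live in software rather than in dedicated hardware, a modified iOS build could simply skip them - which, as we'll see, is exactly what is being demanded.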
What the FBI wants
As you'd expect, the court order (PDF here) asks Apple to remove all three protections, which would create a backdoor for the FBI to use to 'brute-force' the PIN code on the phone. Brute-forcing simply means that the FBI will hook the iPhone up to a powerful computer that quickly runs through all possible PIN combinations until it guesses the one the shooter used. Here is what the FBI wants Apple to do to allow it to brute-force the phone:
- Disable the iPhone function that wipes the phone after too many incorrect PIN entries
- Enable PIN input to happen not on the iPhone itself, but from another device, so that the FBI could have a computer doing this work
- Disable the delay so that the computer that guesses PINs can do this as fast as possible
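Some quick arithmetic shows why removing the delays matters so much. The ~80 ms per attempt cost is an assumption taken from security researchers' public estimates of the on-device key-derivation time; Apple does not document the true rate:

```python
# Back-of-the-envelope brute-force timing, assuming protections are
# disabled and each attempt costs ~80 ms of on-device key derivation
# (a researcher-cited estimate, not an official figure).

def worst_case_hours(pin_digits: int, seconds_per_attempt: float) -> float:
    """Time to try every possible PIN of the given length."""
    attempts = 10 ** pin_digits
    return attempts * seconds_per_attempt / 3600

print(round(worst_case_hours(4, 0.08), 2))  # 4-digit PIN: ~0.22 hours
print(round(worst_case_hours(6, 0.08), 2))  # 6-digit PIN: ~22 hours
```

With the protections stripped away, a 4-digit PIN falls in minutes and even a 6-digit PIN in about a day, which is why the FBI wants those three features gone.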
Two important notes here: some research firms claim they are able to hack into pre-5s iPhones running up to iOS 8.4, which supports the assumption that the iPhone 5c in question runs iOS 9. Also, encryption could be bypassed more easily on a phone that has not been powered down, since decryption keys may still be held in memory. This suggests that the FBI either allowed the phone to run out of battery, or obtained it powered down. In either case, all evidence suggests that the FBI cannot crack the shooter's iPhone on its own.
The FBI cannot crack the shooter's phone... so it wants to be able to crack everyone's phone
Put in simple terms, the FBI has ordered Apple to build a custom, signed version of iOS that would disable the protection that Apple itself implemented. The version will bypass passcode delays, won't wipe the phone after a few incorrect attempts, and will allow the FBI to hook up its computer to guess the passcode faster. This, by all means, is a backdoor.
So why can't the FBI build such code itself and flash it onto the iPhone? The reason lies in the way iPhone firmware updates work: they are flashed via Device Firmware Upgrade (DFU) mode. Once an iPhone is in DFU mode, you can load new firmware onto it over USB. However, before installing the firmware, the iPhone always checks whether the firmware file carries a valid Apple signature. Only Apple holds the signing keys, and this is why the FBI cannot simply load its own software onto the device.
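The signing check can be illustrated with a toy model. Apple's boot ROM actually verifies an RSA signature over the firmware image; the HMAC below is a deliberate simplification so the sketch runs on Python's standard library alone, and every key and image name in it is made up:

```python
# Toy model of DFU-mode firmware validation. Real iPhones verify an
# RSA signature in boot ROM; HMAC stands in here for simplicity.
import hashlib
import hmac

APPLE_SIGNING_KEY = b"only-apple-holds-this-key"  # hypothetical secret

def sign_firmware(image: bytes, key: bytes) -> bytes:
    """Produce a signature over the firmware image with the given key."""
    return hmac.new(key, image, hashlib.sha256).digest()

def device_accepts(image: bytes, signature: bytes) -> bool:
    """The phone recomputes the expected value and rejects mismatches."""
    expected = hmac.new(APPLE_SIGNING_KEY, image, hashlib.sha256).digest()
    return hmac.compare_digest(expected, signature)

# Apple-signed firmware installs fine:
assert device_accepts(b"ios-9-update",
                      sign_firmware(b"ios-9-update", APPLE_SIGNING_KEY))
# An agency lacking Apple's key cannot produce an acceptable signature:
assert not device_accepts(b"fbios-backdoor",
                          sign_firmware(b"fbios-backdoor", b"guessed-key"))
```

This is the crux of the case: the lock is cryptographic, and only Apple holds the key that opens it.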
What if it was a newer iPhone: enter the Secure Enclave
Breaking into the phone, however, would have been even harder if the shooter had used a newer iPhone - the 5s, 6 or 6s.
With the introduction of Touch ID, Apple placed a separate hardware component, the poetically named Secure Enclave (SE), a separate computer (or co-processor, if you prefer) in the iPhone. The Secure Enclave handles file encryption, Apple Pay and Keychain Services. When you enter your passcode on a device with a Secure Enclave, the passcode is entangled with a key embedded in the SE, so in order to break into the phone, you now need both the passcode and this key. Keys from the Secure Enclave cannot be read by iOS in any way, which is why even a modified version of iOS would be of no help to the FBI - had the shooter used a newer iPhone.
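A simplified model of that entanglement: the unlock key is derived from both the passcode and a device-unique key that never leaves the chip. The UID value, iteration count and key-derivation function below are illustrative assumptions, not Apple's actual parameters:

```python
# Simplified model of passcode entanglement with a device-unique key.
# On real hardware the UID key is fused into the Secure Enclave and is
# never readable by iOS; all values here are illustrative assumptions.
import hashlib

DEVICE_UID_KEY = bytes.fromhex("00112233445566778899aabbccddeeff")  # hypothetical

def derive_unlock_key(passcode: str) -> bytes:
    # The result depends on BOTH inputs, so passcode guesses are only
    # useful when run on the device that holds the UID key.
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(),
                               DEVICE_UID_KEY, 100_000)

assert derive_unlock_key("1234") != derive_unlock_key("1235")
```

The consequence: even if an attacker copies the encrypted storage off the phone, brute-forcing it elsewhere is useless without the per-device key.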
Even if the FBI succeeds in forcing Apple to build a custom iOS version (FBiOS?), if it were dealing with a Touch ID iPhone, the FBI agents would not be able to crack the phone. The obstacle is that the Secure Enclave keeps its own, separate record of failed PIN attempts and mandates its own delays. After 9 failed PIN attempts, the SE introduces a 1-hour delay between attempts, making brute-forcing the passcode practically impossible.
However, since the San Bernardino shooter's iPhone 5c does not have this Secure Enclave chip, it relies only on software to dictate PIN attempt delays that prevent brute-force attacks. Hence, the FBI can order Apple to build such software, disable the delays and this would be enough to brute-force an iPhone 5c.
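The difference the hardware delay makes is easy to quantify, assuming the reported SE behavior of a 1-hour delay kicking in after 9 failed attempts (public reporting, not an Apple specification):

```python
# Worst-case brute-force time against a Secure Enclave that allows
# 9 free attempts and then enforces a 1-hour delay per attempt
# (assumed behavior based on public reports).
FREE_ATTEMPTS = 9
DELAY_HOURS = 1

def se_worst_case_hours(pin_digits: int) -> int:
    total = 10 ** pin_digits
    return max(total - FREE_ATTEMPTS, 0) * DELAY_HOURS

print(se_worst_case_hours(4))  # 9991 hours, i.e. over 400 days
```

Over a year for even a 4-digit PIN, and no custom iOS build can shorten it, because both the counter and the delay live inside the Secure Enclave itself.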
To illustrate the power of the Secure Enclave, you need to look no further than the recent scandal over 'Error 53'. 'Error 53' is a fatal iPhone error that appears when an iPhone has been serviced with a third-party Touch ID fingerprint scanner at an unauthorized repair center. Via the Secure Enclave, Apple has restricted each iPhone to work with a single, paired Touch ID sensor, a security measure that prevents hackers from attaching fake Touch ID sensors to brute-force fingerprint authentication.
Is it even possible to crack a Touch ID iPhone?
Going one step further, let's ask the question: what if the shooter had had a newer iPhone? Building an iOS backdoor - as the FBI requires - would not be enough then, but is it even possible to crack the Secure Enclave? The answer is unclear. Apple does not provide details about the Secure Enclave to the public, but security expert Dan Guido suggests that Apple has changed passcode delay times on Touch ID phones in the past, which would be possible only if it could update the firmware of the Secure Enclave chip. Hence, if it were a newer iPhone (and, we bet, in the near future) the FBI would be asking Apple not only for an iOS backdoor, but for a separate Secure Enclave backdoor as well.
An unconstitutional order
The fight for consumer privacy has been going on for eons, but for the first time in recent history, we have a company of Apple's scale making such a bold move to protest the government's requests. The American Civil Liberties Union (ACLU) and the Electronic Frontier Foundation (EFF) have taken a firm stand, supporting Apple's position and the right to privacy. Cryptologists and national security experts have long held this position. Google's Sundar Pichai has expressed (lukewarm) support as well. Other high-profile figures, like WhatsApp chief Jan Koum, have also taken a stand with Apple. But it is shocking to see giants such as Facebook and Microsoft, to name a few, remain in worrying silence.
Admittedly, Apple has positioned itself as one of very few companies that put security at the forefront and make it a key brand value, but this is a fight about much more than just Apple.
"Code is speech, and forcing Apple to push backdoored updates would constitute “compelled speech” in violation of the First Amendment. It would raise Fourth and Fifth Amendment issues as well," the EFF adds. Yes, this would be in direct violation of the Constitution.
This is a battle between the world's biggest tech company and the world's most powerful government
What's really at stake? Put simply, law enforcement would typically request access to information with a warrant, but it cannot compel a company to change its product, as that would mean interfering in its business. This would be comparable to the FBI ordering carriers to start recording everyone's calls so that the FBI can listen in (currently, carriers retain only metadata such as the numbers dialed and the lengths of calls, not the actual recordings). That is the kind of precedent at stake.
The public backlash
Apple's was not an easy decision: the company stands firmly to protect users' privacy and security in a very sensitive terrorism case that populists can easily use to manipulate the debate and put the blame on Apple. The headlines do not disappoint:
"Apple chose to protect a dead ISIS terrorist’s privacy over the security of the American people," Sen. Tom Cotton says, while Sen. Dianne Feinstein is about to introduce a bill to force Apple to comply with the court order.
Trump and others have already started the smear campaign against Apple
Modern-day buffoons like Donald Trump have also quickly jumped in, in an attempt to reap political benefit from a nation hurt by gun violence. "Who do they think they are?" Trump throws a tantrum in front of the media, but fails to consider the implications of a backdoor for the privacy of millions of people.
Those reactions will only intensify as public figures try to reap the political dividends of a highly sensitive issue. It's commendable that Apple is taking a firm stand to protect users' privacy despite the very high likelihood that it will be bad-mouthed by influential public figures.
Conclusion: Here's why this is important
Finally, to wrap things up, let us repeat the main concerns around this unprecedented fight for people's privacy: if Apple is required to crack an iPhone for US law enforcement agencies, why shouldn't it do the same when the Chinese, Iranian or Russian governments request it?
If Apple provides code that allows the FBI to crack the iPhone 5c of the San Bernardino shooter, what guarantees are there that a malicious hacker won't some day get hold of that code and get the capabilities to break into millions of other iPhones?
Furthermore, after the Snowden revelations in 2013, what guarantees are there that our government itself won't hack into Americans' phones at will?
- Which side are you on?