Apple is looking really silly at the moment, right? After months of refusing to help the FBI unlock the iPhone of a suspected criminal, and calling those requests “a slippery slope” and “unconstitutional”, all it took was a week with some Israeli hackers-for-hire and bingo, problem solved. So what was all the fuss about?
In case you’ve been living under a rock, here’s a recap: the FBI wanted access to the contents of Syed Rizwan Farook’s iPhone. Farook, along with his wife, killed 14 people in a shooting spree in San Bernardino, California, in December 2015.
Unfortunately for the FBI, one of Apple’s new security features made the bureau’s existing data extraction techniques useless. The feature in question wipes a phone’s contents if an incorrect passcode is entered 10 times in a row. As such, the bureau could no longer use existing tools to make thousands of guesses at the passcode without destroying the very evidence it was seeking.
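To see why that matters, consider the numbers. Here is a quick back-of-the-envelope sketch in Python; the 10-attempt limit is the feature described above, while the four- and six-digit passcode lengths are simply the standard iPhone options, used here purely for illustration:

MAX_ATTEMPTS_BEFORE_WIPE = 10   # the auto-wipe limit described above

for digits in (4, 6):
    keyspace = 10 ** digits     # every possible numeric passcode of this length
    share = MAX_ATTEMPTS_BEFORE_WIPE / keyspace
    print(f"{digits}-digit passcode: {keyspace:,} combinations; "
          f"only {share:.4%} can be tried before the phone wipes itself")

Ten guesses out of ten thousand (or a million) is no brute force at all, which is precisely the point of the feature.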
Out of options, the FBI demanded that Apple build a backdoor giving it direct access to the data without needing the passcode. Apple refused, and so the FBI went public and started legal proceedings to compel Apple’s cooperation.
Then, on 21 March, the FBI abruptly postponed the case, and dropped it completely a week later. The reason given? “An outside party demonstrated to the FBI a possible method for unlocking Farook’s iPhone.” It has since emerged that the most likely outside party is Cellebrite, an Israeli firm specialising in mobile forensics and data extraction.
So, why did Apple make such a stink about the issue? For two reasons: principle, and public relations.
The principle is simple: if you build a master key for any operating system, you permanently and irrevocably compromise the security of that operating system. The FBI promised that the backdoor would only be used once and destroyed immediately thereafter. But that is both completely naive and entirely disingenuous.
The first problem is that, once a backdoor exists in code, it is effectively impossible to keep it private. One leak by a disgruntled or crooked law enforcement employee and it would be in the hands of every hacker on the planet in a week.
The nature of backdoors is that they do not care who is using them. They let in criminals just as easily as law enforcers. There are hundreds of millions of iPhones in use on the planet. Do you really want every petty crook to have complete access to your e-mails and bank details after stealing your phone?
The second problem is that the FBI (along with other law enforcers) would never be able to resist the urge to use the backdoor again. Already, the FBI has agreed to help prosecutors in Arkansas to unlock another iPhone. And it hasn’t even been a week!
But can’t criminals use Cellebrite’s method as well? Perhaps, but they would have to be extremely specialised, well-funded and well-motivated.
If speculation by security analysts is correct, then Cellebrite used a technique called “NAND mirroring”, in which the phone’s flash memory chip is physically removed and its contents copied, allowing investigators to make millions of passcode guesses and simply restore the copy before the wipe limit is ever triggered.
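For the technically curious, the idea can be modelled in a few lines of Python. This is a toy software simulation of the principle only, assuming the analysts’ speculation is broadly accurate; every name in it is invented for illustration and has nothing to do with Cellebrite’s actual tooling:

from itertools import product

WIPE_LIMIT = 10   # wrong guesses allowed before the phone erases itself

class SimulatedPhone:
    """A toy stand-in for the locked device; wrong guesses accumulate in 'flash'."""
    def __init__(self, passcode):
        self._passcode = passcode
        self.flash = {"failed_attempts": 0, "wiped": False}

    def try_passcode(self, guess):
        if self.flash["wiped"]:
            return False
        if guess == self._passcode:
            return True
        self.flash["failed_attempts"] += 1
        if self.flash["failed_attempts"] >= WIPE_LIMIT:
            self.flash["wiped"] = True    # the evidence is gone
        return False

def mirror_attack(phone):
    baseline = dict(phone.flash)          # "desolder" the chip and image it once
    for guess in ("".join(d) for d in product("0123456789", repeat=4)):
        if phone.flash["failed_attempts"] >= WIPE_LIMIT - 1:
            phone.flash = dict(baseline)  # rewrite the original image, resetting
                                          # the attempt counter before any wipe
        if phone.try_passcode(guess):
            return guess
    return None

print(mirror_attack(SimulatedPhone("7391")))   # finds "7391"; the phone never wipes

In hardware, of course, each “restore” means physically rewriting a flash chip rather than copying a Python dictionary.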
This technique is both very advanced and very expensive. Backyard hackers and criminals would not easily be able to replicate it, nor would they want to. Spending weeks hacking the phone of an ordinary person is not good business. Their data is not valuable enough.
And this is the core of the principle: the difference between an “official” backdoor and techniques like NAND mirroring is like the difference between intentionally leaving your wallet lying in the street, and it being stolen out of the safe in your hotel room as part of an inside job.
It’s interesting how swiftly the US government dropped its legal challenge. The case was about to set a vital precedent about the state’s powers to spy on its citizens and to compel companies to cooperate with its agents. I strongly suspect that the FBI is relieved to not have to open that Pandora’s box. Were it more confident in its case, I have no doubt that it would have proceeded.
But while the principle may be noble, Apple is first and foremost a business. There’s an enormous amount of free publicity to be had from simply refusing to cooperate with the FBI. After rampant abuses by America’s National Security Agency, Britain’s GCHQ and other state-sponsored snoops, people are increasingly angry with and suspicious of their governments.
Apple also has a lot to gain by appearing tough on privacy and security. When its iCloud platform was hacked in 2014, exposing the private photos of several celebrities, Apple lost a good deal of public trust and also a chunk of business. There’s nothing like a tussle with the big mean government to wash that inconvenient memory out of the collective consciousness.
The bottom line, though, is that Apple did the right thing here. The fact that it also happens to serve the company’s financial and PR interests does not make it any less right. — (c) 2016 NewsCentral Media