Apple was right to refuse to help the FBI

Apple is looking really silly at the moment, right? After months of refusing to help the FBI unlock the iPhone of a suspected criminal, and calling those requests “a slippery slope” and “unconstitutional”, all it took was a week with some Israeli hackers-for-hire and bingo, problem solved. So what was all the fuss about?

In case you’ve been living under a rock, here’s a recap: the FBI wanted access to the contents of Syed Rizwan Farook’s iPhone. Farook, along with his wife, killed 14 people in a shooting spree in December 2015.

Unfortunately for the FBI, one of Apple’s new security features made the bureau’s existing data extraction techniques useless. The feature in question wipes a phone’s content if the incorrect passcode is entered 10 times in a row. As such, the bureau could no longer use existing tools to make thousands of guesses at the passcode without destroying the very evidence it was seeking.
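
To make the mechanism concrete, here is a minimal sketch of that policy. It is an illustration only, not Apple’s implementation: the real feature is enforced inside iOS and the device’s secure storage, and the LockedDevice class and every name below are invented for this article.

```python
# Toy model of the "wipe after 10 failed passcodes" policy. Purely
# illustrative; on a real device it is the encryption keys that are
# erased, which renders the still-present data unreadable.

MAX_FAILURES = 10

class LockedDevice:
    def __init__(self, passcode: str):
        self._passcode = passcode
        self._failures = 0
        self._wiped = False

    def try_unlock(self, guess: str) -> bool:
        if self._wiped:
            raise RuntimeError("device wiped: encryption keys destroyed")
        if guess == self._passcode:
            self._failures = 0
            return True
        self._failures += 1
        if self._failures >= MAX_FAILURES:
            self._wiped = True  # tenth straight failure: keys erased
        return False
```

A four-digit passcode has only 10,000 possibilities, so without this feature a brute-force search finishes in minutes; with it, an attacker gets ten tries and then nothing.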

Out of options, the FBI demanded that Apple build a backdoor allowing it to access the data directly, without a passcode. Apple refused, and so the FBI went public and started legal proceedings to compel Apple’s cooperation.

Then, on 21 March, the FBI abruptly postponed the case, and dropped it completely a week later. The reason given? “An outside party demonstrated to the FBI a possible method for unlocking Farook’s iPhone.” It has since emerged that the most likely outside party is Cellebrite, an Israeli firm specialising in mobile forensics and data extraction.

So, why did Apple make such a stink about the issue? For two reasons: principle and public relations.

The principle is simple: if you build a master key for any operating system, you permanently and irrevocably compromise the security of that operating system. The FBI promised that the backdoor would only be used once and destroyed immediately thereafter. But that is both completely naive and entirely disingenuous.

The first problem is that, once a backdoor exists in code, it is effectively impossible to keep it private. One leak by a disgruntled or crooked law enforcement employee and it would be in the hands of every hacker on the planet in a week.

The nature of backdoors is that they do not care who is using them. They let in criminals just as easily as law enforcers. There are hundreds of millions of iPhones in use on the planet. Do you really want every petty crook to have complete access to your e-mails and bank details after stealing your phone?

The second problem is that the FBI (along with other law enforcers) would never be able to resist the urge to use the backdoor again. Already, the FBI has agreed to help prosecutors in Arkansas unlock another iPhone. And it hasn’t even been a week!

But can’t criminals use Cellebrite’s method as well? Perhaps, but they would have to be extremely specialised, well-funded and well-motivated.

If speculation by security analysts is correct, then Cellebrite used a technique called “NAND mirroring”, in which a memory chip is physically removed from the phone and its contents cloned, so that the phone can be tricked into accepting thousands of passcode attempts without any risk of wiping itself.
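
Reusing the toy LockedDevice sketch above, the logic of such an attack might look like the following. The genuinely hard part in reality is physically removing and cloning the flash chip; the code merely shows why restoring a saved copy of the chip’s state defeats the attempt counter.

```python
import copy

def nand_mirror_attack(device, candidates):
    """Conceptual sketch of NAND mirroring: snapshot the device state,
    spend up to nine guesses, then "re-flash" the snapshot to reset
    the failure counter before the wipe can ever trigger."""
    snapshot = copy.deepcopy(device.__dict__)  # stands in for cloning the chip
    since_restore = 0
    for guess in candidates:
        if device.try_unlock(guess):
            return guess
        since_restore += 1
        if since_restore == MAX_FAILURES - 1:
            device.__dict__ = copy.deepcopy(snapshot)  # restore before wipe
            since_restore = 0
    return None

# Trying every four-digit passcode without ever triggering the wipe:
# nand_mirror_attack(LockedDevice("7531"), (f"{n:04d}" for n in range(10000)))
```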

Apple CEO Tim Cook said recently that “people in the US and around the world deserve data protection, security and privacy”. Image: iphonedigital

This technique is both very advanced and very expensive. Backyard hackers and criminals would not easily be able to replicate it, nor would they want to. Spending weeks hacking the phone of an ordinary person is not good business. Their data is not valuable enough.

And this is the core of the principle: the difference between an “official” backdoor and techniques like NAND mirroring is like the difference between intentionally leaving your wallet lying in the street, and having it stolen out of the safe in your hotel room as part of an inside job.

It’s interesting how swiftly the US government dropped its legal challenge. The case was about to set a vital precedent about the state’s powers to spy on its citizens and to compel companies to cooperate with its agents. I strongly suspect that the FBI is relieved to not have to open that Pandora’s box. Were it more confident in its case, I have no doubt that it would have proceeded.

But while the principle may be noble, Apple is first and foremost a business. There’s an enormous amount of free publicity to be had from simply refusing to cooperate with the FBI. After rampant abuses by America’s National Security Agency, Britain’s GCHQ and other state-sponsored snoops, people are increasingly angry with and suspicious of their governments.

Apple also has a lot to gain by appearing tough on privacy and security. When its iCloud platform was hacked in 2014, exposing the private photos of several celebrities, Apple lost a good deal of public trust and also a chunk of business. There’s nothing like a tussle with the big mean government to wash that inconvenient memory out of the collective consciousness.

The bottom line, though, is that Apple did the right thing here. The fact that it also happens to serve the company’s financial and PR interests does not make it any less right.  — (c) 2016 NewsCentral Media

18 Comments

  1. Greg Mahlknecht

    This article relies a little too much on Apple PR for facts, and not enough on the actual realities. The mistake that most reports and analyses make is thinking this would be a backdoor that could make its way to the public. The reality, however, is that the FBI was more than happy for it to be a once-off thing inside Apple Labs, in an Apple-controlled environment, and it certainly wasn’t “access to the data directly without using a passcode”.

    The notion that it might leak, given that Apple was controlling the environment, is naive. If the department in Apple that holds the signing keys for iOS were to be compromised, what they did for the FBI would be the least of their worries! If you take Cook’s comments on the matter at face value, he’s saying he’s worried that the iOS source code and signing keys will get leaked.

    The net effect is that, instead of Apple having total control over the whole narrative, there is now a well-publicized known vulnerability out there that Apple has no control over.

    >It’s interesting how swiftly the US government dropped its legal challenge.

    Not really. The original court case had the FBI wanting help from Apple because it couldn’t unlock the phone. The phone was then unlocked, so the case became redundant. If the Cellebrite theory is correct, NAND mirroring is an intrusive technique, and the FBI wouldn’t have been able to hide the fact that its argument was null and void.

    I believe that Apple was right to resist the FBI on the grounds of the precedent it would set, but they did this 100% for PR (as the security concerns are BS), with a very unconvincing straw man, and it did not end well for them: they lost this battle big time.

  2. Alistair Fairweather

    Hi Greg,

    Incisive feedback, as usual, thanks. We’ll have to agree to disagree on the chances of a backdoor making its way into the public domain. Information wants to be free, and that kind of master key would have a particular yearning for the wild. And then there’s the fact that the FBI would *definitely* have kept requesting help every time they hit a new snag. That’s not a maybe. And the more keys there are, the more opportunities there are for them to go missing.

    I also disagree on the dropping of the case. This was an ideal opportunity for the DOJ and the FBI to set a precedent. Apple and other device manufacturers will find ways to protect against NAND mirroring, and then they will be back to square one. I suspect they are nervous of testing the All Writs Act because it is such a vital cog in the machinery that allows them to compel companies and individuals to cooperate with surveillance orders. Had the case gone the wrong way, they stood to lose existing powers, not just fail to extend those powers.

    But yes, PR played a big, big role. And yes, I may have bought into their narrative to a degree. But, as we both agree, it’s ultimately a good thing that Apple did not cave.

  3. Greg Mahlknecht

    >and that kind of master key would have a particular yearning for the wild

    That master key exists today in the same rooms that would have been used for the FBI case, and handled by the same people. I don’t see how this would have changed anything.

    >And then there’s the fact that the FBI would *definitely* have kept requesting help every time they hit a new snag

    This we can agree on, but you can’t really argue that legally, which is why Apple battled to find an argument.

    > I also disagree on the dropping of the case

    I don’t see how the case could have gone on if the FBI were asking to unlock an already unlocked or hacked phone. If they’d somehow won, and taken the phone to Apple with the chip soldered back on or the firmware blown open, they’d have been a total laughing stock and would have made a mockery of the legal system. The FBI is also bound by laws.

  4. The master key doesn’t exist; it would have to be written by Apple’s engineers.
    It would be interesting, though: could a court compel a company to write software?

  5. The unlock precedent is one thing.
    There is another matter.

    Why should any state be able to conscript a company and its staff to do its bidding, especially when that bidding will do harm to the company?
    Conscripting a company (without the authority to do it) is bad enough.
    Conscripting them to harm themselves… is far worse

    Ordering Apple to provide information out of iCloud = No problem with the right warrant.

    Ordering Apple to do Engineering development — No way

  6. Greg Mahlknecht

    I’m not sure you understand what the “master key” is. It’s the key used to sign iOS firmware. It exists. That’s how every version of iOS is and has always been signed – without it the device won’t boot the operating system.

    The actual changes to iOS to remove the restrictions and allow a brute-force hack of the phone are trivial. It’s getting the modified firmware onto the device and executing it that is the difficult part, and for this it needs to be signed with the key.
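
    Roughly, the boot-time check amounts to this kind of toy sketch (an HMAC stands in for Apple’s real public-key signature scheme, and every name here is made up):

    ```python
    import hashlib
    import hmac

    SIGNING_KEY = b"held-only-inside-apple"  # placeholder for the real key

    def sign_firmware(image: bytes) -> bytes:
        return hmac.new(SIGNING_KEY, image, hashlib.sha256).digest()

    def boot(image: bytes, signature: bytes) -> None:
        expected = hmac.new(SIGNING_KEY, image, hashlib.sha256).digest()
        if not hmac.compare_digest(expected, signature):
            raise RuntimeError("refusing to boot: signature check failed")
        print("booting:", image.decode())

    official = b"iOS 9.3"
    boot(official, sign_firmware(official))  # boots fine

    patched = b"iOS 9.3 with the retry limit removed"
    try:
        boot(patched, sign_firmware(official))  # old signature, new image
    except RuntimeError as err:
        print(err)  # without the key, modified firmware won't run
    ```

    The patch itself is easy; getting a device to accept and run it without that key is the hard part.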

    If the FBI were asking for this key to be given out to them, or for the signed firmware to be handed over so they could install it on as many devices as they want, I’d agree with Alistair. But this isn’t what they were asking. This is the straw man Cook has set up and successfully sold to many people.

  7. Greg Mahlknecht

    Companies have to do special engineering development all the time to comply with laws: from changing the way things are logged, to building or changing reports on certain things. From a programming point of view, there’s nothing special to see here.

    If Tim Cook mistakenly forgot his passcode and locked himself out of his phone, he’d pop down to engineering, throw it on someone’s desk, and they’d do exactly the kind of hack proposed here and have it unlocked by lunch.

    I agree with Apple’s views on the matter. I’m on their side; I just think their legal team failed miserably at their job and went to court with arguments that would never have stood up to competent expert witnesses. They should be glad the case was dropped, as the issue is sure to come up again, and they can learn from their mistakes here and come back with a sounder argument next time.

  8. No, I understand. Sorry: the key alone exists, the software doesn’t.
    How do you know it’s trivial to rewrite the software?
    Regardless, even if it were five minutes’ work, do you believe a court has the right to order a developer to write it?

  9. But the FBI said it’s “just about one phone” – if it becomes law, then it’s definitely not just about one phone; it’s technically a backdoor.
    If Apple (and Samsung etc) had to comply in order to sell their devices in the US, what’s stopping China from putting the same laws in place?
    Apple’s PR campaign and the lawsuits were different things. The PR campaign (particularly Cook’s cancer comment) wasn’t great, but from what I read their legal stance was pretty solid.

  10. Greg Mahlknecht

    >How do you know it’s trivial to rewrite the software

    They need to remove a delay and a retry check. I can’t think of a way you could make this a difficult task in code. You could obfuscate it with fancy methods, like the copy protection of the DOS days, but that’s bad security – keeping it simple and securing iOS via a signature is the right way to do it, and that appears to be how Apple have done it.
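
    In terms of the toy LockedDevice sketch in the article above, the whole change would amount to something like this (hypothetical, obviously; nobody outside Apple has seen the real code):

    ```python
    class PatchedDevice(LockedDevice):
        """Hypothetical "FBI build": the passcode comparison is unchanged,
        but the failure counter and the auto-wipe (and, on a real device,
        the escalating delays) are simply never applied."""

        def try_unlock(self, guess: str) -> bool:
            return guess == self._passcode  # no delay, no counter, no wipe
    ```

    The code change is a few lines; the protection lies entirely in the fact that the device refuses to run anything Apple hasn’t signed.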

    >do you believe a court has the right to order a developer to write it

    If they can prove the law allows them access to the information on the device, then yes, though perhaps they should pay for it (in my ISP days I had to develop software to extract data from logs to comply with a court order; it’s actually a common thing). Unfortunately the conversation has been sidetracked by all these misguided technical arguments.

  11. Greg Mahlknecht

    > If Apple (and Samsung etc) had to comply in order to sell their devices in the US

    Comply with what exactly? Providing forensic help to the government? That’s an entirely different conversation to the one we’re having right now.

    So long as the devices are under Apple’s control in a controlled environment, and the firmware is installed temporarily to brute-force the passcode, this is not a public backdoor.

  12. I think we’re going to have to agree to disagree here, but I believe the “so long as the devices are under Apple’s control in a controlled environment” argument is naïve; there is no way that software would remain private. But that’s a matter of opinion, not fact.

  13. Greg Mahlknecht

    >is naïve; there is no way that software would remain private. But that’s a matter of opinion, not fact.

    Ummmm… you ARE aware that iOS is in this same controlled environment and has so far remained private? If you truly believe that Apple can’t protect its controlled environments, then you have to accept that the main iOS code and related keys will leak at some point, and then this FBI issue will be the least of everyone’s worries. And that is a matter of fact, not opinion.

  14. Actually, constitutional experts said that the FBI didn’t have a prayer of conscripting Apple to do engineering work.

    The law the FBI relied on (the All Writs Act of 1789) to try to force conscription failed in a similar bid in New York.

    It wasn’t going to work… Congress would need to pass laws to make it happen.
    And that’s the way it should be, because such things should be debated in Congress.

    In my opinion it is the FBI that screwed up

  15. Greg Mahlknecht

    I dunno – I’ve read the same stuff as everyone else on this, and Apple’s “unreasonable burden” and “we don’t know how to do it” arguments against the All Writs order are almost laughable to anyone with knowledge of the technical subject matter. For a company of Apple’s size, the burden is insignificant. In fact, about 90% of what Apple calls “too burdensome” (a device that emulates a keyboard typing 0000 to 9999 over Bluetooth or plugged into the phone) you can buy online for $350. Maybe next time the FBI will hire competent lawyers who will go into the courtroom with one of those 🙂

    I agree 100% with Apple’s larger argument, but as far as I know “government MIGHT ask us to do X, Y, Z at some point in the future” doesn’t have legal standing.

    > In my opinion it is the FBI that screwed up

    Maybe they both screwed up, but Apple suffers more – the light has been shone brightly on iOS security, and we now know that if governments want to crack iPhones, it’s not a problem: just drop some taxpayer money on a firm in Israel and it’ll happen.

    I wouldn’t underestimate the damage to Cook’s reputation – techies now know he’s a world-class BS’er. He didn’t fight this with facts and principle; he just played on fears and used straw men as a PR exercise. That doesn’t sit well with many people.

    I really wanted Apple to win this decisively on privacy protection grounds and set a precedent, but was disappointed the way it was handled.

  16. I remain unconvinced…

    The FBI wanted this precedent. Conversely, they didn’t want to be ruled against either, as that would set the reverse precedent.

    So why not stay the course unless they believed they would lose?

    I too wanted the precedent for privacy, but the FBI withdrew before it really got to court… You can’t blame Apple for that.

    Apple hadn’t begun to fight. These things take ages and you don’t play your whole hand early on.

  17. And in the good old US of A, this would have been a very public trial, so the loser, whichever way it went, stood to lose much more than just the trial.