The Apple/FBI question is harder than it looks

Nothing about the Apple vs. the FBI showdown is as clear-cut as it initially appears.

There’s a way of telling the story that makes Apple sound completely unreasonable, and could even justify Donald Trump’s call to boycott the company: The FBI needs to get information off the iPhone of one of the San Bernardino terrorists (Syed Rizwan Farook), so that it can check whether there are additional conspirators or direct operational links to ISIS. The only damage in the FBI having that information is to the privacy of a dead terrorist. But Apple is fighting a court order that instructs the company to help the FBI, in a case that could well wind up at the Supreme Court. Senator Tom Cotton draws this conclusion:

Apple chose to protect a dead ISIS terrorist’s privacy over the security of the American people.

Sounds pretty bad. But that story falls apart in a bunch of ways. First, CNN’s national security analyst Peter Bergen argues that the information on that particular phone is probably not all that important.

What might be learned from Farook’s iPhone? Of course, we don’t know, but it’s likely that it wouldn’t be much beyond what we already know from the couple’s Facebook postings, their Verizon phone account, their computers seized by police, the evidence found at their apartment complex and the fulsome confession of their friend Enrique Marquez, who allegedly provided them with the rifles used in their massacre and also allegedly knew of their plans to commit a terrorist attack as early as 2012.

No evidence has emerged that Farook and his wife had any formal connection to a terrorist organization, and the plot involved only the couple and the alleged connivance of Marquez. What might be found on Farook’s iPhone therefore is more than likely simply only some additional details to buttress the overall account of what we know already.

Bergen thinks the FBI is pushing this case purely to establish a precedent for future cases. In public-relations terms, Farook is the least sympathetic target the FBI is likely to get, so why not have the public battle here?

He notes that Apple’s side of the argument is not so clear-cut either: Apple has cracked iPhones for the government many times in the past, and responds to court orders concerning iPhone data that has been backed up to iCloud. So what great principle are they standing on?

These revelations suggest the possibility that the facts of this particular case aren’t as important as the larger principles at stake and that both Apple and the U.S. government are using the San Bernardino case as something of a test of the question: Should tech companies give the FBI any kind of permanent backdoor?

And then things get technical: What’s different about this iPhone (as opposed to the ones Apple has previously made available to the government) is that it’s a more recent version, the 5C, whose security features Apple touted. So Douglas Rushkoff sums up what the FBI wants of Apple:

They’re saying, “We want you to reveal that the promise you made about this phone turns out not to be true.”

In an open letter to its customers, Apple emphasizes that it isn’t breaking faith with them:

For many years, we have used encryption to protect our customers’ personal data because we believe it’s the only way to keep their information safe. We have even put that data out of our own reach, because we believe the contents of your iPhone are none of our business.

Summing up a few of the technical details: Apple doesn’t have the information on Farook’s iPhone, doesn’t have his passcode, and doesn’t have a software tool that recovers the data without the passcode. What, then, could Apple do for the FBI? One security feature of recent iPhones is that the data on an encrypted phone is wiped if an incorrect passcode is entered 10 times in a row. This prevents breaking into a phone by what is called a “brute force” approach, where you connect the phone to another computer that just runs through all possible passcodes. (If we’re talking about the typical 4-digit iPhone passcode, that’s only 10,000 possibilities, which wouldn’t take very long. I’ve seen estimates varying from half an hour to an hour.)

What the court has ordered Apple to do is provide the FBI with what is basically a software patch to circumvent that auto-erase feature. Once they have that, the FBI can crack the phone.
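To put numbers on that brute-force scenario: once the auto-erase limit is out of the way, the math is simple. A minimal sketch, where the 80 ms per-attempt delay is an assumption (roughly the hardware-enforced passcode-check time Apple has described for recent iPhones); published estimates of the total vary with that figure:

```python
# Back-of-the-envelope estimate of brute-forcing a 4-digit iPhone passcode
# once the 10-try auto-erase feature is disabled. The per-attempt delay is
# an assumption; the hardware enforces a minimum time per passcode check.

PASSCODE_DIGITS = 4
ATTEMPT_SECONDS = 0.08  # assumed hardware-enforced delay per try

combinations = 10 ** PASSCODE_DIGITS          # 10,000 possible passcodes
worst_case = combinations * ATTEMPT_SECONDS   # try every single code
average_case = worst_case / 2                 # on average, half the space

print(f"worst case: {worst_case / 60:.1f} minutes")    # ~13.3 minutes
print(f"average:    {average_case / 60:.1f} minutes")  # ~6.7 minutes
```

Either way, the phone falls in minutes to hours, not years, which is why the auto-erase feature (not the passcode itself) is doing the real work.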

Apple’s response is that it has never written such software, and it doesn’t want to.

The FBI may use different words to describe this tool, but make no mistake: Building a version of iOS that bypasses security in this way would undeniably create a backdoor. And while the government may argue that its use would be limited to this case, there is no way to guarantee such control.

In other words, there won’t be any way to un-ring that bell: Once Apple has software that circumvents its security features, what happens to that software after the FBI has Farook’s data? At a minimum, it’s available to court orders in future cases. And if it’s available to American court orders, why couldn’t it be available to Chinese court orders? Or Iranian court orders? The principle that protects a terrorist today could protect a dissident tomorrow. And if Apple doesn’t stand on a principle, it becomes a kind of court itself, deciding case-by-case which governments deserve its help in which situations.

Worse yet, what happens to the security-circumventing software after this case? What if Apple’s internal security fails, and the software (or enough hints to allow some hacker to reproduce the software) gets out? It could even wind up in the hands of terrorists who decrypt information that helps them plan some future attack.

That’s how you wind up with a story where Apple is the hero: They’re bravely fighting to maintain our privacy. That’s how Edward Snowden put it in a tweet:

The @FBI is creating a world where citizens rely on #Apple to defend their rights, rather than the other way around.

But Douglas Rushkoff is skeptical of that story too.

It would be a mistake for people to think of this as “The People” against government security. That’s a ruse. Really, it’s the world’s biggest corporation versus the world’s most powerful military. That’s what we’re looking at.

And while I do believe that we people should defend our right to privacy, I don’t see the individual’s right to military-grade encryption. I see Visa companies, or Bank of America’s need to use it on my behalf, if Chinese hackers are using it to buy condoms on my Visa card…

For me to have something that the full focused attention of the Pentagon – which I’m sure is involved – and the FBI… To have something that they can’t break into… Imagine a real-world metaphor for that. “Oh, you’ve got a lock in your house that’s so powerful that if they brought the freakin’ army, and tanks, they couldn’t get in?”

There is certainly an economic angle here: The big tech companies — Apple, Google, Microsoft, etc. — were deeply embarrassed when Snowden revealed how complicit they all were in the NSA’s legally and morally dubious snooping on people who had done nothing to draw suspicion to themselves.

In that sense, Apple’s position (supported by Google and some other tech companies) is a sort of repentance: We have sinned in the past, but we have seen the light now and will sin no more. But the issue isn’t moral, it’s market-based: We need customers to believe we’re on their side, rather than the side of the government that wants to spy on them.

And finally, there’s a technological-inevitability angle on this: If more-or-less unbreakable encryption is possible at a price people are willing to pay, someone will provide it. (In response to Rushkoff: I don’t really need a lock and a door that tanks couldn’t break through, but if I could cheaply get one, it might be tempting.) If the U.S. government won’t let American companies provide those secure products, then they’ll be made in other countries.

So the United States can’t really stop that industry, it can just give it to some other country.

So that’s where I end up: siding with Apple in this specific case, but not making a hero out of Apple CEO Tim Cook. Right now, market forces put Apple on the side of personal privacy. Meanwhile, the FBI is trying to order the tide back out to sea. Law enforcement would do better to start adjusting to the future now.

DISCLAIMER: I don’t think this is affecting my view — I believe I’d feel the same way if Microsoft were taking a similar stand — but I should mention that I own Apple stock, as well as various i-gadgets. However, I am not currently using my iPhone’s encryption capabilities to hide any illegal activities.



  • Leon  On February 22, 2016 at 9:26 am

    A point of technicality: the FBI almost certainly has the ability to modify the iPhone software to circumvent the self-destruct feature. In fact, doing that is within the reach of anybody who has the contacts, money, and time to hire one or two decent reverse engineers.

    The problem is, the FBI doesn’t have the ability to load the modified software onto Farook’s iPhone, as doing this would require the modified software to be signed with Apple’s cryptographic keys, so that the iPhone doesn’t reject the “update” as some random malicious hacker.
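Leon's signing point can be sketched in miniature. Real iOS updates are verified with asymmetric (RSA) signatures checked against Apple's keys; the sketch below substitutes a symmetric HMAC purely to illustrate the accept/reject logic, and every key and string in it is hypothetical:

```python
# Toy illustration of why a modified update is rejected: the device only
# installs firmware whose signature verifies against the manufacturer's key.
# Real iOS uses asymmetric RSA signatures; the symmetric HMAC here is just a
# stand-in for the accept/reject logic. All keys and strings are made up.
import hashlib
import hmac

APPLE_SIGNING_KEY = b"apple-private-key"  # hypothetical; never leaves Apple

def sign_update(firmware: bytes, key: bytes) -> bytes:
    """Produce a signature for a firmware image using the given key."""
    return hmac.new(key, firmware, hashlib.sha256).digest()

def device_accepts(firmware: bytes, signature: bytes) -> bool:
    """The phone recomputes the signature and compares in constant time."""
    expected = hmac.new(APPLE_SIGNING_KEY, firmware, hashlib.sha256).digest()
    return hmac.compare_digest(expected, signature)

# An official update, signed with Apple's key, is accepted:
official = b"iOS update, signed by Apple"
print(device_accepts(official, sign_update(official, APPLE_SIGNING_KEY)))

# Anyone can *write* modified firmware, but without Apple's key the
# signature won't verify, and the phone rejects the "update":
modified = b"iOS update with auto-erase disabled"
forged = sign_update(modified, b"fbi-guess-at-key")
print(device_accepts(modified, forged))
```

This is why the court order runs through Apple at all: the bottleneck isn't writing the patch, it's signing it.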

    • Corey Fisher  On February 22, 2016 at 10:34 am

      Even if they don’t have a way to circumvent the self-destruct feature specifically, if they have a phone and they can’t break it as a nation-state attacker, they’re incompetent. They almost certainly have a huge collection of known bugs and exploits that work on all kinds of different phones. But that’s also going to be a major expense of time and manpower to keep up to date. The government is feeling the strain of attempting to keep up, and wants to make it so that they have easy access to phones. This isn’t about the ability to break phones in extreme cases, or public access to military-grade encryption – it’s about establishing precedent for the FBI to make it way easier for them to do it whenever they want.

      A note: Public access to good (“military-grade”) crypto is both impossible to control at this point and kind of beside the point – even if the crypto is good, a lot of the security protocols that implement it are flawed. In most cases, you never attack by breaking the crypto; you attack by finding flaws in the system it’s embedded in, or places in the system where crypto keys are leaked. To the point where “the crypto can’t be broken” is an extremely common assumption in security design.

  • Carl Kaun  On February 22, 2016 at 9:28 am

    There’s still more to the story, and it tilts things back toward the government’s position. Software on the iPhone is protected from hacking by an encryption key that Apple presumably has heavily protected, with the upshot that only Apple can change the software. Apple could in fact modify the software on Farook’s phone in a way that would work *only* on Farook’s phone, and that would disable the “ten tries then erasure” feature of that phone. No backdoor there, except for that single phone. And it looks like they could have done that quietly in response to the court order, and nobody would have known the difference. Unless, I guess, there really was a smoking bomb of information on the phone. So it really is more a political issue than a technological or precedent-building thing. Or maybe Apple is just tired of responding to government requests.

    • Tom Amitai (@TomAmitaiUSA)  On February 25, 2016 at 12:43 pm

      “Apple could in fact modify the software on Farook’s phone in a way that would work *only* on Farook’s phone, and that would disable the “ten tries then erasure” feature of that phone”

      Please link to a source for this “fact”.

      Here is a real, verifiable fact that supports the government’s case: the court order allows Apple to keep the phone in an Apple facility and merely allow the government to access it remotely.

      This would help somewhat in rebutting the argument that, once they write the software (assuming they can), the government will be able to use it as a “back-door” or “master key” in any future case it cares to. The government would, instead, have to get a court order in each case and pay Apple’s “expenses” in providing the service. This could provide Apple with a potentially lucrative new revenue stream, if it followed the example of defense contractors. That level of service would surely be more expensive than a toilet seat or a hammer!

  • lonemtn  On February 22, 2016 at 9:38 am

    I thought I heard that the phone is a county-owned phone. Apple sold the county a phone with that encryption feature, and the county then let the employee use it without requiring a passcode the county controlled? Apple sold a product with that feature, and the county purchased it, agreeing to the product feature. So the county (a government agency!) agreed to the contract. I’d say the county failed to behave responsibly.

  • joeirvin  On February 22, 2016 at 9:51 am

    We are already living in a police state so does all of this really matter?

    • Philippe Saner  On February 22, 2016 at 10:21 am

      Yes, it really matters.

      America has a long way to go before it becomes North Korea or even China, and cases like this are part of the effort to keep it from going that way.

  • Chris Tierney  On February 22, 2016 at 11:17 am

    I’m not familiar with Rushkoff, but I think that interview with him illustrates one of the particular difficulties about this case—that a lot of people’s intuitions (and a lot of our laws, for that matter) aren’t really caught up with the technology we’re all using on a daily basis.

    So Rushkoff feels like individuals shouldn’t need “military-grade” crypto like banks do, just as our bank vaults need stronger locks than our houses. But this intuition makes no sense with the way the technology actually works—you can only do online banking, or online shopping, or anything with the little lock icon in your browser, safely if both your computer and the bank’s servers know how to do the same encryption.

    And Rushkoff feels like a powerful federal agency ought to be able to break into the phone on their own, or they’re incompetent—but in reality, unless or until the government develops quantum computers and keeps them a secret, there’s nothing special about their computers that should make them able to break in. If the FBI could break the encryption by throwing lots of computing power at it, then so could anybody in the world with a credit card and an AWS account.

    The internet, and all the many aspects of modern life that are online now, rely completely on strong crypto to function. And strong crypto completely ceases to function if a) only some people are allowed to use it or b) it is intentionally weakened “just this once”. But it’s challenging to explain intuitively why this is so, and so it’s unfortunately easy to frame this as “Apple is protecting ISIS.”
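    The “anybody with an AWS account” point comes down to arithmetic: raw computing power closes the gap on a 4-digit passcode but makes no dent in a 256-bit key. A rough sketch, assuming a generous (hypothetical) trillion guesses per second:

```python
# Why "throw more computing power at it" fails against strong crypto:
# compare exhausting a 4-digit passcode space with exhausting a 256-bit
# key space, at an assumed (very generous) trillion guesses per second.
GUESSES_PER_SECOND = 10 ** 12

pin_space = 10 ** 4   # 4-digit passcode: 10,000 possibilities
key_space = 2 ** 256  # AES-256 key: ~1.16e77 possibilities

pin_seconds = pin_space / GUESSES_PER_SECOND
key_years = key_space / GUESSES_PER_SECOND / (365 * 24 * 3600)

print(f"4-digit PIN: {pin_seconds:.0e} seconds")  # effectively instantaneous
print(f"256-bit key: {key_years:.1e} years")      # ~3.7e57 years
```

    The universe is about 1.4e10 years old, which is why attackers go after passcodes, backups, and implementation flaws rather than the cipher itself.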

    • weeklysift  On February 22, 2016 at 3:18 pm

      Rushkoff is a medium-popularity author who specializes in speculating about the effects of technological change on society.

    • Corey Fisher  On February 23, 2016 at 9:09 am

      A nation-state level attacker is actually almost certainly going to be able to break a phone – not because they have quantum computers they’re keeping secret, but because they have known exploits they’re keeping secret. Breaking crypto is hard, but crypto is embodied in various primitives that are then put into security systems – and unlike the crypto, we usually don’t have mathematical guarantees that these security systems are solid. Nation-states can leverage a lot of resources and manpower towards discovering exploits through research or paying attention to what black-hats are doing that they can then try to use on their enemies.

      (To put it in intuitive terms: your unpickable diamond lock doesn’t help you if the door’s hinges happen to be made of crappy plastic. And while not every door will be, they’re probably going to have a tiny imperfection somewhere…)

  • Rob  On February 22, 2016 at 12:08 pm

    Also don’t forget the angle that the county (which owns the phone) apparently reset the iCloud password at the behest of the FBI (the county claims this, and the FBI denies it), which broke the existing tools the FBI had for getting the data, thus “forcing” them to seek the precedent-setting exception.

    One more thing to consider: the FBI is asking Apple to create new software. This might or might not be possible. What precedent does it set if it isn’t actually possible for them to create the software? Does that make them in contempt? And what about the next time the government wants something: can it use this new precedent to force a tech company (or any other company) to create something it wants, even if that isn’t possible?

    • Anonymous  On February 22, 2016 at 1:21 pm

      And even if it’s possible, who pays for it?

  • coastcontact  On February 22, 2016 at 2:19 pm

    Most experts have said there is a high likelihood of another terrorist attack. If that attack could be stopped by having access to Syed Farook’s phone, there is ample justification; so what is Apple’s? Apple could simply take that phone into a secured room and open it for the FBI. Apple may be using this argument as an advertising tool: “Our phones are so secure even the FBI cannot obtain access.”

    • Jacquie Mardell (@jacquiemardell)  On February 22, 2016 at 3:12 pm

      But no one has said that the keys to the next attack are to be found on this specific phone. As Bergen points out, the FBI already has most of the available info anyway from email, FB, phone records, previous iCloud backups, etc.

    • weeklysift  On February 22, 2016 at 3:15 pm

      My guess is that both sides like the optics of the current conflict. The FBI is getting tough on terrorism and Apple is defending their customers’ privacy. Win/win.

  • Marty  On February 22, 2016 at 4:12 pm

    There is one detail in software that you are forgetting: costs are always coming down, quickly. If you knew that the local gang was likely to be driving tanks in three to five years, that tank proof door starts to sound like a really good idea. (Yes, today’s cyber criminals are driving around in what would have been considered tanks five years ago.)

  • Anonymous Poster  On February 22, 2016 at 6:07 pm

    I’m someone with no loyalty to Apple at all. I’ve never owned an Apple product, and if I have my way, I never will. With that said: I side with Apple on this one, simply because any precedent set against Apple in this case would be used to “convince” other tech companies into doing the same thing that the FBI wants Apple to do in this case.

  • Abby Hafer  On February 22, 2016 at 8:36 pm

    The usual argument for letting the government do extreme things is the “ticking time bomb” scenario. That is, the government must have the information it wants, or people will surely die. That just isn’t the case here. The crime has already been committed, and the shooters are dead. There’s no ticking bomb. Likewise, for all that terrorist attacks are scary, they’re also highly improbable. So if the FBI is protecting us this way, it really isn’t protecting us from very much. On the other hand, if Apple builds a “back door” into its phones, it is certain that hackers, including the Chinese government, the Iranian government, and various criminals, will manage to get into Apple’s phones.

    I think it’s pretty easy to make a case based on probability. Like this: What is the probability of being killed in a terrorist attack? Answer: nearly zero. What is the probability of the Chinese government getting into an Apple phone, if Apple builds in a back door? Answer: nearly 100%. So making Apple build in a back door would give many criminals and unfriendly governments the ability to get into your phone, and it wouldn’t make you any safer from terrorists.

    I should also add that we’ve had another shooting since the one in San Bernardino. But it was done by a white guy, so it isn’t considered a terrorist threat, even though the victims are just as dead. Is the FBI demanding a way into the Michigan shooter’s phone? Just asking.

  • Vanja Pejovic  On February 23, 2016 at 12:35 am

    A more technical summary:

