Do We Want An FBiOS?

Super Cooper (our new dog – now one year old) woke me up at 4:45 this morning so I got up, let him out, got a cup of coffee, sat down in front of my computer, and spent the next hour going down the rabbit hole of the FBI / Apple phone unlock backdoor encryption security controversy.

After an hour of reading, I feel even more certain that Apple is totally in the right and the FBI’s request should be denied.

The easiest-to-understand argument is Bruce Schneier's in the Washington Post, titled Why you should side with Apple, not the FBI, in the San Bernardino iPhone case. His punch line is extremely clear.

“Either everyone gets security or no one does. Either everyone gets access or no one does. The current case is about a single iPhone 5c, but the precedent it sets will apply to all smartphones, computers, cars and everything the Internet of Things promises. The danger is that the court’s demands will pave the way to the FBI forcing Apple and others to reduce the security levels of their smart phones and computers, as well as the security of cars, medical devices, homes, and everything else that will soon be computerized. The FBI may be targeting the iPhone of the San Bernardino shooter, but its actions imperil us all.”

Given that the law being used to try to compel Apple to do this is based on the All Writs Act of 1789, precedent matters a lot here. And, if you thought the legal decisions from 1789 anticipated the digital age, please fasten your seat belts – or maybe even encase yourself in an impermeable bubble – for the next 50 years as it’s going to get really complicated.

Once I got past the advocacy articles like Why Apple Is Right to Challenge an Order to Help the F.B.I. in the NY Times, I read a few of the “what is really going on here” articles like Wired’s Apple’s FBI Battle Is Complicated. Here’s What’s Really Going On. The context was starting to repeat, so I got it at a high level but wanted to understand the technical underpinnings of what was happening.

Since it's the Internet, it was pretty easy to find that. The Motherboard article Why the FBI's Order to Apple Is So Technically Clever was a good start. Dan Guido's Trail of Bits post Apple can comply with the FBI court order was super interesting since he focused only on the technical aspects, on feasibility rather than getting tangled up in whether it's a good idea or not. Ars Technica has an article that adds a little in Encryption isn't at stake, the FBI knows Apple already has the desired key.

I then wandered around a few random articles. Troy Hunt's Everything you need to know about the Apple versus FBI case has some great historical context and then unloads with current activity, including Google CEO Sundar Pichai's Twitter support of Apple / Tim Cook. I ended with Why Tim Cook is wrong: A privacy advocate's view.

Interestingly (and not surprisingly), the titles don't reflect the actual critical thinking you can derive from reading the articles, so just scanning headlines creates huge bias, whether you realize it or not. When I started reading the various articles, I immediately forgot most of the headlines and was surprised by some of them when I put this post together, since the headlines didn't always correspond with the message of the article.

I expect there will be lots of additional commentary on this as it continues to unfold in court and in public view. What I read, and thought about this morning, reinforced my view that I’m very glad Tim Cook and Apple are taking a public – and loud – stand against this. We are going to have to deal with stuff like this with increasing frequency over the next 50 years and what happens in the next decade is going to matter a lot.

  • Not sure if you've come across this article yet. What are your thoughts on it? Feasible?

    • Sam

      Setting aside the feasibility question, I find it bizarre that a guy who aspires to the Libertarian Party presidential ticket is offering to hack an encrypted phone for the FBI. That goes against pretty much everything I thought the Libertarian Party stood for.

    • Donald Trump should sign up McAfee to be his Chief of Staff.

  • It is one of those cases where the short term gain does not outweigh the long term cost. I FIRMLY believe that the longer the time horizon you use to evaluate decisions, the better decisions you make. When I look at this in the long term view, it is clear to me that Apple is doing the right thing.

    • Yup. And that’s the position I think Cook is taking.

  • khill

    I’m curious why you think the FBI’s request should be denied.

    My initial reaction was to completely agree with Apple in this case. After reading some of the articles you've linked, though, I'm starting to reconsider. Basically, because of the hardware signing keys used by Apple, it already has the power to circumvent (not break) the encryption on any device, and it can target that code to arbitrary levels of specificity as to which devices can be accessed (a rough sketch of the idea follows below). Having that power but choosing not to exercise it doesn't seem compatible with the Rule of Law.

    I can certainly understand why Apple doesn't want to comply, both for fair and socially beneficial reasons (e.g., not wanting to comply when the Chinese government asks for similar firmware for human rights activists' phones) and for business reasons, in wanting to maintain a model where the phone is both secure from outside threats and still ultimately controlled from Cupertino. But I don't see either of those arguments winning in a court of law.
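
A minimal sketch of the idea khill describes above: firmware is accepted only if signed by the vendor, and a signed image can name a single device so it runs nowhere else. This is purely illustrative Python with made-up names; real iOS code signing uses asymmetric keys and a very different format, and nothing here is Apple's actual mechanism.

```python
import hmac, hashlib, json

VENDOR_SIGNING_KEY = b"hypothetical-vendor-secret"  # stand-in for the vendor's real (asymmetric) signing key

def sign_firmware(image: bytes, target_device_id: str) -> dict:
    """Vendor side: produce a manifest that binds a firmware image to exactly one device ID."""
    manifest = {"device_id": target_device_id,
                "image_hash": hashlib.sha256(image).hexdigest()}
    payload = json.dumps(manifest, sort_keys=True).encode()
    manifest["signature"] = hmac.new(VENDOR_SIGNING_KEY, payload, hashlib.sha256).hexdigest()
    return manifest

def boot_check(image: bytes, manifest: dict, my_device_id: str) -> bool:
    """Device side: run the image only if the signature verifies AND it names this device."""
    payload = json.dumps({"device_id": manifest["device_id"],
                          "image_hash": manifest["image_hash"]}, sort_keys=True).encode()
    expected = hmac.new(VENDOR_SIGNING_KEY, payload, hashlib.sha256).hexdigest()
    return (hmac.compare_digest(expected, manifest["signature"])
            and manifest["device_id"] == my_device_id
            and manifest["image_hash"] == hashlib.sha256(image).hexdigest())

weakened_build = b"firmware image with passcode-guess limits removed"
signed = sign_firmware(weakened_build, target_device_id="PHONE-UNDER-WARRANT")

print(boot_check(weakened_build, signed, "PHONE-UNDER-WARRANT"))  # True: runs on the one named phone
print(boot_check(weakened_build, signed, "SOME-OTHER-PHONE"))     # False: rejected everywhere else
```

The point is only that signing plus a device-ID check would let a vendor produce a weakened build that is useless on any phone other than the one named in the order.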

    • khill

      Looks like I should work at the DoJ

      “Apple has attempted to design and market its products to allow technology, rather than the law, to control access to data which has been found by this Court to be warranted for an important investigation. Despite its efforts, Apple nonetheless retains the technical ability to comply with the Order, and so should be required to obey it.”

  • I think a big thing that people are missing is that Apple is challenging the request. If the Supreme Court says they must do it and they don't, that would be denying the request. I don't know the ins and outs of the legal process, but it seems to me people are upset that Apple didn't just comply.

    I agree with Apple’s position, but even more than that I applaud their decision to not just quietly comply with something they strongly disagree with.

    The U.S. was not built by people who just said, "OK, I'll do whatever authorities say without question."

    • Extremely well said.

  • Kyle Reese

    I think one of the key words in the 4th Amendment is "unreasonable." The Constitution protects Americans against unreasonable searches and seizures. If a court-ordered warrant is not reasonable, then what is? Choosing not to comply with a request from the FBI is one thing, but not complying with a court-issued warrant is another. This isn't even coming from a FISA court! The big question someone has to answer for me is what constitutes an unreasonable search. The Constitution clearly doesn't provide us with absolute privacy.

  • RBC

    I just found this article, which I think you’ll enjoy.

    • Well – that made me smile.

  • Jay DeVivo

    I certainly understand the privacy and security concerns raised, but after reading the linked articles (Wired was particularly helpful), I find Bruce Schneier's argument that "Either everyone gets security or no one does" to be a false dichotomy.

    We must balance privacy and security concerns (particularly those related to the burgeoning Internet of Things) with important principles (like justice). It seems like it is time we begin a dialogue on a national protocol of sorts for when and how encryption can be circumvented.

    Knowing nothing about encryption, hardware design, software design, the nuanced legal arguments of either side, or the ideal gas law, I submit, with a misplaced confidence rivaled only by your brother-in-law's surety in Donald Trump's ability to "Make America Great Again," the following three suggestions on what such a protocol might include:

    1. To address risks to civil liberties, I would suggest that encryption only be circumvented in very limited situations, such as cases of suspected terrorist plots, murder, or child exploitation.

    2. To prevent “scope creep” the information obtained should only be permitted to be used in the prosecution of crimes where circumvention is permitted. For example, if the government suspects someone is planning a terrorist attack, but the evidence collected instead suggests a planned bank heist, that evidence cannot be used to prosecute a bank robbery (planned or completed).

    3. There must be rules for who can circumvent encryption, how it can be done, and where it can be performed, to prevent these techniques from being used in unauthorized ways. It would seem the big and developing risk here would be attacks on IoT devices. One requirement should be a direct, hard-wired connection to the device being "busted," so that the crippled firmware is never transmitted across the Internet or an intranet.

  • Most discussion of this matter is so technically ill informed that it doesn’t do justice to the issue.

    Let’s conduct a thought experiment to explore the bounds of the problem.

    Right now there are means by which Apple can access some data on some versions of iPhones, but…

    …let’s imagine that Apple builds a version of the iPhone and iOS which has no security loophole whatsoever that can be exploited to ‘open’ the phone without destroying the coveted data stored on the phone. Even encrypted data will assuredly be destroyed.

    This is a perfectly feasible scenario. And if such phones were built and sold, court orders such as the one under discussion today would be moot. Apple could simply and truthfully reply that they could not help (a rough sketch of such a design follows at the end of this comment).

    Now let’s examine what this means for the privacy vs security debate.

    Security would be more assured than ever before. So those on that side of the fence should be happy, though they may well have twinges of concern about paedophile rings escaping justice.

    But those who would like to be able to access citizens' data under specified circumstances would be deeply unhappy. What recourse would they have?

    And now we come to the nub of the matter. The only meaningful recourse would be to pass a law forbidding tech companies from building such a product. That would be an extraordinary step. Furthermore, the international implications of that would be unfathomable. The US couldn't stop foreign firms from building the product, so presumably imports would be forbidden.

    We have seen this kind of scenario before. I recall when the UNIX 'crypt' command could not be shipped with UNIX overseas. Needless to say, such a half-assed attempt to control technology completely failed.

    I strongly suspect that, iPhone or not, phones are going to be built that vendors can legitimately claim they cannot help crack.

    We live in interesting times.
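
For concreteness, here is a toy sketch of the kind of "no loophole" design the comment above imagines: the data key is derived from the user's passcode entangled with a per-device secret, and the vendor keeps no copy. Everything here is hypothetical and simplified; it is not how any shipping phone actually works.

```python
import os, hashlib

def derive_key(passcode: str, device_secret: bytes, salt: bytes) -> bytes:
    # Slow key derivation: every passcode guess costs real time, so brute force is expensive.
    return hashlib.pbkdf2_hmac("sha256", passcode.encode() + device_secret, salt, 200_000)

device_secret = os.urandom(32)  # imagine this fused into the hardware, never readable off the device
salt = os.urandom(16)

data_key = derive_key("123456", device_secret, salt)

# The vendor stores neither the passcode nor data_key, so there is nothing useful
# for a court order to compel: only the user can re-derive the key.
assert derive_key("123456", device_secret, salt) == data_key       # right passcode on the right device
assert derive_key("000000", device_secret, salt) != data_key       # wrong passcode yields a different key
```

In a design like this, the only remaining attack is guessing the passcode, which is exactly what guess limits and auto-wipe are meant to make impractical.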

    • This is exactly correct. It is not a Fourth Amendment issue, as some people frame it; nobody is arguing about whether they can have the phone.

      Can you build and sell something from which you yourself cannot recover the data?

      Didn’t we go through the same argument with PGP as well??

      It's also the same question for keeping system logs or old customer data.

      For instance, can you be compelled not to truly delete information if somebody wants to delete their account? How about truly deleting emails when somebody hits the delete button?

      As you say, even if you were forbidden from building it, a foreign vendor will.

      We have to keep five foreign datacenters because NOBODY outside the U.S. trusts any data stored in the U.S.

      While I would love to have the data on that guy's phone, there are two problems with backdoors. The first is that somebody else can exploit them, and the second is that even the people who are authorized to use them have proven time and again that they cannot resist abusing that power. The temptation is too great. Hack that guy's phone. Now let's hack all the phones he called, and let's hack all the phones they called. People will say, "If you've done nothing wrong, what do you have to hide?" John Adams said the spark of the American Revolution was over privacy.

      • Agreed.

        Now let's add to the mix the fact that if it is known the phone can be backdoored, it won't just be nasty criminals who will find a way; it will also be foreign governments, and NOBODY'S data will be secure.

        • How about your business plans?

          How about impersonating you and inflicting damage?

          When people say, "If you've done nothing wrong, you have nothing to fear," I say: what if I plant information on your device that shows you have done something wrong?

          I suppose for the drivel that most people have on their phones, why care? And yes, breaches will happen.

          But that doesn’t mean I need to give up my privacy.

          If you look at the Bill of Rights, most of it is protecting rights that, if you take the extreme case, are offensive.

  • You have me convinced. I still want to be able to go after terrorists' phones somehow. Instead of panicking and hyperventilating, we should think carefully about how we do this. Hyperventilating got us a Homeland Security bureaucracy and a myriad of laws/regs that we have no idea how to apply.

    At the same time, I see a parallel between this and arguments other entities make: "If you do one, then that means it happens for everyone." Parallels to guns, abortion, etc., ad infinitum. I agree strongly with @philipsugar:disqus; I don't think the founders would just roll over to any demand.

    • I’ll take both sides.

      On one side, you could make every person in the U.S. wear an implanted camera on their forehead, with a recorder and transmitter the government could access.

      We would be much safer, but we would have no freedom.

      You could say that there should be no such thing as search warrants, ever, and we would be much freer, but have no safety.

      So it is a continuum.

      That is the hard answer, not the "Why wouldn't you do anything to make us safer?" or "There is no such thing as privacy" bullshit arguments.

      When I hear people make their arguments it really bothers me. Much of the issue is that blocking and tackling doesn’t get done, and once the shit hits the fan people want to sensationalize the results.

  • Jann Scott Live

    Nice try Brad, but no sale... no cigar. Tim Cook does not have a choice in the matter. He either ponies up that phone data or he's going to jail. The Justice Department is not going to screw around with Apple. Everybody knows Apple has the data. Apple knows it, and they are not going to jerk the DOJ around. Who is Apple afraid of? Their own engineers, who might fly the code in the hacker market? They already have that problem. The only people who stand with Apple are the tech crowd, who the American people are becoming short-tempered with. It is the same with Edward Snowden. Because your peers stand with Apple, you are out on a very lonely, dangerous limb. The people we have to fear are people like Edward Snowden and the rest of the hackers... not the US government (I realize that is the paranoid, delusional sentiment of rich, selfish techies, but it is bullshit). You would have had to work with DOJ, DoD, State, or the Foreign Service to know our people are honorable... but no matter, Apple will end up giving up that phone and making it look like it's to their advantage, politically... they already have the data.

  • So would we make the same arguments about phone taps? If one person’s phone is tapped by a court order, does that mean none of our phone conversations will ever be private? I haven’t made up my mind on this issue, but the pro-Apple arguments generally strike me as pretty hyperbolic. In general, I think the idea that law enforcement should be able to get a court order for a specific device’s data in a criminal case is reasonable. Supporting that need without making every phone meaningfully less secure seems like it should be possible. I’m certainly not a cryptographer or computer scientist, so perhaps I’m wrong.

    I can see how using the All Writs Act is problematic, and perhaps the specific implementation the FBI is asking for is too. But in that case, let's focus on those issues rather than claiming that there's never a legitimate need to access information on a device.

    • You know you can encrypt phone calls, right???

      • I wasn’t aware of that, but it makes sense. Do you think it changes the arguments?

        • This has always been a challenge. Frankly, most of the time people put in backdoors for convenience, i.e. a super admin can reset your password when you forget it (sketched below).

          Also security wasn’t as good.

          The issue now is that it's getting good enough that the government can't break it.

          Most companies were happy to comply with the government because they got paid a ton of money by the government. Apple doesn't need the money and really wants to sell iPhones outside the U.S. I can tell you people outside the U.S. absolutely do not trust that the U.S. won't abuse their data. That is an understatement.

          And yes putting in any backdoor is a great big security hole.
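
To make the "backdoor for convenience" point above concrete, here is a toy sketch (hypothetical names, toy XOR in place of real encryption) of a key-escrow design where a super admin can recover your data key to reset your password. The same escrowed copy is what a court order, an insider, or an attacker would go after.

```python
import os

def xor_bytes(data: bytes, key: bytes) -> bytes:
    # Toy stand-in for real encryption; only here to show where copies of the key live.
    return bytes(d ^ key[i % len(key)] for i, d in enumerate(data))

data_key = os.urandom(32)           # the key that actually protects the user's data
admin_escrow_key = os.urandom(32)   # held by the service "so support can reset passwords"

escrowed_copy = xor_bytes(data_key, admin_escrow_key)  # the convenience backdoor

# Anyone who obtains the escrow key (an admin, an insider, an attacker, or a subpoena)
# can recover the data key without the user's passcode at all.
assert xor_bytes(escrowed_copy, admin_escrow_key) == data_key
```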