The Volokh Conspiracy

Preliminary thoughts on the Apple iPhone order in the San Bernardino case (Part 1)

A lot of people are talking about a court order in California requiring Apple to help the FBI disable features of the iPhone used by the San Bernardino, Calif., terrorist/shooter Syed Farook. Apple chief executive Tim Cook released an unusual public statement vowing to fight the order. A lot of readers have asked me what I think about the situation.

My plan is to offer preliminary thoughts about the case in two posts. This first post will make three points: First, it's too early to say who should win; second, this case involves an existing security vulnerability in older iPhones [but see update below]; and third, there are no Fourth Amendment issues in the case. In the next post, which I hope to write later today and post tomorrow, I'll take on the application of the All Writs Act, the law that the government wants to use to compel Apple to help. Tomorrow's post also will address some of the broader policy issues the San Bernardino case raises.

1. It's too early to tell who should win. Let's start with the bottom line: Should the government win or should Apple win? I think it's too early to know. We are only at the beginning of what is likely to be a long legal process. All that has happened so far is that the government obtained a search warrant and then sought and obtained a separate ex parte order—that is, an order after the court heard only from one side—requiring Apple's assistance. The order, obtained from a federal magistrate judge, is pretty tentative. It explicitly invites Apple to file legal objections to the proposed order before having to comply with it. (As a matter of due process, Apple has a legal right to a hearing to dispute the lawfulness of the order.) Apple's objections haven't been filed yet.

At this point, I think it's too early to make strong conclusions about which side has the better argument. Partly that is because we don't yet have adversarial briefing. We have the government's application and the judge's order, plus the public statement from Apple's chief executive. But Apple hasn't even made its case in court yet.

And importantly, there has been no fact-finding yet. The All Writs Act is very context-specific, with mushy standards such as whether the order would impose an "unreasonable burden" on the third party. You can't apply the law without knowing all the facts. And we don't know the facts yet. There hasn't been a decision, even just from the magistrate judge, about which side has the better legal argument. And however the magistrate judge ultimately rules, the losing side is guaranteed to seek further review. So this is probably an issue that will lead to a hearing and then will work its way from the magistrate judge to a district judge and then to the court of appeals. Ordinarily, that would take around two years.

Of course, some people have strong opinions already about which side has the better legal argument. But I don't, at least not yet. I'll go into more detail on that in tomorrow's post.

2. The government wants Apple to exploit a security vulnerability built into older iPhones. There's a lot of public discussion about whether the order would require Apple to create a "backdoor" into the iPhone. I think it's probably more accurate to say that this particular model phone, the iPhone 5C, already has a built-in security weakness—depending on how you define the term, a kind of backdoor. The government's order would require Apple to exploit that potential backdoor in Apple's design. Importantly, though, Apple redesigned its phones after the iPhone 5C to close this potential backdoor [but see update below]. Later phones, starting with the iPhone 5S, have apparently eliminated this potential way in. As a result, the specifics of the order in the San Bernardino case probably only involve certain older iPhones.

Here's some background. The order in this case does not require Apple to decrypt the phone for the government. The phone used the iOS 9 operating system. Apple intentionally designed that operating system in a way that Apple can't decrypt the phone even with a warrant. (That was the big issue back in 2014, when Apple introduced the earlier iOS 8.) Instead, the order obtained in this case requires Apple to disable features on the phone that were designed to frustrate passcode-guessing as a way to break into the phone.

Specifically, the government knows that this particular phone had the iOS 9 "auto-erase" function turned on before the time of the attacks. Although no one can be sure, that feature was probably still on when the attacks occurred. Apple designed the auto-erase feature to thwart passcode-guessing: if someone guesses the passcode incorrectly 10 times, the phone permanently destroys the data needed to decrypt the phone's contents. The government wants to keep guessing passcodes until it finds the right one—what is usually called a brute-force attack. But it can't do that because of the features Apple designed, and that Farook apparently had turned on, to thwart passcode-guessing.
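For readers who want the mechanics spelled out, here is a minimal Python sketch of the kind of policy the auto-erase feature enforces. The names, constants, and structure are my own illustration, not Apple's actual implementation, which lives much deeper in the operating system and hardware:

```python
# Illustrative sketch of an auto-erase policy: after MAX_ATTEMPTS bad
# passcode guesses, the key material needed to decrypt the phone's data
# is destroyed. Names and constants here are hypothetical.
MAX_ATTEMPTS = 10

class PasscodeGuard:
    def __init__(self, correct_passcode, decryption_key):
        self._passcode = correct_passcode
        self._key = decryption_key        # needed to decrypt the phone's data
        self._failed_attempts = 0

    def try_unlock(self, guess):
        if self._key is None:
            raise RuntimeError("Key destroyed; data is unrecoverable")
        if guess == self._passcode:
            self._failed_attempts = 0
            return self._key              # phone unlocks
        self._failed_attempts += 1
        if self._failed_attempts >= MAX_ATTEMPTS:
            self._key = None              # auto-erase: wipe the key material
        return None

if __name__ == "__main__":
    guard = PasscodeGuard("1234", decryption_key=b"hypothetical-key")
    for _ in range(10):
        guard.try_unlock("0000")          # ten wrong guesses trip the limit
    try:
        guard.try_unlock("1234")          # even the right passcode is too late
    except RuntimeError as err:
        print(err)                        # "Key destroyed; data is unrecoverable"
```

The order, in effect, asks Apple to push software that removes this counter and the other passcode-guessing-frustrating features, so that guesses can be submitted without any risk of wiping the data.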

But there is another way in for this particular model phone. Apparently, Apple has the technical capability to send a software update to the phone that will disable the auto-erase function and some other similar features. Apple designed its system so that the update has to come from Apple, using its unique cryptographic signature, in order for it to work. The Apple software update could let the phone run with the passcode-guessing-frustrating features turned off. The FBI could then use a fast computer to guess passcodes to try to find the one that Farook used. That might allow the FBI to find the passcode quickly, or it might take them years. How long it might take just depends on what kind of passcode Farook used.
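To see why the kind of passcode matters so much, here is a rough back-of-the-envelope calculation. Every number in it is an assumption for illustration: the figure of roughly 80 milliseconds per guess (about 12 guesses per second) is in the ballpark of the delay Apple's key derivation has been reported to impose, and the real rate would depend on how the FBI submits its guesses.

```python
# Back-of-the-envelope worst-case brute-force times under an assumed
# guessing rate. The per-guess delay is an illustrative assumption.
SECONDS_PER_GUESS = 0.08

def human_time(seconds):
    # Convert a duration in seconds into a rough human-readable estimate.
    if seconds < 3600:
        return f"{seconds / 60:.0f} minutes"
    if seconds < 86400:
        return f"{seconds / 3600:.0f} hours"
    if seconds < 86400 * 365:
        return f"{seconds / 86400:.0f} days"
    return f"{seconds / (86400 * 365):,.0f} years"

for label, combinations in [
    ("4-digit PIN", 10 ** 4),
    ("6-digit PIN", 10 ** 6),
    ("8-character lowercase password", 26 ** 8),
    ("8-character letters and digits", 62 ** 8),
]:
    worst_case = combinations * SECONDS_PER_GUESS
    print(f"{label:35s} up to ~{human_time(worst_case)}")
```

The spread is the point: once the guessing limits are off, a short numeric PIN falls within minutes or hours, while a long alphanumeric passcode could take far longer than any investigation.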

But here's an interesting technical twist. It appears that Apple redesigned its later phones so that Apple can't send a software update to the phone without the user first entering the passcode. Starting with the iPhone 5S, Apple designed the phones so that this feature is embedded in the hardware. The idea was for Apple to take away its own power to send a software update without the user's authorization. If the phone Farook used had been an iPhone 5S or an iPhone 6, Apple probably would have been unable to disable the passcode-guessing features. (I say probably because there is some speculation that it would still be possible.) But because this phone is an iPhone 5C, it's at least technically possible for Apple to write a software update that will disable the features that Apple created—and Farook apparently used—to thwart passcode-guessing.

[UPDATE: It looks like the speculation that this is still possible for newer phones is correct. Apple plans for future phones and software to no longer allow this, but that is where the technology is headed rather than where it stands today.]

The "backdoor," if you want to call it that, is that Apple retains the technical ability to send a software update to the phone that would disable the optional password-guessing-thwarting functions that Farook probably used. Apple hasn't written that software update, and it strongly opposes being required to write it.

3. There are no Fourth Amendment issues in the case. Some have speculated that it might violate the Fourth Amendment to require Apple to assist the government's efforts to break into the phone. That's not correct. The search here would comply with the Fourth Amendment for at least two independent reasons. Most obviously, the government has a search warrant. The assistance order is based on that; it seeks Apple's help in carrying out the warrant that the government already has.

Second, even if the government didn't have a warrant, the government has the consent of the phone owner. The phone in this case was owned by the San Bernardino County Department of Public Health, Farook's employer. Farook used it, but the county owned it. The county has already consented to a search of the phone. Some have speculated that the people who communicated with Farook may have Fourth Amendment rights in their communications on the phone. Not so. When you send a communication to someone, you lose Fourth Amendment rights in the communication when the message arrives at its intended recipient. As a result, you have no Fourth Amendment rights in someone else's phone just because you sent them messages. And even if you did have such rights, either the warrant or the phone owner's valid consent—or here, both—would ordinarily trump them.

For these reasons, the legal issues in this case are not about the Fourth Amendment. Instead, they're about the use of the All Writs Act to compel Apple to help the FBI so the FBI can try to guess the passcode to the phone. I'll discuss the application of the All Writs Act, and the broader policy issues it raises, in my next post.