The FBI Wants Access to a Mass Shooter's iPhone. Will They Demand a Back Door?

A deadly shooting at a naval base in Florida may lead to a new battle against encryption.


Once the dust from a suspected terrorist attack on U.S. soil settles, the inevitable fight over access to encrypted communications quickly commences. So it is again with last month's shooting spree by Mohammed Saeed Alshamrani, a Saudi Air Force flight school student who was being trained at the Naval Air Station in Pensacola, Florida.

Investigators are still determining the motivations of the now-deceased Alshamrani, who is believed to have hyped himself up by hosting a party to watch videos of other atrocities and by posting anti-U.S. and anti-Israel content before killing three U.S. sailors and wounding eight other people. "Terrorism" seems a pretty good bet: Attorney General William Barr reported Monday that 17 other Saudi Air Force students at NAS Pensacola had pro-jihadi content on their social media accounts.

But to build a watertight case, law enforcement may need access to the two iPhones believed to belong to the shooter, so the FBI is hoping to enlist Apple's help. Last week, the FBI sent a letter to Apple's general counsel asking for technical assistance after its efforts to simply guess the passcodes proved fruitless. The agency says it has court permission to search the phones, so it's not a question of warrantless access. And Apple appears eager to help, telling NBC News that the company gave the FBI "all of the data in our possession" when first approached at the time of the attack and "will continue to support them with the data we have available." Barr countered this characterization in his speech Monday, claiming that "Apple has not given us any substantive assistance" so far.

Perhaps the FBI and its many forensic offices will be able to work with Apple to find a fortuitous way into the iPhones in question. (Entering at least one of the phones may prove especially tricky, as it was apparently shot during the attack.) Or maybe the FBI will again tap one of the many shadowy firms that specialize in cracking secure devices, like the Israel-based Cellebrite, which is widely believed to have broken into the San Bernardino shooter's phone.

Or perhaps this present détente is just the calm before a revived Crypto War storm. Given the backdrop against which this latest security incident unfolds, the curious case of the NAS Pensacola iPhones has all the makings of a new public battle over law enforcement access to encryption.

Recall that the San Bernardino incident was resolved with only a whimper. The government had hoped to secure a legal precedent in the courts that would facilitate future access to encrypted communications. By compelling Apple engineers to build government back doors into phone software, the law enforcement community would gain access not only to those particular devices but potentially to any device protected by strong encryption, with or without a warrant. (No wonder the FBI dragged its feet on cracking into the devices on its own.)

Eventually, the FBI was able to get into the iPhone without conscripting Apple engineers as unwilling government hackers. The whole thing became moot. But it's likely that many in the law enforcement community have simply been biding their time until another opportunity presented itself.

The emotional high stakes surrounding the attack in California only added more oomph to the government's case in the court of public opinion. Exploiting horrible events appears to be a key tactic in the government's ongoing campaign against encryption.

In 2015, emails obtained by the Washington Post showed a top lawyer in the intelligence community explaining that while "the legislative environment is very hostile" to the idea of outright crippling encryption, things could "turn in the event of a terrorist attack or criminal event where strong encryption can be shown to have hindered law enforcement." A 2018 inspector general inquiry into the San Bernardino incident reported that the case was seen as a "poster child" for the so-called Going Dark problem of encryption thwarting law enforcement access to secure devices. So of course the agency played up the San Bernardino shooting.

Perhaps the specific game plan has changed with the rise of a new administration. But the overall goal of chipping away at strong security protocols remains the same under Attorney General William Barr and FBI Director Christopher Wray. Barr has blasted warrant-proof encryption for creating "law-free zones" that are fundamentally opposed to the needs of law enforcement. In his press event Monday, Barr called on Apple to "find a solution so that we can better protect the lives of Americans and prevent future attacks." Wray echoes these contentions, characterizing security technologies as creating an "entirely unfettered space that's utterly beyond law enforcement for criminals to hide."

What they want is what's called a "back door" into encryption standards that would let governments decrypt data at their leisure. Law enforcement argues that such back doors are necessary tools to gather evidence and carry out justice. But computer scientists point out that a government back door would undermine everyone's security, because the bad guys could exploit it as well. Essentially, there may be no way to build a back door that doesn't defeat the whole point of encryption, which is to secure our computing.
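The computer scientists' objection can be made concrete with a toy sketch of key escrow, one common back-door design. The cipher below is a deliberately simplified stand-in (a SHA-256 keystream XOR, not a real cipher), and all names here are illustrative, not any proposed standard; the structural point is that once a wrapped copy of the device key travels with the data, anyone who holds the escrow key, lawful agency or thief alike, can read everything.

```python
import hashlib
import secrets

def keystream_xor(key: bytes, data: bytes) -> bytes:
    """Toy stream cipher: XOR data against a SHA-256-derived keystream.

    XOR is its own inverse, so the same call encrypts and decrypts.
    For illustration only -- never use this as a real cipher.
    """
    stream = b""
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(d ^ s for d, s in zip(data, stream))

# The single master secret a government escrow scheme would hold.
ESCROW_KEY = secrets.token_bytes(32)

def encrypt_with_escrow(device_key: bytes, plaintext: bytes):
    """Encrypt normally, but also ship the device key wrapped
    under the escrow key -- the essence of a 'back door'."""
    ciphertext = keystream_xor(device_key, plaintext)
    wrapped_key = keystream_xor(ESCROW_KEY, device_key)
    return ciphertext, wrapped_key

def backdoor_decrypt(ciphertext: bytes, wrapped_key: bytes,
                     escrow_key: bytes) -> bytes:
    """ANY holder of the escrow key -- not just the owner --
    unwraps the device key and reads the message."""
    device_key = keystream_xor(escrow_key, wrapped_key)
    return keystream_xor(device_key, ciphertext)
```

The flaw is inherent to the design rather than to this toy implementation: `ESCROW_KEY` is a single secret whose theft or abuse silently exposes every message ever escrowed under it, which is why security researchers argue such schemes cannot be deployed safely at scale.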

The FBI has a history of exaggerating the Going Dark problem. Wray previously claimed that his bureau was locked out of some 8,000 devices in its investigations. But the real number is much smaller, probably closer to 1,000. Nor is it the case that these thousand-some devices are tied to terrorism or even plain old violent crime; the vast majority are probably drug-related.

Then there's the question of whether the information on the devices would have even been instrumental to solving the case. Obviously, law enforcement will want access to as much information as possible to build the best case. But maybe the locked device wouldn't have provided much evidence anyway.

Computer scientists routinely point out that law enforcement may be missing the forensic forest for the trees. We live in a virtual ocean of plaintext data, metadata, and tools that can piece together our online lives with the right analysis. Too often, law enforcement fails to tap this rich vein of available data simply because officers don't know it exists, don't know how to access it, or don't know how to analyze it.

Rather than quixotically undermining everyone's security by installing insecure back doors into encryption protocols, the FBI should seek to harvest this low-hanging fruit and better train the nation's law enforcement in this kind of forensic analysis.

Hopefully, Apple and the FBI will be able to get into these phones the old-fashioned way, and an encryption standoff can be avoided. As a Pensacola resident and a security writer, I hope that justice is served without undermining strong encryption.

But if we don't have a standoff now, we will almost certainly have one later. Not only are devices like iPhones now protected by strong encryption by default, but Facebook is also integrating strong encryption into its suite of platforms, which means more tragedies will likely involve some kind of encryption.

The government will not cease in its quest to put a back door into encryption technologies, and we must be ready to challenge it whenever the attempt is made.