Encryption

Locked Texas Shooter's iPhone Reignites Encryption Debate

Another possible standoff where officials want to compromise everybody's data security.

Photo: Dave Bredeson / Dreamstime

Well, here we go again: the FBI has once more found itself locked out of the smartphone of a dead mass shooter, this time Texas church massacre suspect Devin Kelley. Unless the feds find some kind of workaround to allow access without undermining the core encryption protections afforded by consumer devices, this incident could ignite another battle between the FBI and the tech community over the tensions between user security and law enforcement access.

The issue is a tender one. In the spring of 2016, the FBI and Apple engaged in a fraught standoff over the encryption question following the 2015 terrorist attack in San Bernardino. The battle played out both in public and in the courts, with the FBI arguing that Apple could be compelled to have its engineers intentionally break security features in order to access data on the locked iPhone of deceased shooter Syed Rizwan Farook. Apple stood firm, refusing to compromise any of its devices and instead seeking alternative means to assist law enforcement.

This intense showdown did not present a cathartic ending. The legal issues underpinning the debacle were never resolved in court. Rather, the brouhaha was rendered moot when an outside party swooped in to hack the phone for the FBI for a cool $900,000.

The most recent shooting at a Texas church contains all of the elements to create yet another battle royale between law enforcement and security professionals.

The FBI agent in charge of the investigation, Christopher Combs, has already started grumbling about encryption, griping that "law enforcement is increasingly not able to get into these phones." In an interview with Politico Pro, Department of Justice Deputy Attorney General Rod Rosenstein, who has developed quite a reputation as an encryption critic, recently characterized the desire for strong, unbreakable encryption as "unreasonable."

The agency has confirmed that the device is an iPhone. But officials reportedly have yet to reach out to Apple for assistance, preferring instead to explore alternative means to access the phone's data.

That's problematic. The iPhone's security features are set up in such a way that the first 48 hours after an incident are critical. If the FBI had reached out to Apple within this time frame, its engineers could have helped law enforcement exploit this window of opportunity. But since the FBI neglected to reach out, it may have inadvertently foiled its own options.

For example, Apple's Touch ID feature allows individuals to unlock their device by scanning their fingerprint. If Kelley's iPhone had the Apple Touch ID feature enabled, law enforcement could have used the dead man's fingerprints to easily open the phone. That is, unless the device has been powered off and restarted, or 48 hours have passed—in which case, the user's private passcode would be needed. And you can't exactly ask a dead man to tell you his passcode.
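The fallback rules described above can be sketched as a simple decision function. This is an illustrative model only, not Apple's actual implementation: the function name and parameters are hypothetical, and real iOS applies further conditions (such as failed fingerprint attempts) beyond the two the article mentions.

```python
# A simplified, hypothetical model of when iOS accepts a fingerprint
# alone versus demanding the passcode. Not Apple's actual code.
def touch_id_available(hours_since_last_unlock: float,
                       restarted_since_last_unlock: bool) -> bool:
    """Return True if, under this sketch, a fingerprint can unlock the phone."""
    if restarted_since_last_unlock:
        # A power cycle always forces passcode entry.
        return False
    if hours_since_last_unlock >= 48:
        # After 48 hours without an unlock, the passcode is required.
        return False
    return True

# Within the window and never restarted: a fingerprint suffices.
print(touch_id_available(12, False))   # True
# Restarted, or past the two-day window: passcode only.
print(touch_id_available(12, True))    # False
print(touch_id_available(72, False))   # False
```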

If a feckless Android user like me had been among the first in law enforcement to handle the device, they could easily have sealed off that route by immediately restarting it. After all, restarting is the natural first step frustrated smartphone users take when flummoxed by their technology. But in this case, it could mean the difference between easy access to critical clues and a drawn-out legal battle that risks undermining the nation's data security.

Even if investigators didn't restart the device, the critical two-day window has come and gone. One really hopes that the FBI did not allow pride or prejudice to prevent a simple request for Apple's assistance.

But it wouldn't be the first time the agency has flubbed such a route. Recall that during the San Bernardino debacle, the FBI instructed municipal officials to remotely reset Farook's iCloud password, thereby eliminating the option to access automatic iCloud backups. A quick call to a knowledgeable Apple representative could have swiftly cleared that all up.

Hopefully, law enforcement will find some way to get the data they need without another public brawl with the tech community. But I'm not all that optimistic. Opportunists in the FBI may find the chance to advance their anti-encryption agenda in the face of another tragedy to be too tantalizing to turn down.

There are important differences in the facts of the cases in San Bernardino and Sutherland Springs. Kelley appears to have been a lone wolf, unconnected to a broader terrorist network like Farook and Malik. Investigators may not have as much of a need to scour through Kelley's communications for associates like they did for the Islamic terrorist network apparently involved with the San Bernardino shooting.

Yet the passion and emotion surrounding such high-profile massacres often blur these kinds of distinctions. Authorities could decide to use this as another test case in the court of public opinion or a real court to gain the ability to compel code from security professionals. At the very least, it could be used as another rhetorical data point to promote legislative efforts to secure these new powers.

It is easy to sympathize with the FBI's plight. Its agents investigate horrific crimes and hope to bring justice to victims' loved ones. I can only imagine their frustration at finding a potential lead blocked by the hard laws of mathematics. Most in the security community feel a similar empathy.

But there is simply no getting around the fact that compromising encryption ultimately makes everyone less safe. Not only is it in many cases simply mathematically unworkable, it is downright undesirable.

Rather than exposing millions of innocents to increased risk of digital predation, law enforcement should seek one-off methods to break into specific devices in an investigation, as they did with the San Bernardino case. This approach, of course, will require a more productive relationship with the technology community than was evidenced in the last go-around of the Crypto Wars.

There is a deep irony to the encryption debate. We are all stewing in a veritable ocean of accessible, unencrypted data. This includes metadata, geolocation tracking, social media posts, cloud data, and ISP logs, among many other digital expansions of the typical pool of forensic evidence.

How much progress could law enforcement make if they focused more resources on mining these rich new sets of data, rather than antagonizing the poor security engineers who keep us all safe online? The FBI should work with the technology community to seize these opportunities. They might just find that the evidence they needed was there for the taking all along.