Encryption

Apple Versus the FBI

It's possible that the FBI is not primarily concerned with the particular evidence stored on the San Bernardino shooter's phone at all.

(Photo: frogsthatmoo/Flickr)

The technology industry is in an uproar over the revelation of a series of court orders directing Apple to compromise iPhone security to assist in the FBI investigation of deceased San Bernardino shooters Syed Rizwan Farook and Tashfeen Malik. In a public letter to Apple customers published last Wednesday, CEO Tim Cook strongly condemned the order to build a so-called "back door" for government access into encrypted technology. The FBI counters that this request is a reasonable and narrow means to bring about justice for the victims of terrorism. Stripping away the emotional rhetoric on all sides, the core question is whether a company can be compelled to build a tool for law enforcement that will compromise the security of any of its devices.

In the short term, such a precedent could create significant ethical dilemmas for technology firms' customer and business relationships while opening the door to similar demands from foreign governments. In the long term, the FBI's efforts may ultimately—and ironically—undermine the agency's broader goal of accessing critical evidence by encouraging tech companies to build products that are impervious to infiltration by design.

This incident is only the latest conflict in a years-long war over encryption and security raging between privacy- and security-minded groups and the law enforcement community. As more communications are digitized, authorities have been calling for industry assistance to build so-called government "backdoors" into secure technologies by hook or by crook.

Those in law enforcement fear a scenario in which critical evidence in a terrorism or criminal case is beyond their reach because it is protected by strong encryption that conceals data from everyone but the intended recipient. Hence, leaders at agencies like the Department of Justice, the Department of Homeland Security, and the National Security Agency, along with President Obama, have weighed in against strong encryption.

The technology industry and privacy activists, in contrast, emphatically oppose any proposed government backdoor into encryption and security technologies. Civil liberties groups such as the Electronic Frontier Foundation (EFF) and the American Civil Liberties Union (ACLU) guard the right to privacy as inalienable. Computer scientists, for their part, point out the catastrophic security vulnerabilities that such schemes would invariably create: malicious hackers or foreign governments could exploit these "back doors," also known as "golden keys," for their own malevolent ends. When it comes to encryption, there is no tradeoff between privacy and security; weakening encryption weakens everyone's security.

Tensions eased temporarily last fall, when the White House deescalated its push to require government access to encrypted technologies. But the San Bernardino attacks in December poured fresh fuel on the simmering conflict.

Since the shootings on December 2, 2015, law-enforcement authorities have been working to piece together Farook's and Malik's motivations and collaborators by combing through their communications in the months leading up to the attack. Authorities quickly obtained the personal cell phones and hard drive that the duo had attempted to destroy. Additionally, investigators retrieved from Farook's personal vehicle a county-issued iPhone that he had not bothered to destroy, in the hopes that it would contain further leads.

Because Farook was a county employee, the county was able to reset the phone's Apple ID password and thereby grant the FBI access to some associated iCloud data. But there was a big problem: the last backup had occurred on October 19, which meant that any data from the period between that date and the attack was inaccessible to investigators. Had the password not been reset, engineers could possibly have connected the phone to Apple's servers and executed a fresh iCloud backup (D'oh!). Now there may be data on the phone that will never make it to iCloud.
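
To see why the reset backfired, consider a toy model of token-based login, sketched below in Python. The names and mechanics here are purely illustrative, not Apple's actual iCloud implementation: the general idea is that the phone held a cached credential for its account, and a server-side password reset revokes such credentials, leaving the device unable to authenticate and run an automatic backup.

```python
import secrets

class CloudAccount:
    """Toy model of an online account that issues login tokens to devices."""

    def __init__(self, password: str):
        self._password = password
        self._valid_tokens: set[str] = set()

    def issue_token(self, password: str) -> str:
        # A device that knows the password receives a long-lived token
        # with which to authenticate future automatic backups.
        if password != self._password:
            raise PermissionError("wrong password")
        token = secrets.token_hex(16)
        self._valid_tokens.add(token)
        return token

    def reset_password(self, new_password: str) -> None:
        # A server-side reset revokes every previously issued token.
        self._password = new_password
        self._valid_tokens.clear()

    def backup(self, token: str, data: bytes) -> bool:
        # An automatic backup succeeds only with a still-valid token.
        return token in self._valid_tokens

account = CloudAccount("original-password")
phone_token = account.issue_token("original-password")  # cached on the phone
account.reset_password("new-password")                  # the county's reset
print(account.backup(phone_token, b"..."))              # False: backup refused
```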

Investigators cannot simply guess the iPhone's PIN because of a security feature that wipes the phone's data after too many incorrect guesses. And because Apple instituted full-disk encryption on iPhones in 2014, traditional methods of data request and extraction were fruitless. So the g-men called in some judicial muscle and asked Apple to build a tool that would allow agents to unlock the device for a comprehensive search.
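
For a sense of why brute force is off the table, here is a minimal sketch of the two protections described above: a data-protection key derived by entangling the passcode with a device-unique secret, and a retry counter that discards the key after ten failures. This is an illustration under stated assumptions, not Apple's implementation; on a real iPhone these rules are enforced by hardware and dedicated firmware, and the class and constant names below are hypothetical.

```python
import hashlib
import secrets

DEVICE_UID = secrets.token_bytes(32)  # stand-in for a key fused into the chip
MAX_ATTEMPTS = 10                     # iOS's optional "Erase Data" threshold

def derive_key(passcode: str) -> bytes:
    # Entangling the passcode with a per-device secret means the derived
    # key can only be brute-forced on the device itself, never offline.
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), DEVICE_UID, 100_000)

class LockedPhone:
    def __init__(self, passcode: str):
        self._key = derive_key(passcode)  # key protecting the encrypted disk
        self._failures = 0

    def try_unlock(self, guess: str) -> bytes | None:
        if self._key is None:
            raise RuntimeError("device wiped")
        if derive_key(guess) == self._key:
            self._failures = 0
            return self._key  # caller can now decrypt user data
        self._failures += 1
        if self._failures >= MAX_ATTEMPTS:
            self._key = None  # key discarded: the data is gone for good
        return None
```

The tool the FBI requested would, in effect, disable the retry counter and delay so that agents could cycle through passcodes at machine speed.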

According to Cook's letter, Apple has provided whatever lawful resources and guidance it could to law enforcement, both in the past and in the course of the current investigation. But when the FBI asked the company's engineers to craft a tool that would allow agents to bypass iPhone security, Cook balked.

It's not that the investigation or the request was unworthy in itself; Cook explains that Apple wanted to bring justice to the victims of this horrendous terrorist act as much as anyone else. But to Apple, complying with such a request would constitute a fundamental violation of its customers' trust. Publicly compromising the integrity of its devices was simply a non-starter for the tech giant.

What's worse, Cook points out, it is highly unlikely that such a request would be a one-time deal for the FBI. Despite FBI Director James Comey's reassurances that his agency seeks access to only a single device rather than a broad "backdoor" into millions of devices, many in the security and privacy communities are deeply concerned by the precedents and vulnerabilities that this action could generate. Today, it's an obviously guilty terrorist's work iPhone. Tomorrow, it could be an entire operating system, and there's no guarantee that the public would be made aware of such a backdoor.

Indeed, it's interesting that the FBI chose to pursue this tool in such a public fashion in the first place. According to The New York Times, Apple initially requested that the FBI not publicize its order to decrypt the device. (Some have speculated that Apple might have complied had the order not been publicized, or that it merely wanted to challenge the order away from the court of public opinion. But that's a whole other story.) Although it would seem to be in the FBI's interest to manage this development behind the scenes, the agency curiously insisted on making the order public, thereby provoking Cook's open letter. This could indicate that the FBI wanted to make a salient precedent out of the order, with the chance to paint the encryption evangelists at Apple as "weak on terrorism" as a cherry on top.

Looking back on the FBI's struggles with encryption, it is clear that the agency would have preferred a public, legislative mandate compelling technology firms to grant some kind of special access for law enforcement investigations. But unfortunately for the agency, actual examples of encryption thwarting a criminal investigation were for a long time stubbornly non-existent. Meanwhile, the technical arguments against breaking security technologies for law enforcement put the kibosh on any legislative proposals for a backdoor mandate. So the intelligence community decided to bide its time, waiting for the moment when some "terrorist attack or criminal element can be shown to have hindered law enforcement," in the words of a leaked memo that top intelligence attorney Robert S. Litt sent to his colleagues.

From the outside, the tragic San Bernardino attacks look like a perfect storm of emotion and technical haziness, ideal for establishing a precedent. Indeed, legislators have again taken up the possibility of mandating federal access to secure technologies under threat of criminal penalty in light of this debacle. It is possible that the FBI is not primarily concerned with the particular evidence stored on this particular device at all. Rather, the case could serve as an excellent public saga to spur the formal rule changes that law enforcement has desired the whole time.

Regardless of the FBI's true intentions, it appears that Apple will fight this order through the courts as much as it possibly can. But in terms of extra-judicial subversion, two can play that game. Technology companies may react to this FBI campaign in a manner that ultimately undermines law enforcement goals.

Already, Apple is talking about encrypting iCloud data in such a way that it could not comply with law enforcement warrants for information even if it wanted to. Other companies in the U.S. and abroad are no doubt observing this fracas with an eye to the future as well. The political climate increasingly demonstrates that trusted third parties are not only security holes but prime targets for government coercion and customer liability. It's not hard to understand why many companies will work to simply remove themselves from the equation.
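
A design like the one Apple is reportedly weighing is often called end-to-end or "zero-knowledge" encryption: the client encrypts data under a key derived from a secret only the user knows before anything is uploaded, so the server stores ciphertext it cannot decrypt and has nothing useful to surrender. Below is a minimal sketch of that general pattern using the third-party Python cryptography library (pip install cryptography); it illustrates the design, not Apple's actual scheme.

```python
import base64
import os

from cryptography.fernet import Fernet
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC

def key_from_passphrase(passphrase: str, salt: bytes) -> bytes:
    # Derive a symmetric key from a secret only the user knows.
    kdf = PBKDF2HMAC(algorithm=hashes.SHA256(), length=32,
                     salt=salt, iterations=600_000)
    return base64.urlsafe_b64encode(kdf.derive(passphrase.encode()))

# Client side: encrypt before upload; the passphrase never leaves the device.
salt = os.urandom(16)
key = key_from_passphrase("a secret only the user knows", salt)
ciphertext = Fernet(key).encrypt(b"backup contents")

# Server side: stores only (salt, ciphertext). Served with a warrant, the
# provider can hand over the ciphertext, but holds no key to decrypt it.
```

Under such a scheme, the trusted third party disappears by construction: there is no tool the provider could be ordered to build that would recover the plaintext.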