Encryption

The Encryption Fight We Knew Was Coming Is Here—and Apple Appears Ready

Company will not compromise user security to help access terrorist's phone.

Apple's commitment to device encryption on its smartphones is now in the dock, and we have one massive test case. Syed Farook, one of the terrorists responsible for killing 14 people in San Bernardino, California, had a locked iPhone, and the FBI has been unable to access its contents. So the agency is trying to get help from Apple, and it has turned to a judge to try to force the company's assistance.

Yesterday a judge sided with the FBI, ordering Apple to create not exactly a back door to bypass its encryption, but pretty damn close. The judge has ordered Apple to assist the FBI by making it possible to bypass or disable the auto-erase function that wipes the phone's contents after too many failed passcode attempts; by allowing the FBI to submit passcodes electronically rather than tapping them in by hand; and by eliminating the built-in delays between passcode attempts. The demands are clearly designed so that the FBI can brute-force the passcode by trying every possible number combination. (You can read the full order and some technical analysis at Techdirt here.)
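
For a sense of scale: stripped of the auto-erase and delay protections, a four-digit passcode space is tiny. Here's a back-of-the-envelope sketch in Python; the roughly 80-milliseconds-per-attempt figure comes from Apple's published iOS Security Guide, while the escalating delay schedule below is illustrative rather than Apple's exact one.

```python
# Back-of-the-envelope math: why the ordered changes make brute force practical.
ATTEMPTS = 10 ** 4               # every four-digit passcode: 0000-9999
DERIVATION_SECONDS = 0.080       # ~80 ms per attempt (Apple's iOS Security Guide)

# With the order granted: no auto-erase, no inter-attempt delays, and
# passcodes submitted electronically instead of tapped in by hand.
unfettered = ATTEMPTS * DERIVATION_SECONDS
print(f"Order granted, worst case: {unfettered / 60:.1f} minutes")  # ~13.3

# As shipped: iOS escalates delays after repeated failures and can wipe the
# phone after ten misses. This delay schedule is illustrative, not Apple's.
ILLUSTRATIVE_DELAYS = [0, 0, 0, 0, 60, 5 * 60, 15 * 60, 60 * 60, 60 * 60, 60 * 60]
protected = sum(ILLUSTRATIVE_DELAYS) + len(ILLUSTRATIVE_DELAYS) * DERIVATION_SECONDS
print(f"As shipped: ten tries (~{protected / 3600:.1f} hours), then a possible wipe")
```

The point of the auto-erase and delay features is not to make each guess expensive; it is to cap the attacker at a handful of guesses. Remove the cap and a four-digit space falls over lunch.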

Apple does not appear to have the ability to comply with the order at this very moment. But CEO Tim Cook's response makes clear that Apple could develop a tool that does. So among the many questions we're dealing with here is whether the judicial system can force a private company to develop tools for the purpose of furthering the government's goals.

Cook is saying no. And his response on behalf of Apple (directed toward Apple's customers) offers a clear, level-headed explanation of why this encryption must exist and must be protected, even if it "helps" the occasional terrorist:

In today's digital world, the "key" to an encrypted system is a piece of information that unlocks the data, and it is only as secure as the protections around it. Once the information is known, or a way to bypass the code is revealed, the encryption can be defeated by anyone with that knowledge.

The government suggests this tool could only be used once, on one phone. But that's simply not true. Once created, the technique could be used over and over again, on any number of devices. In the physical world, it would be the equivalent of a master key, capable of opening hundreds of millions of locks — from restaurants and banks to stores and homes. No reasonable person would find that acceptable.

The government is asking Apple to hack our own users and undermine decades of security advancements that protect our customers — including tens of millions of American citizens — from sophisticated hackers and cybercriminals. The same engineers who built strong encryption into the iPhone to protect our users would, ironically, be ordered to weaken those protections and make our users less safe.

We can find no precedent for an American company being forced to expose its customers to a greater risk of attack. For years, cryptologists and national security experts have been warning against weakening encryption. Doing so would hurt only the well-meaning and law-abiding citizens who rely on companies like Apple to protect their data. Criminals and bad actors will still encrypt, using tools that are readily available to them.
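
Cook's point about keys is easy to demonstrate: encryption math grants identical power to anyone who holds the key, owner or not. Here is a minimal sketch using Python's off-the-shelf cryptography package; nothing in it is Apple's actual implementation, and the data is made up.

```python
# Cook's "key" in miniature: whoever holds the key reads the data, full stop.
# Uses the third-party "cryptography" package (pip install cryptography);
# a generic illustration, not anything from Apple's implementation.
from cryptography.fernet import Fernet

key = Fernet.generate_key()                    # the secret everything hinges on
token = Fernet(key).encrypt(b"contacts, messages, photos")

# The math can't distinguish the owner from a thief: anyone who obtains the
# key, or a tool that bypasses it, recovers the plaintext just as easily.
print(Fernet(key).decrypt(token))              # b'contacts, messages, photos'
```

That is why Cook frames the requested tool as a master key: once it exists, the only thing standing between it and every other iPhone is the hope that it never leaks.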

What Cook doesn't say, but should be extremely obvious, is that even if Apple creates just one of these tools for the government, the feds will most certainly try to reverse-engineer it to figure out how to replicate it. We have seen every single surveillance authorization given to the federal government abused and expanded to snoop on citizens for inappropriate reasons and without due process. There's no reason to believe the same thing won't happen here, or that the justification for breaking encryption won't be defined downward from "terrorist who killed 14 people" to "suspected drug dealer" or whatever the FBI defines as a domestic "extremist." Cook does make note of how this authority would eventually be corrupted:

The implications of the government's demands are chilling. If the government can use the All Writs Act to make it easier to unlock your iPhone, it would have the power to reach into anyone's device to capture their data. The government could extend this breach of privacy and demand that Apple build surveillance software to intercept your messages, access your health records or financial data, track your location, or even access your phone's microphone or camera without your knowledge.

Opposing this order is not something we take lightly. We feel we must speak up in the face of what we see as an overreach by the U.S. government.

Republican presidential front-runner Donald Trump has no interest in any sort of nuanced discussion of privacy and civil liberties and has demanded that Apple comply, saying, "To think that Apple won't allow us to get into her cellphone? Who do they think they are? No, we have to open it."

On this issue, Trump sounds just like the "establishment" politicians in both parties, who have no interest in Americans' privacy if it gets in the way of government authority and who seem uninterested in the negative consequences. Folks like Trump (and senators like Dianne Feinstein and Richard Burr) care only that the government can access any information it wants at any time. They do not seem to care that this process will open up their own constituents to further fraud and potential crimes.

So we have to rely on one of the biggest tech corporations in the world to protect us from the government. I wonder how many sci-fi cyberpunk writers saw that twist coming.