Forty-two years after unbreakable encryption was first conceived, these tools are more widespread than ever before. One milestone came in 2016, when the world's largest messaging service—WhatsApp—announced it would offer default end-to-end encryption on all communications. In other words, the messages can be read only by the senders and recipients; even the platform provider can't access them.
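WhatsApp's encryption is built on the Signal protocol. The core idea, that two endpoints can agree on a secret key without the platform in the middle ever learning it, can be sketched with a toy Diffie-Hellman exchange (the parameters below are illustrative and far too small for real security):

```python
import secrets

# Toy Diffie-Hellman key agreement. Real systems like WhatsApp use the
# Signal protocol with vetted, much larger parameters and authenticated keys.
p = 0xFFFFFFFB  # a small prime modulus (2**32 - 5), for readability only
g = 5           # public generator

# Each party picks a private exponent and publishes only g^x mod p.
alice_priv = secrets.randbelow(p - 2) + 1
bob_priv = secrets.randbelow(p - 2) + 1
alice_pub = pow(g, alice_priv, p)
bob_pub = pow(g, bob_priv, p)

# Both sides derive the same shared secret. A server relaying the public
# values would have to solve a discrete logarithm to recover it.
alice_secret = pow(bob_pub, alice_priv, p)
bob_secret = pow(alice_pub, bob_priv, p)
assert alice_secret == bob_secret
```

The messaging provider only ever sees the public values, which is why "even the platform provider can't access" the conversations.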
Law enforcement and intelligence agencies are still reckoning with this new reality. For decades, they demanded that tech companies hand over private data on their users, sometimes without obtaining warrants. In response, companies like Apple changed their policies so that individual users were the only ones holding the keys to their data.
This new era of consumer privacy led to a standoff in 2016, when the Federal Bureau of Investigation (FBI) demanded access to an encrypted iPhone belonging to Syed Farook, a deceased terrorist from San Bernardino, California. Farook and his wife, Tashfeen Malik, had killed 14 people at a holiday office party in December 2015.
The FBI wanted Apple to write software that would weaken the iPhone's built-in security. Apple refused, saying that such flawed software would jeopardize the security of its customers, who number in the hundreds of millions. Once a back door was created, the company claimed, the FBI could use it on similar phones—and it could be leaked to hackers or foreign enemies. "It is in our view the software equivalent of cancer," Apple CEO Tim Cook told ABC News.
Plenty of consumer data is still unencrypted and can be accessed by large tech companies. From Facebook to Gmail, many online platforms give law enforcement access to their users' private conversations. But the growing use of end-to-end encryption, by Apple in particular, represents a drastic shift.
It's the latest battle in the so-called Crypto Wars, a fight between technologists and the state that's been ongoing for decades.
In 1991, then-Sen. Joe Biden (D-Del.) introduced an anti-crime bill that would have required providers and manufacturers of electronic communications services to make it possible for the government to obtain the contents of voice, data, and other communications when authorized by law, essentially mandating that tech companies provide back doors for government snooping.
Programmer Phil Zimmermann decided to build a tool that would thwart Biden's efforts. He wrote the first serious attempt at an encryption program for the everyday user; he called it Pretty Good Privacy, or PGP. It scrambled data so that no one except the sender and intended recipient could read it. PGP was published online, and it spread everywhere.
In the early 1990s, America was getting its first taste of the World Wide Web, and encryption was making law enforcement agencies nervous. They viewed PGP and similar programs as munitions. In 1993, the Justice Department launched a criminal investigation of Zimmermann on the grounds that by publishing his software he had violated the Arms Export Control Act. To demonstrate that PGP was protected under the First Amendment, Zimmermann got MIT Press to print its source code in a book and sell it abroad.
The Justice Department eventually dropped the case against Zimmermann, and the government slowly started coming to grips with the legal and technical challenges of regulating software.
At the same time, the National Security Agency (NSA) attempted to preempt stronger domestic encryption by offering its own alternative for telecommunications devices. The Clipper Chip was an encryption tool with a built-in way for the government to gain access to private information. It was the first back door—an intentional flaw in a security system.
The NSA tried to make the Clipper Chip an industry standard by taking advantage of the government's purchasing power. But privacy advocates fought back.
"This was really the first crypto war," says Julian Sanchez, a senior fellow at the Cato Institute. It was "a fight it looked like the government might win until a computer scientist named Matt Blaze discovered certain flaws in the Clipper algorithm."
The Clipper Chip died an unmourned death, and the government once again seemed resigned to the new era of encrypted telecommunications. There was even a compromise of sorts between telecommunications companies and law enforcement, called the Communications Assistance for Law Enforcement Act. The trade-off was that telecommunications companies could still offer encryption, but they had to preserve the government's ability to wiretap.
It seemed like technologists had won. But thanks to the Edward Snowden revelations, we now know that intelligence agencies weren't accepting defeat. Snowden's whistleblowing didn't just reveal massive government spying; it revealed how the NSA was targeting encryption.
The NSA engineered a hidden weakness into a widely used random number generator called Dual_EC. The generator was used, mainly by big companies, to produce the long strings of numbers and letters that make up the private keys used to unlock encryption.
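The role such a generator plays is mundane but critical: every private key starts life as output from a random number generator, so anyone who can predict the generator can predict the keys. A minimal sketch of that key-generation step, using Python's standard `secrets` module rather than Dual_EC itself:

```python
import secrets

# A private key is ultimately just unpredictable random bits. Here we draw
# 256 bits (32 bytes) from the OS's cryptographically secure generator and
# hex-encode them. Dual_EC filled exactly this role in affected products,
# which is why a shortcut in the generator undermined the keys built from it.
key_material = secrets.token_bytes(32)
private_key = key_material.hex()  # 64 hex characters = 256 bits
print(private_key)
```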
"Not a pure back door, not a 'push a button and decrypt it' attack, but they had what's called a short cut," says Sanchez. But use of the shortcut didn't last. Researchers who used Dual_EC quickly figured out the algorithm was compromised, and the NSA went back to the drawing board.
Intelligence agencies just couldn't beat the math worked into encryption software. As mathematical techniques grew more sophisticated and processors got faster, so did the strength of encryption.
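That arms race is lopsided because the keyspace grows exponentially with key length while hardware only gets linearly faster. A rough sketch of the arithmetic (illustrative figures, not a benchmark):

```python
# Each extra key bit doubles the number of possible keys a brute-force
# attacker must try, so modest key-length growth outruns faster processors.
def keyspace(bits: int) -> int:
    return 2 ** bits

for bits in (40, 56, 128, 256):
    print(f"{bits}-bit key: {keyspace(bits):.3e} possible keys")

# Doubling the attacker's speed recovers only one bit of strength:
# searching a (b+1)-bit keyspace at 2x speed takes as long as b bits at 1x.
assert keyspace(129) // 2 == keyspace(128)
```

This is why the 40- and 56-bit limits of the 1990s export rules became obsolete, while modern 128- and 256-bit keys remain far beyond brute force.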