Rendering Unto CESA

Clinton's contradictory encryption policy.

It was September 28, 1999. Officials from the Clinton administration were briefing the Congressional Internet Caucus, and Rep. Curt Weldon (R-Pa.) was getting visibly angry.

The officials were outlining the White House's new policy on encryption, the practice of coding and decoding electronic messages using computer programs. Since the Bush years, encryption software has been classified as a munition, meaning that companies need a special export license to ship products overseas, just as if they were shipping guns or warheads. Encryption has long been a contentious issue, pitting privacy advocates against the national security apparatus; now, with e-commerce expanding and requiring stronger protections against intruders, even more parties have been weighing in on the issue. The administration's new approach seemed to make it much easier to export encryption products, and Weldon, a long-time supporter of strict encryption controls, couldn't understand why.

How could you be implementing this policy? he asked the panel. On countless occasions, this administration has sent high-powered people to the Hill, including Attorney General Janet Reno and FBI Director Louis Freeh, warning us that if encryption is freely exported, it will create serious domestic and international security problems and hamstring our law enforcement and intelligence operations. And now you're telling us you've changed your minds?

Did they change their minds? At the same briefing, the White House reaffirmed its support for the Cyberspace Electronic Security Act (CESA), a bill rooted in the government's traditional distrust of private encryption. Along with some less controversial provisions, the bill said the government need not disclose, in the course of a criminal proceeding, how it recovered the decrypted information that it's using against the defendant. The theory is that if the government reveals its decryption secrets--which may involve classified techniques, industry trade secrets, or software flaws that government researchers have discovered--criminals will be forewarned and will be able to thwart decryption in future investigations. But civil libertarians point out that this will make it harder for defendants to authenticate the state's evidence. It may even pose constitutional problems: How can you confront your accuser in court if you don't know the basis of his charges?

The administration's crypto schizophrenia didn't end in September. Earlier this year, the White House announced a new set of crypto-export rules that, while complex enough to require a lawyer to parse, seem to take the lid off the export of encryption almost entirely. Most encryption tools will be cleared for export after a one-time review by the Commerce Department. There will still be restrictions on products that aren't widely available in domestic retail outlets, exports to "terrorist" pariah states will still be banned, and there will still be some restrictions on programs' source code. But the new policy is unquestionably a significant deregulation.

And yet: At the same time, the Department of Justice is vigorously litigating for export restrictions in Bernstein v. U.S., a case involving a college professor who claims the First Amendment protects his right to distribute encryption-related source code. Between that and CESA, observers are beginning to wonder whether the United States has a consistent encryption policy at all.

For most of the post-World War II era, the government didn't need a general policy on encryption. Because of the massive computing power necessary to generate cryptographic codes, such activity was the province of intelligence agencies and almost no one else. Over the past couple of decades, as the personal computer revolution placed more (and cheaper) processing power within reach of virtually anyone, that changed. U.S. law enforcement responded to this world of decentralized computing with a single, panicky policy: Stop the spread of cryptography at all costs.

The new stance was driven by some pioneering work in the late 1970s by American cryptographers--work that, for once, was not performed by people in the pay of the intelligence agencies, and therefore was not "born classified." This academic revolution--the development of a public science of cryptography and a resulting colloquy about it--was accompanied by a similar, equally dramatic revolution on the microcomputer front.

The result: Ordinary people with desktop PCs could encrypt their messages or data to a degree that only governments could have achieved not long before. For intelligence and police agencies, this ushered in a new era, one in which merely intercepting a terrorist's or criminal's (or dissident's) communications was no guarantee that the government could figure out what the communicator was saying. On top of that, telephone companies were relying more and more on computers to run their networks and phone services, raising the specter of a world in which every call might be encrypted. Effective wiretaps might become a thing of the past.

The Bush administration responded to these developments in two ways. First, there was the "Digital Telephony" initiative, which required phone companies to structure their networks and services so as to facilitate wiretapping. It eventually passed into law in 1994 as the Communications Assistance for Law Enforcement Act. Second, there was an effort to regulate the use and sale of encryption tools, domestically and abroad. When Bill Clinton took over the White House in 1993, he adopted Bush's strategy without any significant changes.

While the digital telephony guidelines went ahead with nary a hitch, the encryption initiative proved harder to implement. In spring 1993, the Clinton administration introduced the "Clipper Chip" program, in hopes that this would steer the growing market for computer security in the direction the feds wanted. The Clipper Chip was a hardware device developed under the supervision of the National Security Agency. Each chip had a "backdoor" key that would be held in escrow by a trusted government agency. Thus, while the device would let computer or telephone users encrypt their communications, it would also let the government recover the content of the coded messages.
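
The mechanics of escrowed encryption are easy to sketch. What follows is a minimal illustration in Python, using the third-party cryptography package's Fernet recipe as a stand-in for the classified Skipjack cipher the real chip implemented in hardware; the keys and names here are assumptions for illustration, not a description of the actual Clipper protocol.

    # Minimal sketch of the key-escrow idea, assuming the third-party Python
    # "cryptography" package. Fernet stands in for the classified Skipjack
    # cipher the real chip used; all names are illustrative.
    from cryptography.fernet import Fernet

    session_key = Fernet.generate_key()       # key that actually protects the traffic
    escrow_agent_key = Fernet.generate_key()  # long-term key held by the escrow agency

    # Deposit a wrapped copy of the session key with the escrow agent.
    escrowed_copy = Fernet(escrow_agent_key).encrypt(session_key)

    # Normal use: the parties encrypt and decrypt with the session key.
    ciphertext = Fernet(session_key).encrypt(b"meet at the usual place")

    # Escrow path: the agent unwraps its copy and reads the traffic,
    # with or without the user's cooperation.
    recovered_key = Fernet(escrow_agent_key).decrypt(escrowed_copy)
    assert Fernet(recovered_key).decrypt(ciphertext) == b"meet at the usual place"

The last two lines are the point of the design: whoever holds the escrow agent's key can recover the traffic without the user's knowledge or consent, which is exactly what made the scheme attractive to law enforcement and alarming to its critics.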

The government hoped that by requiring all future federal purchases of phone and computer systems to be Clipper Chip-compliant (and by holding outside government contractors to the same standard), it could force the private sector to follow suit. Instead, the chip ignited a firestorm of criticism from privacy activists, high-tech industries, and others. By 1996, the administration had abandoned the Clipper Chip as such, but it continued to lobby both at home and abroad for software-based "key escrow" encryption standards. Again, the goal was to make it easier for law enforcement and intelligence agencies to recover encrypted data when they believed they needed to. This was marginally successful outside the United States--33 countries, including Britain, Russia, Canada, France, and Germany, adopted the "Wassenaar Arrangement on Export Controls for Conventional Arms and Dual-Use Goods and Technologies." But in the domestic software market, few accepted the push for key escrow standards (now relabeled "key recovery").

Civil libertarians didn't just argue against the Clipper Chip per se; they also complained that the United States was pushing privacy-undermining standards precisely in countries that lacked the United States' strong legal protections for privacy. Faced with industry pressure and public criticism alike, the administration scotched the plan and moved most export control authority from the Department of State to the Department of Commerce. However, most of the encryption controls themselves remained largely unchanged.

Pressure for liberalization continued to increase; even people within the administration were now pushing for change. Several factors were responsible, according to Dorothy Denning, a computer science professor at Georgetown who's been a strong defender of the government's wiretapping and cryptography policies throughout the 1990s. On the export front, Denning says, the administration had begun to recognize "the futility of trying to force the whole market into a Clipper kind of model." Meanwhile, as law enforcement "started dealing with [more encryption-related] cases, they were learning how to handle them." The police were still panicked by encryption, but they were also beginning to relax a little.

As a result, on the last day of 1998, the administration announced that it would further liberalize its export restrictions. The new rules flatly legalized the export of encryption products using 56-bit keys. (The strength of encryption products is generally taken to be a function of key length. The previous standard had been 40-bit keys.) They also allowed for the export of stronger encryption products in industry sectors with a special need for data security--the banking industry, the health and medicine sectors, certain online merchants, and foreign subsidiaries of U.S. companies.
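
The arithmetic behind those key lengths is worth a quick sketch: each additional bit doubles the number of keys a brute-force attacker has to try, so the move from 40 to 56 bits multiplied the search space by 65,536. The attack rate in the snippet below is an assumed round figure for illustration, not a benchmark of any real attacker's hardware.

    # A rough sketch of what key length buys, assuming a hypothetical attacker
    # who can test one billion keys per second (an illustrative figure only).
    KEYS_PER_SECOND = 1_000_000_000

    for bits in (40, 56):
        keyspace = 2 ** bits
        # On average, a brute-force search tries half the keyspace before it succeeds.
        avg_days = (keyspace / 2) / KEYS_PER_SECOND / 86_400
        print(f"{bits}-bit key: {keyspace:,} possible keys, "
              f"roughly {avg_days:,.2f} days of searching on average")

At the assumed rate, the 40-bit search averages a few minutes while the 56-bit search averages more than a year, which is why these seemingly small numbers carried so much weight in the export debate.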
