Rendering Unto CESA
Clinton's contradictory encryption policy.
It was September 28, 1999. Some officials from the Clinton administration were briefing the Congressional Internet Caucus, and Rep. Curt Weldon (R-Pa.) was getting visibly angry.
The officials were outlining the White House's new policy on encryption, the practice of coding and decoding electronic messages using computer programs. Since the Bush years, encryption software has been classified as a munition, meaning that companies need a special export license to ship products overseas, just as if they were shipping guns or warheads. Encryption has long been a contentious issue, pitting privacy advocates against the national security apparatus; now, with e-commerce expanding and requiring stronger protections against intruders, even more parties have been weighing in on the issue. The administration's new approach seemed to make it much easier to export encryption products, and Weldon, a long-time supporter of strict encryption controls, couldn't understand why.
How could you be implementing this policy? he asked the panel. On countless occasions, this administration has sent high-powered people to the Hill, including Attorney General Janet Reno and FBI Director Louis Freeh, warning us that if encryption is freely exported, it will create serious domestic and international security problems and hamstring our law enforcement and intelligence operations. And now you're telling us you've changed your minds?
Did they change their minds? At the same briefing, the White House reaffirmed its support for the Cyberspace Electronic Security Act (CESA), a bill rooted in the government's traditional distrust of private encryption. Along with some less controversial provisions, the bill said the government need not disclose, in the course of a criminal proceeding, how it recovered the decrypted information that it's using against the defendant. The theory is that if the government reveals its decryption secrets--which may involve classified techniques, industry trade secrets, or software flaws that government researchers have discovered--criminals will be forewarned and will be able to thwart decryption in future investigations. But civil libertarians point out that this will make it harder for defendants to authenticate the state's evidence. It may even pose constitutional problems: How can you confront your accuser in court if you don't know the basis of his charges?
The administration's crypto schizophrenia didn't end in September. Earlier this year, the White House announced a new set of crypto-export rules that, while complex enough to require a lawyer to parse, seem to take the lid off the export of encryption almost entirely. Most encryption tools will be cleared for export after a one-time review by the Commerce Department. There will still be restrictions on products that aren't widely available in domestic retail outlets, exports to "terrorist" pariah states will still be banned, and there will still be some restrictions on programs' source code. But the new policy is unquestionably a significant deregulation.
And yet: At the same time, the Department of Justice is vigorously litigating for export restrictions in Bernstein v. U.S., a case involving a college professor who claims the First Amendment protects his right to distribute encryption-related source code. Between that and CESA, observers are beginning to wonder whether the United States has a consistent encryption policy at all.
For most of the post-World War II era, the government didn't need a general policy on encryption. Because of the massive computing power necessary to generate cryptographic codes, such activity was the province of intelligence agencies and almost no one else. Over the past couple of decades, as the personal computer revolution placed more (and cheaper) processing power within reach of virtually anyone, that changed. U.S. law enforcement answered this world of decentralized computing with a single, panicky policy: Stop the spread of cryptography at all costs.
The new stance was driven by some pioneering work in the late 1970s by American cryptographers--work that, for once, was not performed by people in the pay of the intelligence agencies, and therefore was not "born classified." This academic revolution--the development of a public science of cryptography and a resulting colloquy about it--was accompanied by a similar, equally dramatic revolution on the microcomputer front.
The result: Ordinary people with desktop PCs could encrypt their messages or data to a degree that only governments could have achieved not long before. For intelligence and police agencies, this ushered in a new era, one in which merely intercepting a terrorist's or criminal's (or dissident's) communications was no guarantee that the government could figure out what the communicator was saying. On top of that, telephone companies were relying more and more on computers to run their networks and phone services, raising the specter of a world in which every call might be encrypted. Effective wiretaps might become a thing of the past.
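The pioneering work in question is generally identified with the public-key techniques of the late 1970s, such as the Diffie-Hellman key exchange and RSA. As a toy-scale sketch--illustrative only, not anything drawn from the article--the following Python snippet shows the sort of arithmetic that let two people on ordinary PCs agree on a secret key over an open channel:

```python
import secrets

# Toy Diffie-Hellman exchange. The modulus below (the largest prime under 2**64)
# and the generator are chosen purely for illustration; real systems use far
# larger parameters and vetted cryptographic libraries.
p = 2**64 - 59        # a prime modulus (toy-sized)
g = 5                 # a generator value (illustrative choice)

a = secrets.randbelow(p - 2) + 2      # Alice's secret exponent
b = secrets.randbelow(p - 2) + 2      # Bob's secret exponent

A = pow(g, a, p)      # Alice sends A over the open channel
B = pow(g, b, p)      # Bob sends B over the open channel

# Each side combines its own secret with the other's public value.
assert pow(B, a, p) == pow(A, b, p)   # both arrive at the same shared key
```

With parameters thousands of bits long instead of these toy values, an eavesdropper who sees only A and B has no practical way to recover the shared secret--which is precisely what made desktop cryptography a problem for the intercept agencies.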
The Bush administration responded to these developments in two ways. First, there was the "Digital Telephony" initiative, which required phone companies to structure their networks and services so as to facilitate wiretapping. It eventually passed into law in 1994 as the Communications Assistance for Law Enforcement Act. Second, there was an effort to regulate the use and sale of encryption tools, domestically and abroad. When Bill Clinton took over the White House in 1993, he adopted Bush's strategy without any significant changes.
While the digital telephony guidelines went ahead with nary a hitch, the encryption initiative proved harder to implement. In spring 1993, the Clinton administration introduced the "Clipper Chip" program, in hopes that this would steer the growing market for computer security in the direction the feds wanted. The Clipper Chip was a hardware device developed under the supervision of the National Security Agency. Each chip had a "backdoor" key that would be held in escrow by a trusted government agency. Thus, while the device would let computer or telephone users encrypt their communications, it would also let the government recover the content of the coded messages.
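The escrow mechanism can be sketched in miniature. The Python toy below is not the actual Clipper design--which used the classified Skipjack cipher and a "Law Enforcement Access Field"--and the XOR cipher and names here are purely illustrative; it only shows the general key-escrow idea: a copy of each message key is deposited with an escrow agent at the moment of encryption, so whoever holds the escrowed copy can decrypt later.

```python
import os

def xor_cipher(key: bytes, data: bytes) -> bytes:
    """Toy XOR stream cipher: the same call both encrypts and decrypts."""
    return bytes(d ^ key[i % len(key)] for i, d in enumerate(data))

escrow_database = {}   # the escrow agent's copy of every session key

def encrypt_with_escrow(session_id: str, plaintext: bytes) -> bytes:
    session_key = os.urandom(16)                  # fresh key for this conversation
    escrow_database[session_id] = session_key     # a copy goes to the escrow agent
    return xor_cipher(session_key, plaintext)

def escrow_decrypt(session_id: str, ciphertext: bytes) -> bytes:
    # Anyone with access to the escrowed key can read the traffic.
    return xor_cipher(escrow_database[session_id], ciphertext)

ciphertext = encrypt_with_escrow("call-42", b"meet at the usual place")
assert escrow_decrypt("call-42", ciphertext) == b"meet at the usual place"
```

The political fight, of course, was over who would hold that escrow database and under what legal process it could be opened.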
The government hoped that by requiring all future federal purchases of phone and computer systems to be Clipper Chip-compliant (and holding outside government contractors to the same standard), it could force the private sector to follow suit. Instead, the chip ignited a firestorm of criticism from privacy activists, high-tech industries, and others. By 1996, the administration had abandoned the Clipper Chip as such, but it continued to lobby both at home and abroad for software-based "key escrow" encryption standards. Again, the goal was to make it easier for law enforcement and intelligence agencies to recover encrypted data when they believed they needed to. This was marginally successful outside the United States--33 countries, including Britain, Russia, Canada, France, and Germany, adopted the "Wassenaar Arrangement on Export Controls for Conventional Arms and Dual-Use Goods and Technologies." But in the domestic software market, few accepted the push for key escrow standards (now relabeled "key recovery").
Civil libertarians didn't just argue against the Clipper Chip per se; they also complained that the United States was pushing privacy-undermining standards precisely in countries that lacked the United States' strong legal protections for privacy. Faced with both industry pressure and public criticism, the administration scotched the plan and moved most export control authority from the Department of State to the Department of Commerce. However, the encryption controls themselves remained largely unchanged.
Pressure for liberalization continued to increase; even people within the administration were now pushing for change. Several factors were responsible, according to Dorothy Denning, a computer science professor at Georgetown who's been a strong defender of the government's wiretapping and cryptography policies throughout the 1990s. On the export front, Denning says, the administration had begun to recognize "the futility of trying to force the whole market into a Clipper kind of model." Meanwhile, as law enforcement "started dealing with [more encryption-related] cases, they were learning how to handle them." The police were still panicked by encryption, but they were also beginning to relax a little.
As a result, on the last day of 1998, the administration announced that it would further liberalize its export restrictions. The new rules flatly legalized the export of encryption products using 56-bit keys. (The strength of encryption products is generally taken to be a function of key length. The previous standard had been 40-bit keys.) They also allowed for the export of stronger encryption products in industry sectors with a special need for data security--the banking industry, the health and medicine sectors, certain online merchants, and foreign subsidiaries of U.S. companies.
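The significance of those numbers is easy to work out: each additional bit of key length doubles the number of keys an attacker must try. A back-of-the-envelope calculation--the guess rate below is an arbitrary assumption for illustration, not a figure from the article or any agency--shows the gap between 40-bit and 56-bit keys, with a modern 128-bit key added for comparison:

```python
# Brute-force search space by key length. The guesses-per-second figure
# is a hypothetical assumption, not a real benchmark.
GUESSES_PER_SECOND = 1e9            # assume an attacker trying a billion keys per second
SECONDS_PER_YEAR = 365.25 * 24 * 3600

for bits in (40, 56, 128):
    keyspace = 2 ** bits
    avg_seconds = keyspace / 2 / GUESSES_PER_SECOND   # expect to hit the key halfway through
    print(f"{bits:3d}-bit key: {keyspace:.2e} possible keys, "
          f"~{avg_seconds:.2e} seconds (~{avg_seconds / SECONDS_PER_YEAR:.2e} years) on average")
```

At that assumed rate, a 40-bit keyspace falls in minutes and a 56-bit keyspace in about a year of dedicated searching--which is why neither figure was regarded as strong protection even in 1999.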
Outside the administration, critics charged the government with not going far enough--it had talked a good game about liberalization, they argued, but had failed to deliver. Rep. Bob Goodlatte (R-Va.) began gathering support for the Security and Freedom through Encryption Act (SAFE), a bill that would remove restrictions on the export of encryption technology while imposing greater criminal sanctions on those who use encryption to commit crimes. When first introduced, it stalled. When Goodlatte reintroduced it, he had much greater success, garnering about 200 cosponsors and passing the bill out of five separate committees before it was halted once again. In the current Congress, SAFE has even more support--it has once again passed through five committees (as well as several subcommittees), and has 258 representatives, more than half the House, cosponsoring it.
The government's export policy was faring no better in the courts. Dan Bernstein, a professor at the University of Illinois at Chicago, wanted to post the source code for an encryption-related algorithm on the World Wide Web, and had asked the government whether he would need a license. (Under present law, to publish something on the Web is effectively to export it.) The feds replied that his source code qualified as a munition, so he couldn't do it. In August 1997, a federal district court in San Francisco ruled in Bernstein's favor, finding that this restriction looked a great deal like prior restraint of publication, which is prohibited by the First Amendment. On appeal, the trial court's ruling was affirmed, albeit on slightly narrower grounds; and while the DOJ has continued to litigate the case, encryption-policy mavens inside the government know the feds are not assured of prevailing. The courts will hear further arguments later this year.
Facing challenges in both Congress and the courts, the administration knew it had to take action if it was going to regain control of the issue. In the words of Jerry Berman, director of the pro-privacy Center for Democracy and Technology, "The government has realized that delay is no longer a policy."
So by late 1999, America's once-monolithic encryption policy--stop the spread at all costs!--had been reduced to fragmented and perhaps self-contradictory rubble. Perhaps the biggest reason for the Clinton administration's latest liberalization is the heat it's been feeling from the American computer industry. As Vice President Al Gore has hit the campaign trail in earnest, he's been working to strengthen his support among the leaders of high-tech industries. To the extent that export restrictions have discouraged these businesses from building encryption features into their products, software companies have been at a disadvantage in the global marketplace. They've made their dissatisfaction known.
What's more, as Georgetown's Denning has noted, the government has simply had more time to adjust to the spread of encryption tools, further easing its once-vivid panic. That point is seconded by Stewart Baker, former general counsel of the National Security Agency, who adds a twist: The original export-control regime has largely served its purpose of limiting the spread of encryption. Absent those controls, he argues, computer and software manufacturers would have almost certainly incorporated encryption tools into the operating system of every mass-market computer. That may still happen, he continues, but right now encryption is far from ubiquitous. This gives the intelligence community some breathing space to adapt.
And the police? Berman notes that "the FBI has been strangely silent," and policy watchers believe there is still serious debate within the administration about the wisdom of the new liberalization. CESA, apparently, is meant to mollify those in Congress and in the executive branch who remain concerned about encryption's alleged threat to law enforcement. The Department of Justice, of course, is still engaged in a full-court press in the Bernstein case.
So: Is there a consistent encryption policy framework now? To Baker, the current muddle, for all its inconsistencies, "is as coherent as government policy gets." Meanwhile, Berman is cautiously optimistic that we're seeing the beginning of the end of encryption controls. Congress seems prepared to pass SAFE, he notes, if the administration fails to fully liberalize the export of encryption products.
But, he warns, "one terrorist incident involving encryption could change the landscape."