Hide and Peek

The government doesn't mind if you lock up your secrets, as long as it holds the key.


In recent years, the formerly esoteric world of cryptography has become a subject of potential interest to everyone who uses computers or digital communications. Since hackers have demonstrated that even the most secure computers can be breached, people who want to make sure that their electronic information remains confidential are increasingly relying on codes to protect it. Someone who breaks into a computer containing encrypted files will find only electronic gibberish stored in the machine's memory. The simple and incontrovertible fact is that any system for protecting the security of electronic data must use encryption.

If you want to be sure that no one can read your electronic mail except the person you send it to, you should encrypt it. Other new technologies, including cellular phones, cordless phones, and faxes, are also vulnerable to interception. Eavesdroppers can be thwarted by encoding the information before it is transmitted. If Prince Charles and Princess Diana had encrypted their cellular-phone conversations, they would have spared themselves much embarrassment.

The proliferation of data banks, credit bureaus, and computerized record keeping of all kinds has underscored the need to make sure that this information remains secure. Every time you make a credit-card purchase, rent a video, or call someone on the telephone, a record is made of what you did. Someone who can put all of these records together can paint a rough picture of who you are, even before he gets your medical records, income-tax returns, and other, more confidential information. Cryptography is one way of making sure that information about you is used only for the purposes you authorize.

The commercial uses of cryptography are already enormous. The banking industry transmits $350 trillion a year by encrypted wire transfers. Every day, U.S. banks send 350,000 encrypted messages transferring funds around the world. They want to make sure that no one except the intended recipient can read the information or alter it. Indeed, every time you use an automated teller machine, it communicates with its mainframe computer through encryption.

As many political jurisdictions move toward computerized voting, it is essential that votes from individual polling places be sent to the main tabulating center with encryption. Sending them in the clear is an invitation to voting fraud.

Modern codes also have the intriguing ability to provide "digital signatures" that cannot be forged. With a traditional business contract on a piece of paper, both parties sign their names, and the signatures authenticate the agreement. But in the electronic world, if a contract is negotiated and written over a computer network, there is no signature made with pen and ink on paper. Using contemporary codes, it is possible to prove that the person at the other end of the computer modem is who he says he is and to hold him to any agreement he might make.
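The idea is easier to see in a short sketch. The Python fragment below is purely illustrative; it assumes the third-party cryptography package and an invented contract text, neither of which comes from this article. The signer's private key produces a signature that anyone holding the matching public key can check, and altering even one character of the signed text makes the check fail.

```python
# Illustrative sketch of a digital signature, using the third-party Python
# "cryptography" package (an assumed modern library, not anything from the article).
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

contract = b"Buyer agrees to pay $10,000 on delivery."   # hypothetical agreement text

private_key = Ed25519PrivateKey.generate()   # kept secret by the signer
public_key = private_key.public_key()        # handed to the other party

signature = private_key.sign(contract)       # only the private key can produce this

try:
    public_key.verify(signature, contract)   # anyone with the public key can check it
    print("Signature valid: the signer is bound to this exact text.")
except InvalidSignature:
    print("Signature invalid: the text was altered or the signer is an impostor.")
```

The same pairing of a secret signing key with a public verification key is what lets a contract negotiated over a network stand in for pen and ink.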

Given the multiplicity of current and potential uses for cryptography, it's not surprising that the Clinton administration provoked a storm of protest last spring when it proposed a standard set of computer codes for telephone calls and computer data. The plan envisions two chips, one called Clipper, for encoding digital telephone signals, and another called Capstone, for digital information from computers. The government would hold the keys to all electronic encryption, and it would split them between two agencies chosen by the attorney general. Law-enforcement officials would need a search warrant to get access to the codes. Eric Hughes, a computer security expert in Berkeley, California, observes: "The government is saying, 'If you want to lock something up, you have to [give us] the key.'"

In essence, this system is like the lock boxes that real-estate agents use on houses to be shown to prospective buyers. The lock box, with the front-door key inside it, hangs on the door. The real-estate agent shows up with a separate key that lets him open the lock box and get out the key that opens the door. In the Clinton administration's plan, the government holds the key to the lock box. No matter what key the user chooses for his "front door," the government can always open the lock box and get to the key. While use of the government encryption standard would not be mandatory, computer and communications companies that want to remain eligible for government contracts would have a strong incentive to adopt it.

The Clipper-chip plan was quickly attacked by civil libertarians, the private cryptographic community, and a consortium of major computer and electronics firms. "The idea that citizens must register their secrets with the government just in case they are trying to keep them secret is patently un-American," declared one representative correspondent on the Internet, the worldwide computer network.

More than 30 major electronics companies, including AT&T and IBM, signed a letter to the president that said, in part, "While we recognize the importance of authorized national security and law enforcement needs, we believe that there are fundamental privacy and other constitutional rights that must be taken into account." An editorial in Communications Week observed: "This isn't the first time that the government has proposed an authoritarian scheme that goes after a few people's crimes while stomping on the majority's civil liberties."

The controversy over the Clipper-chip plan is the latest skirmish in a 20-year struggle between people who consider strong, widely available computer codes a guarantee of privacy and people who consider them a threat to national security and domestic tranquility. For intelligence and law-enforcement agencies, unbreakable codes in private hands mean that spies and criminals would be free to plot their nefarious activities without fear of detection. So ever since cryptography emerged from the secret world of the Pentagon and the National Security Agency in the mid-1970s, the government has sought to restrict what the public knows about new coding schemes and to hamper their general use.

Civil libertarians have consistently resisted measures that would make it easier to catch the bad guys by leaving all Americans vulnerable to government surveillance. The Fourth Amendment to the U.S. Constitution guarantees "the right of the people to be secure in their persons, houses, papers, and effects against unreasonable searches and seizures." What better way is there to be secure in your papers than to have them encrypted in a code that the government cannot read?

In many ways, this is a classic conflict between individual freedom and the demands of law enforcement. It's hard to predict how the political process would resolve the issue. But it may be too late for that. Technological developments and the diffusion of information may soon make the debate obsolete.

Almost as long as people have been writing things, they have looked for ways to keep secret what they have written. The problem with writing is that anybody can read it. So when people have had confidential information, they have tried to make sure that only the people they intended could read it, and no one else. This has led to the development of cryptographic systems—secret codes.

Until fairly recently, military applications of secret codes have predominated. Julius Caesar devised codes to communicate with his generals so that if a messenger were captured by the enemy, the message he was carrying would remain secure. In this century, the advent of radio made the problem more pressing. When information is broadcast through the air, it isn't even necessary to capture a messenger. The signals are there for any eavesdropper who wants to listen. Among the most significant developments of World War II was the Allies' breaking of both the German and the Japanese top-secret codes. Messages to the field from both Berlin and Tokyo were routinely read by Allied generals and war planners.

David Kahn relates the history of cryptography through the mid-'60s in his monumental 1967 book The Codebreakers. But shortly after Kahn's book was published, a very significant thing happened: Computers got very fast and very cheap. This had two consequences for cryptography. First, it made possible a new kind of computer-based code that for the first time was all but unbreakable. Second, the widespread commercial and personal use of computers created an entirely new domain for cryptography. It is no longer an arcane subject of interest mainly to spies and the military.

The modern era of code making got started in 1976, when Martin Hellman of Stanford University and Whitfield Diffie, now at Sun Microsystems in Mountain View, California, published a paper entitled "New Directions in Cryptography" in the journal IEEE Transactions on Information Theory. They described a radically new kind of code in which the key used to encrypt a message would be different from the key used to decrypt it. The encrypting key could be made public. Hence the name of this system: public-key cryptography. But the decrypting key would be kept secret, so only the recipient of the message could unscramble or decode it.

It's like having a mailbox with two keys, one for locking and the other for unlocking. Everyone can have one of your locking keys, which allows them to put a letter in your mailbox and lock it up. But only you have the unlocking key. Only you can open the mailbox and take out your mail.

These codes are based on what are called trap-door mathematical functions. As with a trap door, it is easy to go in one direction (falling through), but very hard to go in the other (climbing back out after you've fallen in). Shortly after Diffie and Hellman published their idea, three computer scientists at the Massachusetts Institute of Technology, Ronald Rivest, Adi Shamir, and Leonard Adleman, proposed a scheme to implement it. Their proposal (now called RSA after its inventors) is based on the fact that multiplication is very easy (especially for a computer), while factoring (determining what two numbers have been multiplied together to produce a given product) is very hard. That is the trap-door function. Easy one way, hard the other.

It is child's play for a computer to multiply two very large numbers—100 digits or more. But if you give a computer a 150-digit number and ask it what two prime numbers were multiplied together to produce it, the computer will have fits. There is no easy way to find the answer. Essentially, brute force is the only way to do it. That is, it has to try all of the possibilities.
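The asymmetry shows up even in a toy example. The Python sketch below uses deliberately tiny textbook primes (the specific numbers are illustrative, not taken from this article): multiplying p and q is instantaneous, and anyone who could factor the public modulus n back into p and q could recompute the private key, but with primes hundreds of digits long that factoring step is the part no one knows how to do quickly.

```python
# Toy RSA key pair built from deliberately tiny primes (illustrative textbook
# values, not from the article); real keys use primes hundreds of digits long,
# which is what makes factoring n hopeless.
p, q = 61, 53
n = p * q                      # 3233: published as part of the public key
phi = (p - 1) * (q - 1)        # 3120: computable only if you can factor n
e = 17                         # public exponent (the encrypting key)
d = pow(e, -1, phi)            # private exponent (the decrypting key), derived from phi

message = 42                   # a message encoded as a number smaller than n
ciphertext = pow(message, e, n)    # anyone can encrypt with the public pair (e, n)
recovered = pow(ciphertext, d, n)  # only the holder of d can decrypt

print(ciphertext, recovered)
assert recovered == message
```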

For a small number, factoring is relatively easy. If you were asked, say, what are the factors of 35, it wouldn't take long to come up with seven and five. But for very large numbers, factoring takes more time than even the fastest computers have. By some estimates, trying all the possible factors of a 150-digit number would take millions of years.
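Brute force itself is only a few lines of code, which is precisely the point. The trial-division sketch below (again an illustration, not anything from the article) finds the factors of 35 immediately, but the number of candidates it must try grows so quickly with the length of the number that a 150-digit product of two primes is entirely out of reach.

```python
# Brute-force factoring by trial division: a minimal sketch of the only
# generally reliable attack. Instant for 35, hopeless for a 150-digit number.
def smallest_factor(n):
    """Return the smallest prime factor of n by trying candidates in order."""
    candidate = 2
    while candidate * candidate <= n:
        if n % candidate == 0:
            return candidate
        candidate += 1
    return n  # n itself is prime

n = 35
p = smallest_factor(n)
print(p, n // p)   # prints: 5 7
```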

As long as the general factoring problem remains unsolved—and there is no indication, despite a tremendous amount of work on the subject, that anyone has a clue how to solve it—the RSA codes are unbreakable. Edgar Allan Poe, who was an amateur cryptographer, wrote, "It may be roundly asserted that human ingenuity cannot concoct a cipher which human ingenuity cannot resolve." Poe's statement was true until public-key cryptography came along. So far, these codes have proved him wrong. But if anyone figures out an easy way to factor very large numbers, these public-key codes will be useless.

In 1977, Martin Gardner wrote a "Mathematical Games" column in Scientific American describing the new work in cryptography and the prospect for unbreakable codes. No sooner had it appeared than the National Security Agency, the federal agency charged with creating U.S. codes and intercepting the encoded messages of other nations, declared that research on public-key cryptography could endanger the national security of the United States. It would be possible for foreign nations to discover that the codes they were using were insecure and to substitute new coding systems that the agency could not break.

At one point, the agency suggested that it would ask Congress to enact legislation treating research in cryptography like research in atomic energy. By law, all research on nuclear weapons is "classified at birth," which means that if you go into your garage and invent an atomic bomb—using no published sources, classified or unclassified—you may not publish your findings. (The courts have upheld this seemingly unconstitutional prior restraint on publication, arguing that the threat of "thermonuclear annihilation of us all" justifies an exception to the First Amendment.) The NSA believed that publishing research on public-key cryptography would violate the International Traffic in Arms Regulations. There is obviously great military significance to having an unbreakable code.

In 1980 a public-cryptography study group was formed under the auspices of the American Council on Education. It included representatives of the NSA, the American Association of University Professors, the Pentagon, and mathematics and engineering societies. Among other things, this group struggled with how to differentiate theoretical ideas from actual cryptographic devices. Theoretical ideas cannot be stopped, either legally under the First Amendment or as a practical matter. Ideas do not respect national boundaries. But cryptographic devices are subject to the regulations on trafficking in arms.

The study group, which was chaired by Ira Michael Heyman, then chancellor of the University of California at Berkeley, called for all researchers in cryptography to voluntarily submit their papers to the NSA for review before publishing them. It was one of the first times that members of the academic community, normally committed to the free and open exchange of ideas, had recommended such restrictions. They undoubtedly felt the heat of the NSA's threat to seek federal legislation if academic cryptographers did not voluntarily restrain themselves.

But one member of the study group, George Davida of the University of Wisconsin, dissented from the recommendation, warning that privacy was at stake, that computer networks were "electronic windows into the most intimate details of people's lives," and that "a university, to be effective, has to be free."

In the 13 years since the study group made its recommendation, the issues have remained largely the same: privacy on one side and national security/law enforcement on the other. Significantly, the study group's recommendation for voluntary submission of cryptography papers to the NSA has been almost totally ignored. Nonetheless, the government did not carry through on its threat to seek legislation authorizing prior restraints on the publication of papers on cryptography, and many such papers have been published. In the last dozen years or so, cryptography in general and public-key cryptography in particular have been the subject of very lively debates in the technical literature and the not-so-technical literature. (Of course, it is hard to know what papers have not been published or have otherwise been withheld.)

The problem that the study group wrestled with—distinguishing theory from practice—remains unresolved and is a serious sticking point in all efforts to control knowledge. Twenty years ago, only a handful of mathematicians were interested in the factoring problem. Now, because of the significance of factoring for the security of these codes, there are probably hundreds of mathematicians working on factoring. Suppose one of them figures out how to do it. Would the government try to prevent publication of that information? After all, it would be a result in pure mathematics. To be sure, it is a result that would have important national-security implications. But the prospect of the government trying to prevent people from finding out that the factoring problem had been solved is ludicrous. It would be like the government trying to keep people from knowing that the angles of a triangle add up to 180 degrees, or that the earth goes around the sun.

Despite the difficulty of controlling the spread of knowledge, the government has tried hard to prevent the export of cryptographic devices or of programs that use sophisticated encryption algorithms. It is technically illegal to take versions of some very popular computer programs, including the Norton Utilities, out of the United States. But these efforts have proved largely ineffectual. You can buy a disk containing a good public-key cryptosystem in software stores in Moscow.

Here again, the problem is that it is all but impossible to restrict the flow of knowledge. In the era of the Internet, barring people from physically taking information out of the country is no bar at all. Digitized data moves freely by satellite. And even when it doesn't, it is virtually impossible to prevent anyone from walking into a software store in the United States, buying encryption software on a floppy disk, and then putting it into a suitcase.

Over the years, the NSA has been very vocal on these matters. It has consistently opposed the adoption of public-key cryptosystems, preferring more conventional coding schemes that the agency can crack. The government has consistently endorsed a single-key encryption system called the Data Encryption Standard, which has been the source of much controversy over the last 15 years, though no one has been able to show that it can easily be broken. Because of the government's endorsement of the Data Encryption Standard, many banks and other commercial enterprises now use it, fearing, perhaps, that insurers would balk if they abandoned a cryptographic system that had the government's imprimatur.

Critics of the NSA's policy argue that restrictions on cryptographic systems put U.S. companies at a disadvantage relative to foreign companies, which are free to sell whatever cryptography their customers want. The demand for secure encryption will not go away, and the business will simply flow to any country that can meet the need. Why should the United States, which has a commanding lead in this technology, cede a business that is expected to be worth $5 billion in a few years to other parts of the world?

In addition to the export restrictions, the FBI wants Congress to outlaw the domestic use of unbreakable codes in electronic communications. The bureau argues that it needs the ability, once it has obtained a warrant, to listen in on the conversations of terrorists, organized crime figures, spies, and other malefactors. "We would not be able to succeed without the ability to lawfully intercept," says a bureau spokesman in Washington.

This is a contemporary version of the old argument that the only people worried about privacy are those with something to hide. In this view, only criminals need to be concerned. But even if Congress passes the legislation that the FBI wants, is it likely that bad guys who are breaking other laws would accede to the new law banning unbreakable codes? Or, as an editorial in the trade journal Network World put it recently, "What crook is going to use a technology that has a trapdoor to the Feds?"

The Clinton administration's Clipper-chip proposal was an attempt to reconcile law-enforcement and privacy interests. Assuming it can get its encryption standard widely adopted, the government promises not to release the keys unless and until a law-enforcement agency gets a valid search warrant. "The process is replete with checks and balances," the FBI assures us.

Dorothy Denning, a leading cryptographer, agrees that concerns about the proposal are overblown. "The argument for a viable wiretapping capability is not an argument about absolutes," she wrote recently. "It is time to acknowledge that '1984' was fiction and recognize that our extensive system of government checks and balances has worked exceedingly well in preventing and exposing abuses."

Nevertheless, the prospect that the government would have access to any encrypted information it wanted to see leaves many people feeling insecure. The impetus for encryption in the first place stems from a desire to keep out all prying eyes, including the government's. Washington's promise that all of this will be closely monitored and controlled does little to allay those concerns.

Opponents of the Clipper chip are also concerned that the government will not allow independent cryptographers to inspect the basic plan for encryption and decryption. Everyone simply has to accept the government's assurance that the cryptographic scheme is very strong. Rumors abound that the NSA has built a trap door into the Clipper chip that would enable it to read any messages, with or without a court order. It's hard to say whether this is a legitimate concern. But it's a second example of the government saying, "Trust us." Many people would rather use public-key cryptography, which does not require them to trust anybody or to decide whether the government is trustworthy.

This is why many cryptographers became interested in secret codes in the first place. They were concerned about privacy, and they were fascinated by the possibility of keeping information in a way that no one—least of all the government—could see it.

Ultimately, it's unlikely the government can prevent the spread of information and knowledge, regardless of what it decides to do. Washington can force government contractors to use the Clipper chip and not use any other encryption scheme. But as a practical matter, it cannot prevent individuals from using whatever encryption scheme they want. For those who want the strongest encryption possible, RSA public-key cryptography is the system of choice. For better or worse, the genie is out of the bottle.

Lee Dembart is a longtime journalist, science writer, and editorial writer at The New York Times and the Los Angeles Times who has written extensively about computers, mathematics, and public policy. He recently graduated from Stanford Law School.