Nowhere to Hide?


The Transparent Society: Will Technology Force Us to Choose Between Privacy and Freedom?, by David Brin, Reading, Mass.: Perseus Books, 378 pages, $25.00

New technologies are disrupting society's evolved checks and balances. Advances in computing lead to low-cost encryption, which gives privacy to our online communications and finances. Advances in sensing lead to low-cost surveillance, which tracks our movements, actions, speech, and physical purchases. As technologist Mark S. Miller and legal scholar David Friedman both point out, our formerly unified world of partial privacy is splitting in two: an online world with much more privacy and a physical world with much less.

The knee-jerk reaction from most of us is that the first change is good and the second is bad: We want more privacy, everywhere. David Brin's goal is to convince us of the opposite. He succeeds only halfway–but that half is brilliant.

Brin's credentials for the task are appropriate and unusual. As a former practicing scientist, he can understand the basics of these new technologies. As a science fiction author, he has experience in projecting their effects into the future.

The book's message on surveillance is: "The cameras are coming; they will soon be everywhere. Complaining won't stop them, so we might as well turn them to our benefit." Once seen only in banks, then in retail stores, cameras are now sprouting along the roads. Tiny cameras used to be so expensive that only spies used them. Soon they'll cost so little that each of us can afford to have many: "surveillance technology for the rest of us." Sensors, microphones, cameras everywhere.

This prospect makes practically everyone nervous. The political right wants economic privacy, to discourage redistribution. The political left wants "lifestyle" privacy, to discourage morality-based interference. Libertarians want both kinds of privacy. All understand that privacy is vital to protecting their preferred areas of freedom–what the other side doesn't know about, it can't prevent.

Brin advocates another path: protecting freedoms through openness and accountability instead of secrecy. Openness–letting the data flow where it may–promotes personal responsibility in both economic and social arenas. This in turn decreases the temptation to have the state intervene. By aligning individuals' incentives with the results of their actions, accountability gives us more robust freedom than privacy alone can ever guarantee.

As Brin emphasizes, this is not a new insight. Accountability has always been part of what makes societies work. He makes the case for openness in a relatively nonpolitical way, often emphasizing its usefulness in simply protecting our physical safety.

This usefulness is increasing, not decreasing. After decades when external threats of mass destruction seemed far away and internal ones were nonexistent, Aum Shinrikyo's sarin gas attack in Japan reminds us that chemical and biological weapons are smaller, cheaper, and easier to hide than old-style weapons. It used to be sufficient to fly planes over enemy countries to look for weapons facilities. Now we are going to have to pay different kinds of attention, in more places, to head off disaster. Arms control and personal privacy issues were once easy to separate. Now they're blurring together.

This is going to take some getting used to. We're all in favor of "personal empowerment," but it can be taken too far. Your neighbors have a startup company in their garage–they say–but it could be something else. So far we've been lucky in the United States: Our cultists generally choose to kill only themselves, not arrange to take guests along as a favor. Brin points out that one major incident could change American attitudes toward privacy overnight. We'd better figure out in advance how to tame this new world of tiny weapons, or we could lose both privacy and freedom when demagogues–democratically elected by panicked voters–implement their safety nazism.

The problem is that "accountability" is not as sexy a rallying cry as "privacy." The importance of accountability–a.k.a. fairness, justice–is so deeply ingrained in us that we rarely bother to discuss it. Children are taught the fairness principle very early, and then spend the rest of their lives learning the bitter fact that few areas of life are fair. To avoid sounding naive, we soon learn to stop even using the word.

But as concepts, fairness and accountability underlie our entire legal system. They are a large part of what makes nonviolent interaction possible, what enables us to plan. The most basic rule is that if you hurt someone or his possessions, you must pay, or have a very good explanation of why you shouldn't have to. You are responsible for your actions and can be called to account.

To see the benefits of accountability, consider what happens when we can't get it. If we can't identify who caused a given problem, responsibility cannot be assigned and becomes diffuse: an externality, a tragedy of the commons, a "social problem." Every such problem is "our" problem, causing disputes about what we–usually, the state–are going to do about it.

It's hard to overestimate the costs of our current situation of poor accountability. Consider the drug war: Unknown drug users inflict costs on society, giving the political right an excuse to implement drug prohibition. The result is the same as the last time we tried alcohol prohibition: disrespect for the law, an increase in gangs and organized crime, and the corruption of law enforcement, which could even spread into the military to the extent that it is required to perform interdiction. The corruption goes beyond the United States, dragging down countries far poorer than we are. It took 13 years for us to realize that alcohol prohibition was causing more problems than it solved. Today we focus on accountability for alcohol users: Cheap alcohol detection, combined with social pressure, has reduced drunken driving.

Accountability problems cause noxious side effects that hurt everyone. When an offending individual cannot be identified, sometimes we blame an entire group, resulting in prejudice. The unfairness of this causes outrage, leading to demands that the blamed group be compensated: "social justice." Both sides of the argument try to get their views embodied in laws which turn out to be both intrusive and costly to individuals; youth curfews are an example. If instead we enforce accountability–actual justice for individuals–we can reduce this cynical, damaging groupism.

Not only humans are damaged by lack of accountability. The tragedy of the commons applies equally to the environment. Today we try to protect the environment with crude regulations that inflict high costs even where they aren't necessary, such as California's requirement that every underground gasoline tank be replaced because some leak. Large companies can absorb the cost, but independents are being driven out of business. With cheap sensors, we could have accountability: If your tank is deteriorating and about to leak, you must replace it.

In all these cases, what we need to make more just decisions is better information. That's what the cameras, microphones, and sensors could give us–assuming we can find a way to ensure the new information is handled properly. Brin's proposal: Instead of putting law enforcement agencies in charge, put the people themselves in charge. Distribute all publicly gathered information to everyone. Again, it's an unnerving thought, but possibly less unnerving than the alternative.

One option that isn't available is not collecting the data. When the needed instruments are so small and cheap, the data will be collected by many. The only question is who gets the data and how it can be used, which brings us back again to the goal of privacy.

Brin points out that the right to privacy is not considered as strong as, say, freedom of speech. Privacy is a fairly recent invention. Only with the rise of cities, and wealth enabling large houses, have we had any privacy worth mentioning. In tribes, villages, small towns–in most human environments throughout our history–we knew our neighbors' business in detail. The anonymity we enjoy, or don't enjoy, in today's cities may well turn out to be a short-term condition. Certain standards of behavior were drummed into us over those long millennia, but anonymity erodes them, as today's urban incivility illustrates; accountability may be returning just in time.

Privacy extremism leads to perverse results. European privacy regulations on what data may be stored in computer databases are so restrictive that they are often ignored entirely. When privacy concerns dominate, they lead to restrictions on others' right to see, remember, and share data–this last being a form of speech. As Brin points out, gathering and connecting data is a central part of human nature. "Information wants to be free," so they say, but it's even more true that for us, information wants to be correlated: It's what we do. Trying to inhibit this human drive–knowledge prohibition–is doomed to failure; it will merely drive knowledge underground.

On these drawbacks of privacy, and the benefits of openness, The Transparent Society is both convincing and relatively nonpolitical, thereby reaching far more readers than it would if positioned as politically right, left, or libertarian. This moderate perspective is strengthened by repeated references to history, to the value of maintaining systems that have been leading to improvement for some time now–evolved systems. By the end of the book, the reader is better able to think calmly about the coming surveillance technologies and how we can not only tolerate but even benefit from them.

Brin is less moderate, and less convincing, in his coverage of encryption technology. He is particularly harsh in his treatment of encryption advocates such as the "cypherpunks," stooping frequently to name calling and straw-man attacks. Whereas Brin accepts surveillance technology as inevitable and mostly good, he regards encryption technology as close to evil, and those who promote it he usually portrays as extremists.

Given Brin's preference for openness, it's tempting to see this as simply an emotional reaction to a technology that happens to push in the opposite direction, toward privacy. But we should look first for a deeper explanation, and the book gives us a clue.

We all have a gut sense that privacy is important to head off interference: from the left in economic matters, from the right in personal morality matters. There are entirely too many laws and regulations already in place that affect us–so many that we don't know what most of them are. The more aware we become of this, the more important privacy is to us.

Brin's response to this concern is that with enough openness, everyone will be seen to be in violation of something. This situation of mutually assured guilt will, in theory, protect us from prosecution and, over the long term, cause pressure for the laws to be reformed.

This makes sense, as far as it goes. But we have to live now, in the short term, and today laws are enforced unevenly. Brin should forgive us for preferring that the offending laws be reformed first, before the whole world finds out–possibly before we ourselves do–that we are in violation of some law. For those who don't believe we have reason for concern, here's an example: I recently had the fun of calling a friend to let him know that he is a felon. This home-owning, taxpaying family man–one of the kindest, most ethical people I know–owns an art object which consists of a decorated walking stick with a pretty little sword inside. Imagine his surprise to learn that mere possession of such an object is a felony in California.

If he can be an unwitting felon, it could happen to anyone. We're told that ignorance of the law is no excuse, but that principle only works when laws are few and simple. Otherwise, trying to abide by the law becomes a time-consuming and risky guessing game.

Brin complains that the cypherpunks–encryption activists–regard government as the sole threat to their freedom. He sees other potential sources of trouble, such as corporations. But for the baby boom and later generations, nongovernmental entities have not been a big problem–unless those entities get control of some state power. Coercive interference has mostly come via government: telling them what they may and may not put in their mouths; whom they may hire and how much they must pay; what products they may and may not sell and to whom they may sell them; what pieces of paper they must buy to be allowed to provide what services–the rules are nearly infinite and often serve special interests, rather than the public interest.

Many of these rules don't make sense, at least for responsible grownups engaging in voluntary transactions. When technologists–a.k.a. engineers–such as the cypherpunks encounter a system like this one that is broken somehow, they don't just complain. They do what they are best at: They try to fix it.

But fixing systems from Washington, D.C., is almost impossible. Some heroic technologists, such as John Gilmore and Matt Blaze, do travel to Washington to try to help fix the laws, but it's a nauseating task. Bureaucrats and politicians live in a world of words, opinions, and spin control; hearing about today's technological realities is of little or no interest to them.

So it's natural that technologists should turn instead to ways of fixing a system that give faster, more reliable results. If the laws can't be fixed, they'll improve privacy so the laws don't matter as much. Brin, as a scientist rather than an engineer, does not sympathize with this intense desire to personally repair broken systems. He sees social systems as more like natural ones, evolving slowly on their own, not directly improved by individual human action–and in most cases, this is the right way to look at them. But it is the universal desire for fairness, for justice, that pushes each of us to try to improve social systems. This at least Brin should recognize and sympathize with, even in the cypherpunks.

There are some other problems with The Transparent Society. It's too long, redundant, sometimes overly dramatic. There are far too many I's, me's, and exclamation points! Brin makes few policy suggestions, and the ones he does make–such as a bit tax on encryption–seem dubious.

But these are minor points, and are compensated for by the author's continually emphasizing that he does not claim to have the right answers, and that the best way to find such answers is through vigorous discussion of the issues from many perspectives. Brin invites us to criticize him, and has made it easy to do so by allowing two key chapters to be posted online. You can join in this discussion–and actually "mark up" his writing with your own comments–at http://crit.org/openness.

The alternation in the book between points you agree with and those you disagree with, combined with so much repetition, may have you pulling your hair out as you read. It's enough to make one suspect that the author, educated at Caltech–a world leader in advanced practical jokes–has done this on purpose to infuriate all readers and thereby pull them into the debate. If so, it works.

Brin's book succeeds in being provocative and stimulating, as advancing technologies threaten to jerk us back and forth between danger and safety, domination and freedom.

Christine Peterson is executive director of Foresight Institute, www.foresight.org, and co-author with Gayle Pergamit of Leaping the Abyss: Putting Group Genius to Work (knOwhere Press). Her next book is tentatively titled Plain Talk about Technology, the Future, and Other Scary Topics.