Civil Liberties

The Law: Chilled Prodigy

An "anti-censorship" libel ruling threatens choice in cyberspace.

The law often demands that we sacrifice some liberty for greater security. Sometimes, though, it takes away our liberty to provide us less security. That's what a New York state court just did, and the U.S. legal liability system may have just claimed another victim: the on-line community.

In October 1994, a message appeared on a bulletin board in the Prodigy on-line service. "The end of Stratton Oakmont [a small stock brokerage] will finally come this week," the message said. "This brokerage firm headed by president and soon to be proven criminal—Daniel Porrush [sic]—will close this week." The message described a particular transaction in which Stratton was involved, and declared, "This is fraud, fraud, fraud and criminal!!!!!!!" Who wrote the message is a mystery. Though it seemed to come from a particular user's account, it turns out that someone else was using that account without permission.

Stratton naturally took umbrage, and sued Prodigy for libel, asking for $100 million in damages, plus another $100 million in punitive damages. And on May 24, the trial judge ruled that the lawsuit could proceed: Even though the posting, like all bulletin board postings, was made without Prodigy's prior knowledge, the judge held that Prodigy could be held liable for what the posting said.

Under the law of libel, it's clear that someone who republishes someone else's defamatory statement—for instance, a magazine that prints an article by an outside writer—can be held legally responsible. One can debate the wisdom of this rule, but it makes at least some sense. If my defaming you violates your legal rights, then when REASON broadcasts my defamation to 50,000 subscribers, your injury is aggravated. REASON seeks to profit from my article; REASON has the opportunity to check the article and not run it if it appears to be false; it's at least arguable that REASON should be held responsible for the injuries that its contents inflict.

But it's equally clear that people who merely redistribute defamatory statements can't be held liable. When a newsstand sells a copy of REASON, it's helping circulate its contents, including whatever nasty things I say about you; but the newsstand isn't liable because of this. Likewise, say someone gets a copy of REASON by modem. The phone company is in a sense republishing the text, but you can't get money from the company for its role in ruining your reputation. The law recognizes that no newsstand or phone company can be expected to read and verify every publication that it distributes. Imposing liability in such a situation would be both unfair and chilling to free dissemination of speech.

On-line services are decidedly closer to the phone company or the newsstand than to the magazine. Tens of thousands of messages are posted to the services daily; the services can't read them all and verify their accuracy. In the 1991 Cubby v. CompuServe case, a federal trial court recognized this and held that an on-line service couldn't be liable for defamation unless "it knew or had reason to know of the allegedly defamatory statements." Any alternative rule, the court said, "would impose an undue burden on the free flow of information."

So far, then, so good: Coming into the litigation, Prodigy could have expected that, like CompuServe, it would be immune from liability. What happened?

Well, the trouble was that Prodigy took some steps to oversee the bulletin boards that it operated. At least at one point, Prodigy used a software screening program that automatically rejected bulletin board postings containing offensive language. Prodigy also announced to its users that they shouldn't post notes that are insulting, harassing, or in bad taste. Prodigy employees scanned the bulletin boards looking for such messages, and the Prodigy software allowed those employees to delete messages that didn't fit the guidelines. "Prodigy is committed to open debate and discussion on the bulletin boards," Prodigy said, but "this doesn't mean that 'anything goes.'"
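For readers curious about the mechanics, here is a minimal sketch of the kind of keyword-based screening the court described. It is purely illustrative: the word list, function name, and matching logic are my assumptions, since the opinion doesn't disclose how Prodigy's program actually worked.

```python
# Hypothetical sketch of keyword-based message screening, in the spirit of
# the program described above. Not Prodigy's actual software; the word list
# and matching rules here are invented for illustration.

BLOCKED_WORDS = {"jerk", "idiot", "moron"}  # placeholder "offensive language" list

def accept_posting(text: str) -> bool:
    """Return True to accept a posting, False to reject it automatically."""
    # Normalize: lowercase each word and strip surrounding punctuation.
    words = {word.strip(".,!?;:'\"()").lower() for word in text.split()}
    # Reject if any normalized word appears on the blocked list.
    return not (words & BLOCKED_WORDS)

if __name__ == "__main__":
    posting = "This is fraud, fraud, fraud and criminal!!!!!!!"
    print("accepted" if accept_posting(posting) else "rejected")
```

Note what such a filter can and cannot do: it rejects postings for their vocabulary, not their veracity. The message at issue in this case contains no profanity at all, so screening of this kind would have passed it through untouched, a point that matters later in the argument.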

This, the court explained, is what differentiates Prodigy from CompuServe. Prodigy controlled what was said on its bulletin boards: "That such control is not complete and is enforced both as early as the notes arrive and as late as a complaint is made does not minimize…the simple fact that Prodigy has uniquely arrogated to itself the role of determining what is proper for its members to post and read on its bulletin boards." And because of this control, for the purposes of libel law, "Prodigy is a publisher rather than a distributor." The court concluded that "Prodigy's conscious choice, to gain the benefits of editorial control, has opened it up to a greater liability than CompuServe and other computer networks that make no such choice."

The court's reasoning, though, is rather puzzling. After all, even traditional distributors, such as newsstands and bookstores, do arrogate to themselves the role of determining what's proper for their customers to read. A newsstand might choose not to carry pornography. Some bookstores refused to carry Bret Easton Ellis's novel American Psycho. Some stores specialize in libertarian books, others in communist ones. All newsstands and bookstores select their stock based on some criteria, be they popularity, topicality, nonoffensiveness (or offensiveness), or what have you.

Furthermore, the only way that Prodigy is like a magazine publisher is that it has some control over what's posted on it. But the degree of control is vastly different. I assure you that Virginia Postrel (or someone who works for her) has done more to this article than run it through a profanity-checking program. Some human being at REASON has approved every article that REASON carries—not just reserved the right to approve it or disapprove it, but has actually read it and decided whether to run it or not. Not so for Prodigy and the 60,000 messages that are posted every day to its bulletin boards.

Moreover, Prodigy doesn't purport to check messages for accuracy; it just checks them for offensiveness. Expecting Prodigy to read every message is unreasonable enough; expecting it to fact-check every message makes no sense at all. Why should Prodigy's willingness to maintain some standards of taste translate into a responsibility for its users' factual errors?

What justification could there be, then, for this puzzling distinction between CompuServe (which does no screening) and Prodigy (which does only the limited screening that the large volume of messages allows)? Why should a small degree of extra oversight translate into a vast increase in liability?

Well, the court never directly answered these questions, but it did give one suggestion: Prodigy's screening policies themselves, in the court's view, were troublesome (even if not illegal). "It could be said," Judge Stuart Ain wrote, "that Prodigy's current system of automatic scanning, Guidelines, and Board Leaders may have a chilling effect on freedom of communication in Cyberspace, and it appears that this chilling effect is exactly what Prodigy wants, but for the legal liability that attaches to such censorship." Prodigy's willingness to restrict what's said makes it, in the court's eyes, a censor. If Prodigy does this, it must "accept the concomitant legal consequences."

It is here—in its blithe assertion that private screening and oversight are "censorship" to which there must be "legal consequences"—that the court made its greatest theoretical error. Private screening is not some shady practice employed by nasty thought controllers. Rather, it can be a positive good: It can add value to information by filtering out what people don't want to see.

This is most obvious when the oversight focuses on particular subjects or particular viewpoints: The reason you're reading this magazine is that you like the screening that its editors perform. But screening out profanity or abuse is valuable, too. People don't just want raw information. They also want a sense of intellectual community—the exchange of ideas among people who are on friendly, polite terms.

Insults and profanity can destroy that community, can make the discussion seem more like a war zone than a living room. (This has happened with some frequency on many unregulated cyberspace forums that I've seen.) And when this happens, many people just drop out. People who've been "flamed" enough times, or have seen others flamed, may find that the emotional costs of participating exceed the benefits.

If the government tried to protect people by prohibiting vulgarity on all services, or by ordering that all insults be promptly deleted, that certainly would be censorship. But Prodigy is just a private organization that lets people voluntarily enter into a private community, a community—much like a private club or even a private home—in which certain norms of behavior are enforced.

Before the Stratton Oakmont v. Prodigy decision, people had a choice: They could participate in an unrestrained forum, such as CompuServe, or a more restricted one, such as Prodigy. But if this decision stands (Prodigy is planning to appeal it), people will lose this choice. Faced with the risk of $200 million lawsuits, Prodigy and similar services will almost certainly drop their screening facilities. Screening may be a valuable feature, but it's not valuable enough to justify the tremendous increase in legal peril.

So what have we gained as a result of the court's decision? Not security: You and I are no more protected now against on-line defamation than we were before. Because a service can escape liability simply by letting all the messages through, most services will do exactly that, and future libel plaintiffs will be out of luck. And where before we might have gotten some protection from the service screening out or deleting the worst of the abuse, now we won't even get that.

Nor have we gained liberty. Sure, we might now be able to post offensive messages on Prodigy, though if we wanted to send offensive messages we could have done that all along using CompuServe, or any of a number of Internet service providers. But if some of us would prefer to participate in a more temperate conversational environment, instead of a free-for-all, no one will be able to safely provide it for us. And if flaming and abuse scare people away from the on-line services, all of us will lose the advantages of those people's contributions.

The great virtue of the free market is that it tends to give people what they want, not what the government thinks they ought to want. Those who want an anarchic on-line environment could get it; those who want a more peaceful community could get it, too. The law, though, is inherently Procrustean: When the court speaks of the "freedom of communication in Cyberspace," that's freedom of communication only on the terms that the court prefers. Another lose-lose proposition, brought to you courtesy of our legal liability system.

Eugene Volokh (volokh@law.ucla.edu) teaches constitutional law and copyright law at UCLA Law School.