The Supreme Court, "community standards," and the Internet
Chances are that if you're at all acquainted with the arcane legal territory known as "the law of obscenity," you know that in the United States such law is based on "community standards" that shift from one jurisdiction to another: What folks consider obscene—that is, without any redeeming social, cultural, or aesthetic value—in one place may be unobjectionable somewhere else. This is no small matter, as material considered obscene can legally be censored.
What you may not know is that a recent federal Appeals Court decision has called the entire "community standards" doctrine into question and that the U.S. Supreme Court has agreed to weigh in on the matter. This sets up the possibility of the first wholesale revision of obscenity law in decades. While there are reasons to be optimistic that the outcome will expand the realm of protected speech, there are also reasons to worry that we may end up with fewer speech rights.
Here's the background: In 1998, Congress passed the Child Online Protection Act (COPA), which is aimed at preventing minors from getting access to sexually explicit but otherwise legal material. COPA is based on the notion that the government has a role in preventing children's exposure to content that is legal for adults (that is, material that isn't legally obscene) but that nevertheless might be considered "harmful to minors" (sometimes known as "obscene for minors").
Soon after COPA became law, the American Civil Liberties Union challenged it in court, arguing that it unduly restricted First Amendment-protected speech. A U.S. District Court in Pennsylvania found that COPA ran afoul of the Constitution, and in June 2000 the 3rd Circuit Court of Appeals affirmed, holding that the law was too restrictive. The Supreme Court has agreed to hear the case in the upcoming term, which begins in October.
So why is the 3rd Circuit decision troublesome? In Miller v. California (1973), the Supreme Court came up with a way of dealing with so-called obscene content that got the high court out of the business of deciding at a national level what content is legal and what can be punished. Miller held that the definition of obscenity depends at least in part on the standards of local communities: Content that is acceptable in New York or San Francisco isn't necessarily going to be legal in Waco or Paducah, the court reasoned. At the same time, the court did carve out an exception for material that has "serious" literary, artistic, or other social value; such content, it ruled, can't be obscene no matter what local standards prevail.
Miller didn't address whether sexual content that's legal for adults is legal for minors as well, but subsequent federal and state court cases have suggested that "community standards" apply here, too. This led to the concept of otherwise legal content that may be "harmful to minors." Thus, parents in Manhattan, New York, may think little of allowing their children access to content that would appall parents in Manhattan, Kansas. As the 3rd Circuit correctly noted in its decision, the statutory scheme of COPA is wrapped around that concept of "harmful to minors," which itself depends on the notion of "community standards."
This, said the 3rd Circuit, is a big constitutional problem for Web site publishers. After all, when you put up a Web site, you can't tell who's going to access it or where they are—at least not with current technology. So even if you publish content on your Web site that isn't harmful to minors in your locale, you can't guarantee that a kid in a more restrictive community won't click his way to it. What COPA effectively requires, the court concluded, is that every Web site operator design his content so that it is acceptable—not "harmful to minors"—in every jurisdiction in the country, or else check everyone's I.D. at the door.
Either alternative is excessively burdensome on speakers—especially speakers such as the Buffy the Vampire Slayer fans who write erotic fiction featuring their heroine or the keepers of racy Web logs, whose content is not only legal for adults, but also clearly protected by the First Amendment. (COPA purportedly addresses only commercial Web site operators, but the definition of "commercial" in the statute is so broad that it arguably includes even nonprofit and hobbyist Web sites.) Based on this reasoning, the 3rd Circuit concluded that the "community standards" doctrine—at least when applied to the Internet—is itself unconstitutional.
On the face of it, the 3rd Circuit's dismissal of community standards on the Internet should please somebody. After all, the two basic sides in the various legal fights over sexually explicit material both dislike community standards. Civil libertarians find the doctrine too restrictive and variable, while social conservatives hate it because it allows too much to be published (plus, it has that annoying escape clause about "serious" value).
Despite such problems, the community standards doctrine has brought comparative stability to an aspect of law that had become increasingly contentious. That is the main reason you can expect the Supreme Court to think long and hard before dumping the doctrine, Internet or no Internet. That, in turn, explains why both sides in the upcoming Supreme Court case are wrestling with how to deal with the community standards issue: No one wants to ask the Supreme Court to do what it is plainly loath to do—make national standards and hence deal with every obscenity case in the country. But no one really wants to defend the community standards doctrine, either.
It's likely that the government, along with various social-conservative amici curiae such as the National Coalition for the Protection of Families and Children, will argue for a national "harmful to minors" standard, under which COPA could survive a constitutional challenge. But COPA's defenders may argue something different: that community standards are just fine as a way of regulating content, so now let's make the content providers come up with a technological fix—up to and including redesigning the way the Web works—in order to enforce those standards.
COPA's challengers, despite their earlier successes in this case, may want the high court to avoid the community standards issue altogether. After all, they got COPA struck down in the trial court because the law was judged unconstitutionally overbroad and vague in its regulation of speech protected by the First Amendment. Thus, they too may ask the Court to find other reasons to uphold the finding that COPA is unconstitutional. No need, they may argue, to revisit community standards doctrine this year.
But perhaps there's a way to revisit community standards doctrine and find something useful and pro-liberty in it. It helps to remember that when it comes to censorship, what lawmakers have been trying to preserve is not government authority to regulate content per se, but community integrity—the ability of groups to maintain the character of their public spaces without being confronted with, say, sexually oriented businesses such as adult bookstores and strip clubs.
In other words, lawmakers are interested in public spaces, not private ones. That the government has no authority to invade the privacy of a home to root out obscenity—regardless of community standards—has been established at least since the Supreme Court's 1969 decision in Stanley v. Georgia.
With that case in mind, it seems certain that whatever the social interest is in regulating obscene and harmful-to-minors content, it has to do only with public spaces—spaces outside the home. The government's power to regulate obscenity and harmful-to-minors content has been understood in the modern era to be a function of that community interest in protecting public spaces.
So when we look at community standards, what we're looking at is a right that is vested not in state and federal police officers, but rather in the citizens of communities themselves. Historically, that right has been delegated to the police, but the problem in the age of the Internet has been that police-centered content regulation—especially when it comes to content whose legality varies depending on what community you're in and what age the audience is—is too blunt an instrument. And retrofitting the Internet to make things easier for police is too frightening a prospect to accept.
The Internet is a decentralized medium, so it makes sense that the primary arbiters of what's seen on the Internet ought to be individuals, including individual parents. We Internet users tackle this problem in different ways—some use filtering software to enforce their choices, while others, eschewing what they see as the clumsy world of commercial filtering tools, rely simply on their own ability to choose where they go and what they and their children see on the Internet. It's not a perfect system, of course, but it's better than having the cops make their own judgments as to what may be "harmful" to our minors, and it's better than balkanizing the Internet technologically and legally in order to make it easier for the cops.
Is this user-centered approach to community standards a First Amendment framework that we all can live with? In a sense it always has been—it's precisely the system that we parents are used to enforcing in the offline world. All parents know their children will encounter things in the real world that they'd prefer they not see. What we parents have relied on, historically, has been our ability to instill our own community standards in our children—internalized values that remain with our kids when neither parent nor policeman is around.
In that sense, the Internet and the Web don't pose any new community standards problems—just a digital version of a very old one that we've been coping with for a long time.