The Volokh Conspiracy
Mostly law professors | Sometimes contrarian | Often libertarian | Always independent
I'm blogging excerpts this week and next week from Mark Lemley's and my new article, "Law, Virtual Reality, and Augmented Reality" (click on the link to see the whole article, including footnotes). This post is from the criminal law part; we began with disturbing the peace and are now turning to other crimes.
One general assumption we make throughout: VR and AR technology (think something like Google Glass) will get better, cheaper and more effective at rendering lifelike human avatars that track the user's facial expressions. I think that's a safe assumption, given the general trends in computer technology, and one that doesn't require any major technological breakthroughs; but we are indeed prognosticating about technology that's likely at least a couple of years out.
* * *
Consider another crime, this one visual rather than (like disturbing the peace) aural: indecent exposure.
There you are, minding your own VR or AR business, and you see this avatar a few feet away from you—and he's naked. Plus he's unusually well-equipped; if you're going to have an avatar, why settle for mere realism? Or maybe he's naked and deliberately grotesque. (Two penises?) Or maybe he's masturbating. Or having sex with someone.
You avert your eyes, but he pops right in front of you, wherever you look. And this might happen even when you aren't practically able to leave—for instance, if your in-VR job requires you to be "present" in that particular VR "location."
If this were happening on a street, the exhibitionist would probably be arrested for indecent exposure or public lewdness. [Footnote: He might also be deterred by social convention, or perhaps by the sense that he doesn't look that good naked. But in VR, he can look as good (or as grotesque) as he wants, and he doesn't have to show his real face. There may be immediate and temporary social sanctions—for instance, if he goes naked into a VR shop, he might get ejected—but then he can just quickly change his avatar to something clothed, and then change it back when he's done shopping.] But whether this law can be applied in VR turns out to be surprisingly complicated.
The Supreme Court has held that public nudity may be banned even in strip clubs, where the patrons pay money to see such nudity. But the court has also held that the First Amendment protects public displays of films containing nudity, even on drive-in theater screens visible from the street, where unwilling drivers and pedestrians may see the nudity (moving, in color, 20 feet high).
Even outside VR, this can be confusing enough that a Michigan appellate court has upheld an indecent exposure conviction for a man displaying his penis on a public access cable television show that he produced. This seems inconsistent with the drive-in case, but it may just reflect a deeper inconsistency between the drive-in case and the public nudity cases.
This gets even more complex when we go beyond video of nudity to video of sexual behavior. If the video is obscene, then it can theoretically be punished even when the viewers are consenting. And even material that is not outright obscene but is nonetheless "obscene-as-to-minors" might still be punishable when it is deliberately shown in public places where minors may be present. But the court has held that the government can't ban such obscene-as-to-minors material online, even in places that minors can access, because a less restrictive alternative is to have parents use filtering software to shield their children, if they so wish.
Perhaps the drive-in case and the public nudity case, though, can be reconciled: Public nudity is viscerally perceived as real and immediate in a way that a video display is not, the theory would go; and public nudity thus evokes reactions from which the law can legitimately protect people.
If that's so, public nudity in VR and AR becomes a harder case. After all, nudity in VR is technically display of video (as in the drive-in case) but also functionally aimed at emulating in-person presence (as in the public nudity cases). And while the avatars so far are relatively cartoonish, it won't be long before a nude VR avatar—normal size, with normal movements, seemingly standing next to you—feels a lot more like a physically present person than it does like a picture on a screen.
One reason the law forbids indecent exposure is that such public nudity may lead some observers to worry that the exposer may move on to sexual assault. That is a serious worry when the exposer is physically nearby, but not when the exposer is present only virtually. Nonetheless, unwanted exposure to others' nudity may cause feelings of unease even when it is logically clear that no in-person assaults are possible. So whether we should be more worried about indecent exposure in VR may depend on whether we think the primary focus of the law is on the unease that it creates among passersby, or on public indecency as a proxy for future physical attack.
But this legal conundrum is likely to stay academic. First, we're back to the Bangladesh Problem. How many police departments would relish the prospect of trying to extradite someone from a foreign country, or even another state, because his online avatar is nude?
Second, as with loud avatars, VR users may be able to protect themselves from unwanted nudity in many circumstances. VR environments can easily be designed to let users change how others' avatars appear to them. "My avatar," after all, is just a visual image that I would like to present in displays that come up on others' VR goggles, communicated through the VR software on central computers and on other users' computers. Those users don't have to perceive me as the avatar I chose.
They could, for instance, substitute another avatar; if my avatar is Adolf Hitler and they don't like it, they could substitute Mahatma Gandhi (or vice versa). Or they could just edit the avatar: If my avatar is naked and they don't like it, they could color it solid green, or perhaps solid green except the face (software permitting, but this should be easy software to develop). Conversely, if they'd like to see more nudity, they could replace my avatar with whatever naked version—again, whether attractive or grotesque—they prefer. [Footnote: The VR operator might also let the VR store outside which the nude avatar—or the screamer—is standing exercise some control over such behavior.]
Indeed, they could probably use a program that automatically blacks out all the naked parts of naked-seeming avatars. Or the operator could require people who select a nude avatar to also provide a nonnude version, so that people who prefer to avoid seeing nudity can select that with just one global switch. This might be useful if the automated editing yields results too crude for an enjoyable VR experience, or if the operator wants to minimize even the initial unwilling exposures to nudity.
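To make the mechanism concrete: the substitution and filtering described above could run entirely on each viewer's own machine, as a lookup applied before anything is rendered. Here is a minimal sketch in Python; every name in it (the avatar fields, the preference options, the fallback model) is hypothetical, since real VR platforms would expose their own avatar and rendering APIs.

```python
# Hypothetical sketch of viewer-side avatar filtering. No real VR platform's
# API is assumed; all field and function names are illustrative only.

from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Avatar:
    user_id: str
    mesh: str                   # the 3-D model the sender wants displayed
    flagged_nude: bool          # set by the sender, operator, or a classifier
    nonnude_mesh: Optional[str] = None  # operator-required clothed variant

@dataclass
class ViewerPrefs:
    hide_nudity: bool = True
    # Wholesale swaps chosen by this viewer: user_id -> replacement mesh
    substitutions: dict = field(default_factory=dict)

def resolve_avatar(incoming: Avatar, prefs: ViewerPrefs) -> str:
    """Return the mesh this viewer's client should actually render."""
    if incoming.user_id in prefs.substitutions:
        # Viewer replaced this avatar entirely (Hitler -> Gandhi, or vice versa)
        return prefs.substitutions[incoming.user_id]
    if incoming.flagged_nude and prefs.hide_nudity:
        # Prefer the sender's clothed variant; otherwise use a local stand-in
        return incoming.nonnude_mesh or "generic_clothed_avatar"
    return incoming.mesh
```

The key design point is that the sender's chosen avatar is just a request; the final image is composed on the viewer's hardware, so the viewer (or the operator's one global "no nudity" switch) gets the last word.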
Now let's play out again a conversation with the police, focusing on how this technologically enabled self-help might affect their decision.
"There's this avatar standing in the VR park, and he's completely naked!"
"Why don't you just hit the 'dress up the avatar' button?" the police officer asks. (Again, we assume an officer who knows something about VR.)
"I shouldn't have to do that!" you say. "He's violating the law, and it isn't up to me, the victim, to try to avoid that."
And that's a plausible argument, in theory; as you point out to the officer, "After all, 'To say that one may avoid further offense by turning off the radio when he hears indecent language is like saying that the remedy for an assault is to run away after the first blow. One may hang up on an indecent phone call, but that option does not give the caller a constitutional immunity or avoid a harm that has already taken place.' Justice Stevens said that, you know. In FCC v. Pacifica. Same for nudity as for vulgar language."
But our police officer is not a theorist. "Are you telling me that you could have avoided this problem by clicking on a button," he says, "and you're bothering me? I have real crimes to deal with—ones in which the victims really need me to do something that they can't do for themselves."
Or, if the officer is a theorist, perhaps he is one of the economic rather than deontological variety. "You are the cheapest cost avoider here," he says. "You can avoid the unwanted nudity with just a few clicks, whereas I would have to go through much more effort to prosecute it. I know that the criminal law does not usually formally focus on that; but, practically, it makes me reluctant to give your call a high priority."
Now of course there are limits to this "you should have avoided the problem yourself" argument. Presumably if the crime is more serious—say, burglary—the police wouldn't just say "your own fault for having left your front door unlocked, we won't investigate the case." But for minor enough crimes, and ones where the main worry is prevention going forward, the police are unlikely to invest many resources into such prevention when citizens can more effectively prevent the problem themselves.
And this tendency only increases as a result of the Bangladesh Problem. As arrest and prosecution become much more expensive for the police, and technologically enabled self-protection simultaneously becomes less expensive for citizens, the police are likely to become less interested in intervening, especially in cases that don't seem to them to involve any "real" harm.
UPDATE: Note that, outside VR, indecent exposure law applies on private property (e.g., at a shopping mall) and not just on public property. To be sure, in many states nudity is allowed in certain places where the owners allow it, such as strip clubs (though some states bar nudity even there).
But on publicly accessible private property whose owner doesn't allow nudity, nudity is more than just a violation of the rules that can lead to ejection (and to a trespass prosecution if the naked person doesn't leave). Rather, it is itself a separate crime. A flasher at a shopping mall can be prosecuted, not just kicked out. It may well be, for reasons that we're discussing here, that VR should focus more on technologically enabled self-help; but that's a departure from traditional indecent exposure rules, not an application of those rules.