The Volokh Conspiracy
Mostly law professors | Sometimes contrarian | Often libertarian | Always independent
Last week, I blogged about the criminal portions of Mark A. Lemley's and my new article, "Law, Virtual Reality, and Augmented Reality," which the University of Pennsylvania Law Review has just accepted for publication. (Click on the link to see the whole article, including footnotes.) This week, I wanted to turn to tort liability. First, the most straightforward stuff—direct tort lawsuits against offenders.
* * *
A. The causes of action
… In theory, [direct tort liability of offenders] might be possible in many of the circumstances we have identified, even if criminal law won't apply. For instance, using strobe lights to deliberately cause a seizure in a person one knows is epileptic is likely at least negligence, and possibly also a form of battery, though that question is complicated.
For the other scenarios, tort liability would be more of a stretch, but not implausible. Disturbing the peace might be recharacterized as nuisance, at least in a suit brought by "nearby" VR or AR stores whose business is interfered with by the screaming; but, especially as to VR, that would require nuisance law to be modified, for instance by treating VR "places" as tantamount to the "uses of land" that nuisance law protects. Nuisance also generally requires either long-lasting interference or especially serious interference; disturbing-the-peace law, by contrast, punishes even brief incidents.
Virtual groping might be treated as intrusion upon seclusion; though it happens in "public" places, the intrusion tort can apply even there, to behavior that is seen as intruding on one's bubble of personal space. Indecent exposure might qualify as well. Both might also constitute intentional infliction of emotional distress, even in the absence of physical touching, on the theory that they are both "outrageous," though that tort generally requires a showing of severe emotional distress where there is no physical contact.
Tort law can also reach a wide array of conduct that wouldn't [generally] be a crime even in the physical world. Defaming a VR avatar should be a tort, even if the avatar is pseudonymous.
One of us has had an extended debate with a well-respected federal judge who believed it was impossible to defame an avatar because avatars weren't real, so their reputations couldn't be injured. This "it's just a game" attitude might pervade judicial thinking about VR for some time, in part because most judges are unlikely to be early adopters of VR. But we think such a view is misguided.
Corporations can sue for defamation, because people invest time and money to create reputational capital for the corporation. There's no reason why the same wouldn't apply to a pseudonym that is used to do business, in VR or otherwise—or to one that is used for ordinary life. The idea that falsehoods that damaged the reputation of Mark Twain weren't defamatory unless they expressly mentioned Samuel Clemens strikes us as unsound.
The damages to a pseudonym's reputation might be less in many situations than the damages to a real person's reputation, because many pseudonyms have built up less reputational capital, and people can take on new ones with little loss. But they could be quite great in other situations, if—as is true in some Internet circles and will likely be increasingly true in VR—the pseudonym or avatar is better known than the person's name, which might be obscure or even deliberately concealed. Most readers probably couldn't come up with the real name of The Weeknd, but that doesn't mean we couldn't defame him.
B. Practicalities (and impracticalities)
Tort lawsuits against VR and AR offenders have one important advantage over criminal prosecutions: They are available even when the police are unwilling to intervene. For example, even if the police don't want to spend their time on a difficult investigation—especially when they think the complainant could have avoided the problem using technologically enabled self-help—the complainant can still demand his day in court.
Practically speaking, though, we doubt that people will often sue each other for most VR or AR behavior. First, again, there is the Bangladesh problem. VR torts might involve tricky jurisdictional questions; if you're screaming in a VR forum from your apartment in Poland, is it fair to require you to answer lawsuits filed in San Francisco or in Buenos Aires?
People have litigated that question extensively in Internet cases. But even if a court in, say, California concludes that it has jurisdiction over the Pole (perhaps because the Pole targeted strobe lights at a person whom he knew to be in San Francisco), enforcing a judgment against someone half a world away would likely be very hard, and in any event many defendants would lack the money to satisfy a judgment.
Second, while police refusal to go forward wouldn't be a barrier to civil lawsuits, the cost of such lawsuits might be. However distressed one might be by virtual groping, it's unlikely that one would be willing to spend tens of thousands of dollars tracking down the culprit, suing him, and trying to recover the judgment. Some people might, perhaps to send a message, but that would be rare.
And abbreviated procedures that are aimed at making lawsuits cheaper and easier—such as small claims trials or restraining orders—won't help much. A small claims court might be reluctant to allow a lawsuit against someone far away, even if jurisdiction is in principle available; any judgment, moreover, would still be costly to enforce. And the police may be as reluctant to go after a faraway restraining order violator as they are to go after a faraway flasher or screamer.
[All this may well give plaintiffs an even greater incentive to go after the VR/AR providers—more on that in coming posts, though you can read about it even now in the full article.]