The Volokh Conspiracy

Crime on the virtual street: Strobe lighting, 'virtual groping,' and startling

I'm blogging excerpts this week and next week from Mark Lemley's and my new article, "Law, Virtual Reality, and Augmented Reality" (click on the link to see the whole article, including footnotes). This post is from the criminal law part; we began with disturbing the peace and indecent exposure and are now turning to other crimes.

One general assumption we make throughout: VR and AR technology (think something like Google Glass) will get better, cheaper and more effective at rendering lifelike human avatars that track the user's facial expressions. I think that's a safe assumption, given the general trends in computer technology, and one that doesn't require any major technological breakthroughs; but we are indeed prognosticating about technology that's likely at least a couple of years out.

* * *

1. Strobe lighting

Here's a possible test case that does involve a serious harm that is harder to avoid: About 3 percent of people who have epilepsy - disproportionately, young people - can have seizures triggered by strobe lighting. Though such seizures tend not to be fatal, or even greatly injurious, at least when the person having the seizure is just sitting in his home in front of his computer, they do involve a nontrivial risk of injury. This hasn't been seen as reason enough to generally ban strobe lights, especially since such lights seem to be entertaining for many people and are sometimes used as a safety feature. But deliberately creating a strobe effect in VR precisely to play a nasty prank on someone you know to be endangered by this would likely be tortious or even criminal.

But here, too, a program running on a user's VR headset might be able to detect strobe lighting and convert it to something non-strobing. People who know they are strobe-sensitive, or who even think they might be, could then easily turn on this program.
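To make the idea concrete, here is a minimal illustrative sketch in Python of what such a filter might look like; the class name, the per-frame brightness interface and the 0.05 threshold are invented for the example rather than drawn from any actual headset's software. The idea is simply to cap how fast average frame brightness may change, so that a deliberate strobe reaches the eye as a gradual fade rather than a flicker.

# Illustrative sketch only -- invented interface, not any real headset's API.
# Caps how much average frame brightness may change between frames, so that
# a deliberate strobe is displayed as a gradual fade instead.
class StrobeFilter:
    def __init__(self, max_delta_per_frame=0.05):
        self.max_delta = max_delta_per_frame  # placeholder threshold, on a 0.0-1.0 brightness scale
        self.prev = None

    def filter(self, brightness):
        # 'brightness' is the average brightness of the next rendered frame.
        if self.prev is None:
            self.prev = brightness
            return brightness
        delta = brightness - self.prev
        if abs(delta) > self.max_delta:
            # Clamp the jump: the frame is displayed closer to the previous brightness.
            brightness = self.prev + (self.max_delta if delta > 0 else -self.max_delta)
        self.prev = brightness
        return brightness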

The initial exposure - for those who have neglected to get and turn on such a program, or for those who are unaware that they need it - is materially more dangerous than in the disturbing-the-peace scenario: physical injury, and not just annoyance. And an attempt to deliberately trigger a seizure, as in our hypothetical, is highly morally culpable. The purpose is to harm someone, even if most of the time the purpose will be frustrated by the target's precautions.

Would this be enough to lead the police to be willing to intervene? Or would they likely not think this to be worth triggering a possible interstate or international investigation, when, at least going forward, the victim could avoid such harms through technological means?

The strobe light example is the rare virtual hypothetical that combines such culpability with the real risk of physical injury, but others might arise in the future: Imagine, for instance, a hack that alters the VR camera positioning information so that a user who thinks she is in the middle of her living room is in fact standing at the edge of the stairs, or that deliberately sends someone using AR walking into a wall or off a cliff. The use of VR (or, more likely, AR) systems to deliberately cause physical harm to a user is more likely to get the attention of police and courts than are disturbing the virtual peace or virtual indecent exposure. But it will do so precisely because the consequences are more obviously physical than virtual.

2. "Virtual groping"

Harm, though, can also feel real without being physical. Only a few months after commercial VR became available, a woman named Jordan Belamire (a pseudonym) was "virtually groped." Belamire recounted playing a multi-player zombie shooter game when another player - who recognized Belamire as female by her voice - began to make gestures that seemed like virtual groping:

In between a wave of zombies and demons to shoot down, I was hanging out next to BigBro442 [the other player], waiting for our next attack. Suddenly, BigBro442's disembodied helmet faced me dead-on. His floating hand approached my body, and he started to virtually rub my chest….

[E]ven when I turned away from him, he chased me around, making grabbing and pinching motions near my chest. Emboldened, he even shoved his hand toward my virtual crotch and began rubbing….

And Belamire reports that BigBro442's behavior, though utterly lacking in physical contact, seemed so realistic as to be disturbing. Earlier in her article, Belamire had described how realistic a VR cliff seemed, triggering her fear of heights.

"The virtual groping," she said, "feels just as real. Of course, you're not physically being touched, just like you're not actually one hundred feet off the ground, but it's still scary as hell." Her experience is consistent with the studies we reported in Part I.C.2 suggesting that people react physiologically to touches in VR much as if they had happened in the physical world.

Under current law, virtual groping probably wouldn't be a crime. It isn't sexual battery, because there's no touching. Tort law tends to define "assault" as including an actor's intentionally putting someone in "imminent apprehension" of "offensive contact," but criminal law tends not to outlaw such behavior unless it is actually an attempt to commit battery. And beyond that, it's not clear that such imminent apprehension would be present when the target consciously knows that no physical contact is possible. While sexual threats by remote actors over the Internet have sometimes been treated as crimes, those cases all hinge on the plausibility that the threat made over the Internet will be carried out in the physical world.

Should the law be changed? We suspect that very few people would find virtual groping, accomplished through purely visual means, to be as upsetting as real groping. Nonetheless, Belamire is doubtless right that, because of the visceral feeling created by virtual reality, such virtual groping will be more upsetting to many people than getting an unwanted tweet or an email expressing sexual desire.

And people's reactions may well depend on how developed and personalized their avatar is, something that differs from platform to platform and game to game, and that is likely to change over time. Perhaps virtual groping will prove upsetting enough to be treated as the sort of action that criminal law ought, in principle, to forbid, if not now then in the near future. This question likely can't be resolved until we have more experience with how people actually feel in such situations.

Nonetheless, here too, as in the indecent-exposure scenario, there is reason to be skeptical of whether criminal law can and should apply. First, as always, is the Bangladesh Problem: Few police departments will be eager to extradite someone from another country or even another state simply because he made gestures, however disquieting, in a virtual reality game. Even police officers who greatly respect women's bodily integrity may be hesitant to devote a great deal of resources to dealing with people who, after all, did not literally touch anyone.

Second, here too technologically enabled self-protection may be available. The physical structure of the real world is notoriously tolerant of people coming very close to you; there, protection from unwanted touch has to rely on legal rules, social mores and the threat of violent self-defense.

But the code-as-law of the VR world can easily forbid avatars from approaching within some perceived distance of you, or forbid particular people from doing it, or forbid this except in certain games. Indeed, VR developers have already offered this as a response to Belamire's article; as the author of the VR game that Belamire had been playing wrote:

We should have prevented this in the first place. While QuiVr is still in pre-release alpha, we'd already programmed a setting into the game called your "Personal Bubble," so other players' hands disappear if they come close to your face. This way, the rare bad-apple player can't block someone else's view and be annoying. The arrows that get shot at you stick in your helmet, which is good for a laugh, but they do no damage and quickly disappear so they don't get in the way. We hadn't, though, thought of extending that fading function to the rest of the body ….

I called Jonathan, who is … the original creator of QuiVr …. He'd already seen the article - his girlfriend had sent him the link - and he had spent the morning changing the game to extend the Personal Bubble; now, when the setting was turned on, other players faded out when they reached for you, no matter their target, chest included…. It was a possible solution; no one should be able to treat another player like the author had been treated again.

Indeed, the author suggested other technologically enabled self-protection options, including ones that come across as more active self-defense (or, if you prefer, retaliation) - perhaps, for instance, allowing a player to "reach[] out with a finger, and with a little flick, sen[d] [the other] player flying off the screen like an ant." One can even design the game so that this feature can be used only against those avatars who come too close to one's own (or else the flicking could itself become a form of unprovoked aggression). Or the VR or AR company can set up a bubble feature that excludes all avatars except those that the participant has placed on a "close approach permitted" list.
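To show how little code such a rule can take, here is a minimal illustrative sketch in Python; the function name, the one-meter radius and the data structures are invented for the example and are not QuiVr's actual implementation. Each frame, the platform simply declines to render any avatar that has entered the user's bubble, unless the user has put that avatar on the permitted list.

# Illustrative sketch only -- invented names and data structures, not QuiVr's code.
import math

BUBBLE_RADIUS_METERS = 1.0  # example radius; in practice a per-user setting

def visible_avatars(player_pos, allow_list, other_players):
    # player_pos: the user's (x, y, z) position.
    # allow_list: set of player IDs the user has marked "close approach permitted."
    # other_players: dict mapping each player ID to that avatar's (x, y, z) position.
    shown = {}
    for player_id, pos in other_players.items():
        inside_bubble = math.dist(player_pos, pos) < BUBBLE_RADIUS_METERS
        if inside_bubble and player_id not in allow_list:
            continue  # fade this avatar out rather than render it inside the bubble
        shown[player_id] = pos
    return shown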

If people behaved better, none of this would be needed. But given that people do behave badly, VR and AR technologies sometimes offer better tools for dealing with bad behavior than the physical world does.

3. AR crimes that can't be easily technologically avoided - startling

Finally, let's note one crime that is especially likely to be dangerous in AR: deliberately or recklessly startling someone in a way that's likely to dangerously interfere with his physical-world tasks.

Say I know that you're driving with your AR set engaged, and I deliberately appear in your field of vision - not just as me, but as a giant, loud, fire-breathing dragon (or perhaps as a very attractive naked person of whatever sex you find attractive). Or perhaps I happen to know that you have a fear of spiders, so that's the avatar I choose, in an attempt to startle you. You are indeed startled and get into an accident. (In principle, this might happen even in VR, but the risks are greater when people are using AR, which they might do even when driving or walking down a busy street.)

That might well be a crime, such as reckless endangerment, or negligent homicide or involuntary manslaughter if someone dies. It could certainly be a tort (more on that in Part III). This would be one of the few scenarios - strobe lighting being another - that could actually cause physical injury. And it is also not easily avoided through technological self-help measures.

But as a practical matter, this is likely to be a special case of the broader problem: AR can be distracting, especially for drivers but also for people walking near traffic and other hazards. AR designers will have to find some way of dealing with such normal incidental distractions; those solutions might likewise be useful for dealing with deliberate but much more unusual distractions.