The Volokh Conspiracy
Mostly law professors | Sometimes contrarian | Often libertarian | Always independent
N.Y. AG Suggests Banning Anyone from Distributing Images of Homicide Made by Killer or Killer's Associates
From a report released yesterday:
We recommend that New York (and other states) impose criminal liability for the creation by the perpetrator, or someone acting in concert with the perpetrator, of images or videos of a homicide. Such videos are an extension of the original criminal act and serve to incite or solicit additional criminal acts. In addition, these videos are obscene on their face. Importantly, appropriate legislation should avoid covering videos created by bystanders or passively, such as those captured by police officers' body-worn cameras.
It's certainly a crime to commit homicide (assuming the proposal would be limited to criminal homicide, and not to self-defense and the like), and to conspire with others to commit homicide. (Even if the only "acting in concert" is recording the images, that may well be viewed as purposeful aiding and abetting the killing, much as yelling words of encouragement during a crime can so qualify.) It's not clear whether the First Amendment would allow the law to tack on extra punishment for photographing or videorecording the homicide. But certainly there are ample tools to punish such people for the homicide itself.
But the report goes on:
We also recommend imposing civil liability for the distribution and transmission of this content, including making liable online platforms that fail to take reasonable steps to prevent unlawful violent criminal content from appearing on the platform. Significant penalties, sufficient to realize the goal of deterrence, should be levied in cases where an online platform fails both to take such reasonable steps and to prevent the transmission of content that is captured by or created by the perpetrator of a homicide, or one working in concert with the perpetrator of a homicide, and that depicts a homicide.
That, I think, is unconstitutional, precisely because it's not limited to the killer or people working with the killer. The First Amendment protects people's rights to convey images of crime, which often reveal important information about what happened, how it might have been prevented, and the like—whether those images were captured by criminals or by innocent witnesses. And the First Amendment protects people's rights as recipients of information to view such images.
Nor can this be avoided on the theory that such videos "are obscene"; the Court has made clear that the obscenity exception is limited to images of sex, not of violence (see the violent video games case). Perhaps the exception should be rejected for sex as well, but the Court has upheld it because of the longstanding tradition in American law of allowing some such restrictions; the Court has, however, refused to extend the exception beyond that traditional scope.
The best argument in favor of the AG's proposal would be that this fits within the First Amendment exception of "speech integral to criminal conduct," which covers things like soliciting a crime against a particular person, directly providing specific information to help someone commit a crime, and the like. I discuss this (in 72 pages) in a 2016 article, The "Speech Integral to Criminal Conduct" Exception. The closest analogy there would be child pornography law, under which the government may ban the distribution and even the possession of illegally created images of children having sex, in order to deter the production of such images. Here's the AG's argument:
Some categories of objectionable content are already subject to restrictions even more stringent and wide-ranging. Tech companies have taken extraordinary measures to comply with federal law criminalizing the creation, distribution, and possession of child sexual abuse material (CSAM). They have made great strides in the effort to fully eradicate CSAM on both the mainstream internet and even on fringe sites, such as 4chan, that otherwise do little content moderation. Anti-CSAM laws have survived First Amendment scrutiny because they criminalize a category of speech widely viewed as obscene. The distribution of CSAM material has been upheld as speech integral to illegal conduct—without a market for CSAM material, there would be no motivation to create such material.
A restriction on creating and distributing this material should be drafted in a manner that ensures conformity with the First Amendment. Critically, the rationale of preventing murderers from promoting their crimes—especially racially motivated mass-murderers—is a governmental interest of the highest order, and there is no societal benefit to the dissemination of such videos by their perpetrators. Accordingly, any such law must be tailored specifically to reach only those videos where such serious crime occurs. Any such law should also aim to avoid imposing any penalty for videos that have educational, historical, or social benefits. At a minimum, state attorneys general and other law enforcement agencies should be given explicit authority to enforce these provisions with steep penalties. Further consideration should be given whether the public (or some group of directly affected individuals) should have a private right of action tailored to promote these compelling government interests.
But I don't think this is a close enough analogy, for two reasons. First, a major reason why child pornography is banned is that,
While the production of pornographic materials is a low-profile, clandestine industry, the need to market the resulting products requires a visible apparatus of distribution. The most expeditious if not the only practical method of law enforcement may be to dry up the market for this material by imposing severe criminal penalties on persons selling, advertising, or otherwise promoting the product….
The advertising and selling of child pornography provide an economic motive for and are thus an integral part of the production of such materials, an activity illegal throughout the Nation.
Of course, murder—and especially "mass-murder[]"—is far from low-profile, and is often not even clandestine. Banning distribution of photos or videos of murder will likely do very little to deter such shootings. Some of the mass killers may be motivated by the desire for fame, but that will generally come entirely apart from the images of the killings themselves (as we've often seen with regard to past mass killings). It's hard to imagine someone who's committing the killing simply to have other people see the images that he or his coconspirators have taken, and who would be deterred by the prospect that those images would no longer be legally available.
Second, another premise of restrictions on child pornography (and, in part, other pornography) is that,
The value of permitting live performances and photographic reproductions of children engaged in lewd sexual conduct is exceedingly modest, if not de minimis. We consider it unlikely that visual depictions of children performing sexual acts or lewdly exhibiting their genitals would often constitute an important and necessary part of a literary performance or scientific or educational work. As a state judge in this case observed, if it were necessary for literary or artistic value, a person over the statutory age who perhaps looked younger could be utilized. Simulation outside of the prohibition of the statute could provide another alternative. Nor is there any question here of censoring a particular literary theme or portrayal of sexual activity. The First Amendment interest is limited to that of rendering the portrayal somewhat more "realistic" by utilizing or photographing children.
But images of mass murders, which can often show important details about who is culpable for the crime, how and by whom the crime might have been stoppable, and the like, often do have substantial value to public debate. Of course, many media outlets have their own ethical judgments about whether to show such images, whether based on respect for the dead or concern about the sensibility of their viewers. But the First Amendment doesn't, I think, allow the government to impose such judgments on the public. (Nor can the government just generally ban such material, but with an exception that would be applied by individual prosecutors, judges, or juries for "videos that have educational, historical, or social benefits"; see U.S. v. Stevens (2010), which held that a similar exception couldn't save a ban on depiction of illegal violence, there violence against animals.)
The AG's report also urges that § 230 be modified to allow liability for platforms that don't do enough to prevent the distribution of such images on their platforms. That also raises extra problems; for instance, it may often be hard to tell whether a photo or video was taken by a coconspirator or by an innocent bystander, and the risk of liability can lead to overdeterrence.
But beyond that, any such liability on the platforms would be unconstitutional if the speech is itself constitutionally protected. So if I'm right that the imposition of liability on third parties that forward it is unconstitutional, then imposition of liability on platforms that don't stop the forwarding would be unconstitutional, too.
Of course, all this would apply to depictions of any kinds of crimes, whether racist or anti-Semitic mass murders, gang members' shootings of each other, shootings of police officers, bombings by ideological extremists (whether anti-abortion, environmentalist, animal-rights, or anything else), and the like. The crimes should most certainly be punished; but I don't think the public can be barred from distributing photos and videos of those crimes, whoever might have taken them.