The Volokh Conspiracy

No Liability for Netflix Over Teenage Suicides Supposedly Triggered by Thirteen Reasons Why

Plaintiffs claimed that Netflix should be held liable, on the theory that "it had been warned by experts backed by decades of empirical research that child suicides and other profound psychological harm would occur if impressionable youths were targeted and not warned of the health risks inherent in viewing the Show." But today's decision by Judge Yvonne Gonzalez Rogers (N.D. Cal.) in Estate of B.H. v. Netflix, Inc. rejects that claim, giving four reasons why not, including:

[P]laintiffs' strict liability claim fails because it is premised on the content and dissemination of the show. There is no strict liability for books, movies, or other forms of media. See Winter v. G.P. Putnam's Sons (9th Cir. 1989) (holding that the content of a book concerning mushrooms did not support a products liability claim and basing its holding on the restatement relied upon by California courts as well as numerous authorities reaching the same)…. [P]laintiffs' efforts to distance the claims from the content of the show do not persuade. Without the content, there would be no claim….

[P]laintiffs also fail to identify a duty to support the negligence-based claims…. California courts have declined to find a duty as a matter of law … for claims implicating expression. McCollum v. CBS (Cal. App. 1988) (finding no duty as a matter of law when a musician sought to appeal to troubled audience "perhaps most significantly, it is simply not acceptable to a free and democratic society to impose a duty upon performing artists to limit and restrict their creativity in order to avoid the dissemination of ideas in artistic speech which may adversely affect emotionally troubled individuals"); Bill v. Superior Court (Cal. App. 1982) (finding that "the petitioner's activity in producing a motion picture and arranging for distribution, is socially unobjectionable—and, in light of First Amendment considerations, must be deemed so even if it had the tendency to attract violence-prone individuals to the vicinity of the theaters at which it was exhibited."). California courts have required a "very high degree of foreseeability" in cases concerning suicide and have found it difficult to satisfy. McCollum (holding that teen's suicide "was not reasonably foreseeable risk or consequence of defendants' remote artistic activities" even when the music was known for promoting suicide). The Court has not been persuaded that this case based upon the creation and dissemination of a show requires a different result.

Additionally, authority cited by plaintiffs demonstrate that even if there was a special relationship [between the parties, an argument raised by the plaintiffs -EV], the Rowland factor must be considered and may limit the duty. The above cases demonstrate that the countervailing First Amendment policy concerns warrant limiting the duty even if there were a special relationship. {Plaintiffs also acknowledge that traditional examples of a special relationship are "[t]he relationships between common carriers and their passengers, or innkeepers and their guests." The California Supreme Court recently extended this to college universities who "in turn, have a superior control over the environment and the ability to protect students," including "the power to influence [students'] values, their consciousness, their relationships, and their behaviors." The allegations of targeting in plaintiffs' FAC do not arise to this degree of control. Plaintiffs' allegations again concern the content of the show and implicate countervailing First Amendment policy concerns.}

This strikes me as quite right, for the reasons given in the cases on which Judge Rogers relies. Among other things, even speech intended to cause illegal conduct (a category that might be extensible to speech intended to cause suicide) is generally constitutionally unprotected only if

  1. it falls in the narrow incitement exception, which is to say that it's (a) intended to and (b) likely to cause (c) imminent illegal conduct, or
  2. it falls in the solicitation exception, which is less well-defined but is likely limited to speech that urges crimes against specific targets or involving specific items of contraband.

But of course here Netflix isn't even alleged to have intended to cause harm; the claim is that it acted negligently in unreasonably distributing material that carried the risk of harm. Speech that supposedly merely negligently (or even recklessly) leads some viewers to act badly doesn't fit within these exceptions; otherwise, a vast range of speech that is seen by millions of people would be jeopardized, because for so many works there is some risk that some tiny fraction of viewers will misbehave (against themselves or others) because of what they see there.