The Volokh Conspiracy



Journal of Free Speech Law: "Carriage and Removal Requirements for Internet Platforms: What Taamneh Tells Us," by Daphne Keller


This just-published article is here; the Abstract:

The Supreme Court's 2023 Twitter v. Taamneh ruling arrived at a moment of potentially great change in the laws governing Internet platforms. The ruling, including Justice Thomas's factually dubious descriptions of platforms as passive and agnostic toward user content, will have major influence in two categories of future cases.

  • Most obviously, Taamneh will affect cases in which plaintiffs seek to hold platforms liable for unlawful material posted by users.
  • Less obviously, Taamneh is sure to affect "must-carry" claims or mandates for platforms to carry content against their will—including major cases on the Supreme Court's 2023–24 docket about laws in Texas and Florida.

This piece explains how Taamneh may be invoked in both kinds of cases, and explores the strengths and limitations of parties' likely arguments—in particular, arguments based on platforms' putative passivity. It also examines important intersections between must-carry claims and claims seeking to hold platforms liable for unlawful user content. It argues that ignoring those connections may cause cases in one area of law to distort outcomes in the other. Taamneh itself may already have done just that.

And an excerpt from the Introduction:

Efforts to regulate platforms in the U.S. are often stymied by a fundamental disagreement. Some voters and leaders—mostly Democratic—want platforms to remove more content, including "lawful but awful" material like medical misinformation or hate speech. Some—mostly Republicans—want platforms to remove less content. These same competing impulses appear in litigation. Some plaintiffs sue to prevent platforms from removing content, and others to hold them liable for failing to do so. Courts, unlike Congress, may soon be forced to resolve the tensions between these competing imperatives. Their decisions will almost certainly be shaped by a 2023 Supreme Court ruling, Twitter v. Taamneh.

Taamneh was one of two closely linked cases about platforms' responsibility for user content. In it, the Court unanimously held that Twitter, Facebook, and YouTube were not liable for harms from ISIS attacks. In the second case, Gonzalez v. Google, the Court declined to decide whether the platforms were also immunized under the law known as Section 230. Litigation on other claims that would effectively hold platforms liable for failing to remove user speech—which I will call "must-remove" cases—continues in lower courts.

Questions about the opposite legal pressure, under so-called must-carry laws that prevent platforms from removing certain user speech, will be considered by the Court this Term in NetChoice v. Paxton and Moody v. NetChoice. Those cases challenge must-carry laws in Texas and Florida. Platforms argue that the laws violate their First Amendment rights by stripping them of editorial discretion to set speech policies and remove harmful content.

Must-carry and must-remove claims generally come up in separate cases and are treated as unrelated by courts. But they are deeply intertwined. Taamneh, as the Court's first major holding on platforms and speech since the 1990s, will matter to both. This piece examines the connections between must-remove and must-carry claims through the lens of the tort law questions examined in Taamneh. It argues that Taamneh's emphasis on platform "passivity," which permeates the opinion, may have troubling ramifications. Indeed, Texas relied heavily on Taamneh in a brief supporting its must-carry law, arguing that platforms' passivity demonstrates that platforms have no First Amendment interest in setting editorial policies.

Legal disputes like the one in Taamneh, about whether plaintiffs have any cause of action against platforms in the first place, are relatively common in litigation. But academic and public policy discussion tends not to focus on these issues about the merits of plaintiffs' underlying claims. Instead, academics and policymakers often focus on the other two major sources of law governing platforms and speech: immunity statutes and the Constitution. Statutory immunities under laws like Section 230 have been the subject of noisy debates for a number of years, while constitutional questions rose to prominence more recently. It is clear that the First Amendment sets real limits on the laws that govern platforms. But it is less clear exactly what those limits are.

Basic liability standards, immunity standards, and First Amendment standards all combine to shape platform regulation…. A predictable consequence of laws that hold platforms liable or potentially liable for user speech is that platforms will, out of caution, remove both unlawful and lawful material in order to protect themselves. As the Supreme Court has noted in cases about movie theaters and video distributors, speech intermediaries can be chilled more easily than ordinary speakers and publishers. Platforms generally have less motivation to defend users' speech than users themselves do, and less ability to ascertain legally relevant facts. As recognized in a midcentury Supreme Court case about bookstores, Smith v. California, this can be a problem of constitutional dimension. Smith struck down a law that held bookstores strictly liable for obscene books, noting that such unbounded liability would foreseeably lead booksellers to err on the side of caution and "restrict the public's access to forms of the printed word which the State could not constitutionally suppress directly." The resulting "censorship affecting the whole public" would be attributable to state action, and "hardly less virulent for being privately administered."

In Taamneh, the Court steered well clear of examining these First Amendment issues by rejecting plaintiffs' claims under basic tort law. That doesn't always work, though—as at least one amicus brief pointed out, and as I will discuss here. Courts' resolutions of claims like the one in Taamneh can shape both platforms' editorial policies and users' opportunities to speak. As a result, free expression considerations can in turn play a structuring role when courts interpret the elements of such tort claims.