The Volokh Conspiracy

Mostly law professors | Sometimes contrarian | Often libertarian | Always independent


An Important Cert Petition Pending Before the Supreme Court on Section 230 Immunity

Briefing is now complete on a cert petition presenting the urgent question of whether section 230 immunizes Twitter's knowing possession and distribution of child sex abuse materials.


I previously blogged about a cert petition I helped to prepare regarding the scope of section 230 immunity. Last week, the final briefing was submitted to the Supreme Court. In my view, the briefing makes clear that the Court should grant cert on the important issue of whether section 230 immunizes Twitter's knowing possession and distribution of child pornography.

The issue arises through a provision in the Communications Decency Act, which encourages "Good Samaritan" acts to keep objectionable content off the internet. 47 U.S.C. § 230(c). The Act states that internet platforms are not liable for "good faith" acts to remove such content. § 230(c)(2)(A). It clarifies that platforms may exercise that editorial control without being "treated as the publisher or speaker." § 230(c)(1).

In the case at hand, the Ninth Circuit construed the Act to immunize an internet platform's knowing and deliberate decision to keep "child pornography" on the internet—a federal crime so damaging that Congress expressly allows victims to seek civil penalties. (The images are better described as "child sex abuse materials," or CSAM, but we have followed the current legal nomenclature.) In response to repeated alerts that child pornography depicting the minor petitioners (John Doe 1 and John Doe 2) was on its platform, Twitter asked for John Doe 1's ID—verifying that he was a minor. Twitter, in its own words, "reviewed the content" showing coerced sex acts by minors. But Twitter then decided "no action will be taken." The video proliferated—and Twitter profited—until a Department of Homeland Security official intervened.

The victims sued. Twitter claimed immunity. The Ninth Circuit agreed that Section 230 precludes federal civil penalties for that knowing sexual exploitation of children. And our cert petition presented the question of whether section 230's Good Samaritan immunity applies to the knowing possession and distribution of child pornography.

Twitter has now responded to our petition. In its brief in opposition, Twitter characterizes the case as involving its decision of whether to screen such content:

Child pornography is the most serious category of harmful content that platforms encounter—a fact no one disputes and Twitter does not minimize. But that is a difference of degree, not of legal kind. Courts have uniformly applied Section 230(c)(1) to bar claims that seek to treat a website as the publisher of a third party's obscenity, illegal pornography, or other arguably criminal content. See, e.g., Force v. Facebook, 934 F.3d at 59 (content that encouraged terrorism); Barnes v. Yahoo!, Inc., 570 F.3d at 1098 (nonconsensual nude images); Dyroff v. Ultimate Software Grp., Inc., 934 F.3d at 1095 (content facilitating illegal drug sales). Indeed, a foundational premise of Section 230 is that websites hosting third-party content cannot be charged with screening all such content; to hold that offending content of any particular stripe disables the statutory immunity would contravene that premise and expose websites to potential liability simply because their screening has been imperfect in some way.

Our reply argues strenuously that Twitter is mischaracterizing what it did—which was to distribute child pornography, even after it had specifically identified the material as child pornography and knew that the materials were being downloaded by tens of thousands of its customers:

Twitter never contests those knowing criminal acts. Instead, Twitter restates the questions presented, reframing this petition as just another petition about "not removing more quickly allegedly unlawful third-party conduct." Twitter reduces the petition to a dispute about how "swiftly" it acted. Twitter even claims the petition is a poor vehicle because its criminal wrongdoing did no harm.

Try as it might to paint this petition as one about some unknowing publisher, Twitter acted knowingly and deliberately. Those criminal acts led John Doe 1 to consider ending his life. Those criminal acts prompted a DHS official to intercede. Petitioners are still suffering from them. …

This petition is the straightforward vehicle to address the outermost limit of § 230 immunity. This case is not about what steps some internet company should take to monitor content that it is unwittingly hosting. It is about the company's own knowing and deliberate acts. Do federal laws prohibit Twitter from knowingly possessing, distributing, and profiting from child pornography—like everyone else? Or does § 230 put Twitter above the law?

Our cert petition was strongly supported by four excellent amicus briefs.

Florida and sixteen states support the petition. The states' amicus brief argues:

That [broad] construction of Section 230 has resulted in widespread derogation of state laws. See Gregory M. Dickinson, The Internet Immunity Escape Hatch, 47 BYU L. Rev. 1435, 1471 (2022). Internet companies have used Section 230 to thwart state-law claims for breach of contract, Goddard v. Google, Inc., No. C 08-2738, 2008 WL 5245490, at *5 (N.D. Cal. Dec. 17, 2008); unlawful and unfair competition, Kimzey v. Yelp Inc., 21 F. Supp. 3d 1120, 1121–23 (W.D. Wash. 2014); cyberstalking and securities violations, Universal Commc'n Sys., Inc. v. Lycos, Inc., 478 F.3d 413, 422 (1st Cir. 2007); tortious interference with a business expectancy, Nemet Chevrolet, Ltd. v. Consumeraffairs.com, Inc., 591 F.3d 250, 252–53 (4th Cir. 2009); negligence, Daniel v. Armslist, LLC, 926 N.W.2d 710, 726 (Wis. 2019); and products liability, negligent design, and failure to warn, Herrick v. Grindr, LLC, 306 F. Supp. 3d 579, 588–92 (S.D.N.Y. 2018).

Worse, social-media companies wield this Section 230 shield while swinging the sword of free speech—arguing their content "curation" constitutes expressive activity that invalidates states' regulations. See, e.g., Moody v. NetChoice, LLC, 603 U.S. 707, 721 (2024). And where state tort liability ordinarily checks First Amendment freedom, Section 230's excessive immunity eviscerates any balance. See Doe v. McKesson, 71 F.4th 278, 291–92 (5th Cir. 2023) ("[T]he First Amendment does not prohibit States from imposing tort liability even if the tort occurs in the context of expressive activity." (citing NAACP v. Claiborne Hardware Co., 458 U.S. 886, 927 (1982))); Pennekamp v. Florida, 328 U.S. 331, 356 (1946) (Frankfurter, J., concurring) (First Amendment freedom is "not a freedom from responsibility for its exercise.").

Senator Josh Hawley filed a brief arguing that this case requires Supreme Court review:

Nothing in the text, structure, or history of the Communications Decency Act's Section 230 shields internet platforms that knowingly possess and distribute child pornography. Congress drafted Section 230 to avoid publisher liability without upending distributor liability, which had its own rich common law backdrop. The Good Samaritan carveout in Section 230(c) is narrow and unrelated to any facts in this case. The Ninth Circuit's holding to the contrary represents the most extreme extension yet of Section 230 immunity in the statute's history, and is a perfect vehicle for this Court to rein in the statute's purview to the four corners of its text.

The National Center for Missing and Exploited Children (NCMEC) filed an amicus brief arguing the Ninth Circuit's decision will harm efforts to combat emerging threats to children online:

The Ninth Circuit's decision—holding that 47 U.S.C. § 230 immunizes internet platforms from liability for the knowing possession and distribution of child sexual abuse material (CSAM)—will make it exponentially harder to fight the crisis of online child sexual exploitation. In NCMEC's experience, strong incentives—including the possibility of liability for knowing participation in criminal child sexual exploitation—are essential to ensure an adequate response from internet platforms. Absent the Court's review, the Ninth Circuit's decision will undercut a fragile system that is already overly reliant on voluntary efforts and the variable goodwill of internet platforms.

And Child USA filed an amicus brief arguing that Twitter was invoking an unjustifiably broad reading of Section 230:

Congress never intended Section 230 to operate as a blanket shield for internet service providers ("ISPs"). Its narrow purpose was to prevent ISPs from being treated as the publisher or speaker of third-party content. Petitioners' claims fall far outside that protection: they seek to hold Twitter liable for its own misconduct—misconduct that, if proven, reflects knowing participation in and financial benefit from criminal exploitation.

I hope that the Supreme Court grants our petition, which presents an extremely important and recurring issue. Reading federal law as an open invitation for social-media companies to knowingly possess, profit from, and distribute child pornography is an egregious misinterpretation of Section 230—and one that is causing grave harm to child victims. The Court should step in to determine whether Section 230 truly places Twitter above the criminal law and allows it to knowingly distribute child pornography.

Counsel of record for our petition is Taylor Meehan, joined by colleagues Thomas McCarthy and Tiffany Bates of the Antonin Scalia Law School Supreme Court Clinic/Consovoy McCarthy PLLC. Also assisting on the petition were Benjamin Bull, Peter Gentala, Danielle Pinter, and Christen Price from the National Center on Sexual Exploitation; Lisa and Adam Haba of the Haba Law Firm; Paul Matiasic of the Matiasic Firm, P.C.; and me.

Update: I've added into the original post discussion of the amicus brief from Florida and other states. I've also linked the cert petition into the post.