Does Section 230 Immunize Twitter's Knowing Possession of Child Sex Abuse Materials?
A recently filed cert petition presents this important issue to the U.S. Supreme Court.
Section 230 immunity is an expansive, largely court-created doctrine that often renders online platforms immune from civil suits over any content they host. The immunity's sweeping scope is quite controversial, as it extends to online platforms protections against civil suits that other entities and individuals do not enjoy. I recently helped file a cert petition presenting compelling facts that illustrate the need to narrow the expansive immunity lower courts have read into section 230. Justices Thomas and Gorsuch have previously called for review of the scope of section 230 immunity. Because this petition presents the issue cleanly, I hope the Court will grant it.
The issue arises under a provision of the Communications Decency Act that encourages "Good Samaritan" acts to keep objectionable content off the internet. 47 U.S.C. §230(c). The Act states that internet platforms are not liable for "good faith" acts to remove such content. §230(c)(2)(A). It clarifies that platforms may exercise that editorial control without being "treated as the publisher or speaker." §230(c)(1).
In the case at hand, the Ninth Circuit construed the Act to immunize an internet platform's knowing and deliberate decision to keep "child pornography" on the internet—a federal crime so damaging that Congress expressly allows victims to seek civil penalties. (The images are better described as "child sex abuse materials," or CSAM, but we have followed the current legal nomenclature.) In response to repeated alerts that child pornography depicting the minor petitioners (John Doe 1 and John Doe 2) was on its platform, Twitter asked for John Doe 1's ID—verifying that he was a minor. Twitter, in its own words, "reviewed the content" showing coerced sex acts by minors. But Twitter then decided "no action will be taken." The video proliferated—and Twitter profited—until a Department of Homeland Security official intervened.
The victims sued. Twitter claimed immunity. The Ninth Circuit agreed that Section 230 precludes federal civil penalties for that knowing sexual exploitation of children.
In our recently filed cert petition, we present two important questions:
1. Whether section 230's Good Samaritan immunity applies to the knowing possession and distribution of child pornography.
2. Whether section 230's Good Samaritan immunity applies to knowingly benefiting from a sex-trafficking venture.
You can read our entire petition here, but I want to highlight several key aspects.
First, the lower courts' version of so-called "section 230 immunity" goes far beyond the statute's text. The lower courts have misread section 230(c) to "confer sweeping immunity," shielding "any decision to edit or remove content." Malwarebytes, 141 S. Ct. at 13, 17 (Thomas, J.). They deem companies immune from suits challenging their own "egregious, intentional acts." Snap, 144 S. Ct. at 2494 (Thomas, J.). Bucking common-law background principles, they dismiss suits "even when a company distributes content that it knows is illegal." Malwarebytes, 141 S. Ct. at 15 (Thomas, J.). As those decisions have proliferated, social media companies have "increasingly used § 230 as a get-out-of-jail free card." Snap, 144 S. Ct. at 2494 (Thomas, J.).
Second, extending section 230(c) so far beyond its text has had "serious consequences." Malwarebytes, 141 S. Ct. at 18 (Thomas, J.). The notion that social media companies may claim immunity from civil penalties for knowingly and deliberately possessing and distributing criminal child pornography on their platforms incentivizes a lawless and dangerous "public square." Packingham, 582 U.S. at 107. That came at great cost to Petitioners. They could not convince Twitter to remove criminal videos even after John Doe 1 verified his identity and emphasized repeatedly that he and John Doe 2 were minors and had gone to the police. In response, Twitter said it "reviewed the content" but would take "no action." It took a DHS official to convince Twitter to change course. The tragic exploitation of children that occurred here will continue to repeat itself without the Supreme Court's review. See Facebook, 142 S. Ct. at 1088 (Thomas, J.).
Third, section 230 does not put internet platforms above the law, cloaking them with immunity despite knowingly possessing, distributing, and profiting from child pornography. Yet that is exactly what the Ninth Circuit posited: Twitter cannot be required to "take any action associated with publication (e.g., removal)," even "once it learns of the objectionable content." Knowing possession and distribution of child pornography is a crime with congressionally authorized civil penalties. It works the utmost perversion of federal law to declare a company "immune" from those civil penalties for its own knowing acts.
Fourth, this petition is an excellent vehicle to review the scope of immunity, allowing the Court to make an important inroad clarifying section 230's reach without biting off too much. The case turns on actual knowledge, and the allegations concern a platform's own criminal-level wrongdoing. It allows the Court to start with the easy question: Can an internet platform claim "Good Samaritan" immunity after it obtains actual knowledge that it is possessing, distributing, and profiting from child pornography—federal crimes—and makes the deliberate decision to keep that criminal content on its platform? The Court can leave for another day the more difficult questions that might arise in future cases involving merely tortious conduct or a mens rea short of the "actual knowledge" that even the Ninth Circuit did not dispute Twitter had. This petition presents the opportunity to begin addressing section 230 that members of the Court have anticipated.
Counsel of record for our petition is Taylor Meehan, joined by colleagues Thomas McCarthy and Tiffany Bates of the Antonin Scalia Law School Supreme Court Clinic/Consovoy McCarthy PLLC. Also assisting on the petition were Benjamin Bull, Peter Gentala, Danielle Pinter, and Christen Price of the National Center on Sexual Exploitation; Lisa and Adam Haba of the Haba Law Firm; Paul Matiasic of the Matiasic Firm, P.C.; and me.
I hope that the Supreme Court takes a close look at this petition, which presents an extremely important and recurring issue.