Abortion Rights Advocates Ask DOJ To Defend Section 230

While "the 26 words that created the internet" have been under fire from both sides, two groups argue that the 1996 law is essential to the future of abortion rights.

A case on the Supreme Court's docket could spell the end of Section 230, a law that has benefited websites and online platforms for decades. In a letter last week, two center-left organizations asked the Department of Justice (DOJ) to come to the law's defense.

The law stems from a pair of court decisions from the mid-'90s, when the consumer internet was in its infancy. Taken together, the cases established that if a website moderates user-generated content at all, it becomes responsible for all user content, whereas it can avoid liability entirely by making no moderation decisions whatsoever. Seeing an incentive against moderation as antithetical to the internet's growth, two then-congressmen drafted a law that would absolve online platforms of liability for most user-generated content. That provision became Section 230 of the Communications Decency Act, sometimes called "the 26 words that created the internet" because its straightforward applicability allowed the nascent web to grow and flourish.

Section 230 has received its share of criticism: During his administration, then-President Donald Trump and his allies routinely threatened to modify or outright repeal it. Democrats have been critical as well, with President Joe Biden saying in September that Congress should "get rid of special immunity for social media companies" and "hold social media platforms accountable for spreading hate and fueling violence."

Early next year, the Supreme Court will consider whether Section 230 also applies to recommendations made by a platform's algorithms. The case, Gonzalez v. Google, stems from a 2015 ISIS attack and claims that YouTube's recommendation algorithm helped the terror group spread its message by recommending its videos. Section 230 may shield platforms from liability for user-generated content, the suit argues, but what happens when the platform then recommends that content to other users?

Last week, a letter jointly signed by the Chamber of Progress, a center-left technology industry group, and Advocates for Youth, a D.C.-based sexual health advocacy nonprofit, encouraged Attorney General Merrick Garland to "submit a brief in support of defendants" on behalf of the DOJ. The defendant is Google, owner of YouTube, which the Chamber of Progress lists as one of its "corporate partners."

Earlier this year, the Supreme Court overturned Roe v. Wade (1973), which had guaranteed abortion access for nearly 50 years. In its wake, many states moved to ban the practice, with some potential bans extending to websites that merely provide information on abortion.

In that context, the letter says that Section 230 allows platforms to host such content without fear of violating state laws. Indeed, Section 230 strengthens a platform's First Amendment protections against litigation or prosecution over its moderation decisions by making such cases harder to bring in the first place. In an influential case decided soon after the law's passage, Zeran v. America Online (1997), the Fourth Circuit Court of Appeals held that "lawsuits seeking to hold [an online] service provider liable" for moderation decisions "are barred."

"Without Section 230," the letter worries, "online services might be compelled to limit access to reproductive resources, for fear of violating various state anti-abortion laws."

That same hypothetical applies to any number of issues, and it is exactly what the law's supporters have long warned about: In a return to a pre-Section 230 world of either moderate everything or moderate nothing, most platforms would simply err on the side of removal and take down any content that anyone objects to. Most decisions to leave content up would likely be upheld under the First Amendment anyway, but fending off a parade of meritless lawsuits wouldn't be worth the hassle.

Any weakening of Section 230 would only make things more difficult for platforms and users alike.