The Volokh Conspiracy

Mostly law professors | Sometimes contrarian | Often libertarian | Always independent

Reforming Section 230 of the Communications Decency Act

Why are we giving the world's biggest tech companies litigation subsidies they don't need?

Section 230 of the Communications Decency Act seems to inspire bipartisan antipathy.

Joe Biden said it "should be revoked, immediately," and President Trump characteristically topped that by tweeting "REVOKE 230!"

They're both right, at least directionally. Section 230, which dates to 1996, shields platforms from civil liability stemming from third-party content on their sites. It has been central to the success of crowdsourced platforms like YouTube, Twitter and Facebook, protecting them from potentially staggering liability for the online misbehavior of their users. But by exempting the platforms from the usual rules of liability, Section 230 is also a kind of subsidy, and one that protects some of the biggest companies in the world from expensive litigation.

Such a subsidy made more sense in 1996, when there were only 36 million internet users in the world. Now that 4.6 billion people regularly go online, it's fair to ask why the U.S. should give internet platforms a sweeping exemption from the laws that govern everyone else. In recent years, more and more politicians on either side of the aisle have been pointedly asking this question, though perhaps for different reasons: Some Democrats still blame social media for making Trump's election possible, while Republicans fault the industry for how publicly it has regretted its role in the 2016 campaign.

But revoking Section 230 is not a good idea. Mad as people may be at the platforms, those companies will need at least some liability protection if they're going to keep giving us the crowdsourced content we all consume today. Platforms particularly need protection from defamation liability, which falls on both the author and the publisher. No social media company can police its content for libelous posts. The platforms simply are not equipped to evaluate which statements are true and which are false and defamatory.

That doesn't end the debate, though. The industry may need an exemption from defamation liability, but why should it be immune if it ignores user misconduct that is entirely predictable and largely preventable? That question led Congress, on an overwhelmingly bipartisan basis, to amend Section 230 by adopting FOSTA, the Allow States and Victims to Fight Online Sex Trafficking Act of 2017. FOSTA withdrew immunity from online platforms that knowingly let their users facilitate sex trafficking. And it turned out that most platforms didn't need protection for facilitating sex trafficking. A few online sites that depended on prostitution ad revenue went out of business, but Big Social Media continued to thrive.

So the next questions for Section 230 skeptics on both sides of the aisle should be, "Where else has Congress given social media an immunity it didn't need? And how can policymakers chip away at this overgenerous subsidy without putting at risk the survival of social media?"

The most thoughtful answers I've seen come from a Justice Department report released this month. Without grandstanding, it offers several proposals that ought to have bipartisan appeal.

The report begins by acknowledging that social media companies still need protection from liability for things they can't be expected to police, like defamation. But the report sees a vast difference between being unable to stop criminal behavior and actively promoting it, as the sex trafficking sites did before FOSTA. It draws a simple lesson from FOSTA's success—the online platforms worth propping up don't need immunity for facilitating or soliciting illegal conduct.

The Justice Department also suggests that platforms should be required to go further in regulating user conduct—they should face liability if they fail to take reasonable steps to prevent the distribution of child sex abuse materials and terrorist and cyberstalking content. The only thing surprising about this proposal is that Section 230 doesn't already demand it; the law confers its immunity without asking the platforms to do anything at all in return. It's past time to spell out exactly what is expected of platforms in exchange for the subsidy they receive. Reasonable efforts to stop things like child sex abuse are certainly not too much to ask.

Among its other suggestions for trimming Section 230 immunity, the report rejects the extreme applications of the law that have gained currency since 1996. Most notably, online platforms have argued that Section 230 creates an immunity from antitrust claims. This is outrageous. If today's monolithic platforms use their control of the national discourse to suppress criticism of their power or praise for their competitors, they don't deserve immunity; they deserve an injunction and treble damages. Similarly, there's no justification for extending platforms' defamation immunity to the point where they can ignore court libel rulings without consequences (as, for example, Yelp has done).

So far, so bipartisan. If a reformed Section 230 forces social media to be more cautious about facilitating criminal conduct online, neither party will weep. There's less unanimity about reforming a second immunity granted by Section 230. (Yes, there are two!) The second immunity protects the platforms not when they allow speech but when they suppress it. Conservatives think (rightly, in my view) that Silicon Valley tilts against them in these decisions, whether the result is a takedown or a warning label or a "shadow ban."

The second immunity protects online platforms from liability when they take down content that is sexual, violent, harassing or "otherwise objectionable," as long as they act in "good faith." In an age when everyone objects to everything, this language invites weaponization. It is only prudent for Congress to narrow the definition of "otherwise objectionable" speech so that the provision gives special protection to the platforms mainly when they're taking down speech that violates the law.

Republicans who think they've been victimized by social media censorship will of course find something to support here, but so too can anyone else uncomfortable with letting a handful of Silicon Valley monoliths decide what can and can't be said online. For starters, while the first immunity protects against the clear risk of ruinous defamation liability, it's not at all clear what lurking liability the second immunity is needed to head off. Successful lawsuits for refusing to publish someone else's work are not exactly thick on the ground.

The Justice Department's other recommendation here is to attach more tangible standards to the statute's requirement that content be policed in "good faith." To meet this requirement, the department urges, content moderation policies should be stated "plainly and with particularity," and platforms should give timely notice of takedown decisions that explains "with particularity the factual basis for the restriction." This will certainly be popular on the right, which thinks it's unfairly targeted for suppression. But plenty of speakers on the left feel the same way. The only obvious cure for the widespread mistrust of platforms is for them to embrace greater transparency and candor. They have resisted, sometimes with good reason, sometimes without; but "trust us" is no longer a persuasive argument. This proposal would encourage them to move away from their largely opaque content moderation practices.

In short, and surely surprising to some, this Justice Department has made a real contribution toward bipartisan reform of Section 230. The temptation among Democrats will be to score partisan points by dismissing the report.

That would be a mistake.

Because, having embraced a candidate tied to the unrealistic position that Section 230 "should be revoked, immediately," Democrats are going to need more workable solutions that keep the essential core of Section 230 while cutting back the platforms' now-unjustifiable government subsidy.

And, when they go looking for those ideas, they're going to find that a big chunk of them are already in this report.