YouTube Abandons Election Misinformation Policy That Censored Political Speech

"We find that while removing this content does curb some misinformation, it could also have the unintended effect of curtailing political speech."

YouTube will no longer remove content that promotes false claims about U.S. elections—a tacit admission that the platform's aggressive crackdown on misinformation about the 2020 presidential election had negative consequences for political expression.

The company announced the change on Friday.

"The ability to openly debate political ideas, even those that are controversial or based on disproven assumptions, is core to a functioning democratic society—especially in the midst of election season," wrote site administrators in a blog post.

This is a marked departure from the policy in place at YouTube for the last two years. Content moderators vigorously policed claims that former President Donald Trump had actually won the 2020 election. Indeed, their efforts were so aggressive that at one point YouTube censored a video released by the January 6 committee—which was attempting to hold Trump accountable for spreading election-related falsehoods—because the video contained footage of Trump saying something untrue.

Under the policy, YouTube was unable to distinguish between videos aimed at spreading misinformation and videos aimed at reporting on the spread of misinformation. This failure hurt Rising, the news show I host for The Hill, which is released primarily on YouTube. In March 2022, the platform suspended The Hill for violating the election misinformation policy after two of our videos included footage of Trump's stolen-election claims. As I explained at the time:

YouTube has taken the position that merely acknowledging an utterance of the false claim is the same thing as making the claim yourself unless you correct and disavow it elsewhere in the video. (It is also sufficient to post a warning label in the video's description that a false election claim makes an appearance. YouTube is thus moving in the direction of trigger warnings.)

This is a policy that effectively outlaws straight news reporting on YouTube. Say a news channel creates a video that merely intends to provide viewers with footage of a Trump speech or interview—minus any additional commentary—where he makes claims about the election. That video is, according to YouTube, in violation of its misinformation policy if the creator does not call out or correct the claims.

By reversing course, YouTube seems to be acknowledging that policing misinformation comes with notable downsides.

"In the current environment, we find that while removing this content does curb some misinformation, it could also have the unintended effect of curtailing political speech without meaningfully reducing the risk of violence or other real-world harm," wrote YouTube administrators.

It should be noted that YouTube still intends to take action against content that seeks to mislead viewers about how to vote and on what days elections take place. In the past, this policy has prompted social media companies to moderate election-related jokes and satire, so YouTube might have been better served by gutting this provision as well. Even so, it's encouraging to see a platform moving toward greater transparency and permissiveness, albeit many years too late.