
New Facebook Oversight Board Will Handle Tricky Moderation, but Will It Value Free Speech?

Mark Zuckerberg can't please the anti-tech populists on the left and the right, no matter what he does.



Facebook CEO Mark Zuckerberg has described content moderation as the most perplexing problem he faces, and so today's announcement that the company has appointed an independent oversight board to review controversial decisions comes as no surprise.

The board will be co-chaired by Catalina Botero-Marino, a former special rapporteur for freedom of expression at the Organization of American States; Jamal Greene, a Columbia University law professor; Michael W. McConnell, a Stanford University law professor; and Helle Thorning-Schmidt, a former prime minister of Denmark. The four co-chairs explained how the board will select and decide cases:

We will not be able to offer a ruling on every one of the many thousands of cases that we expect to be shared with us each year. We will focus on identifying cases that have a real-world impact, are important for public discourse and raise questions about current Facebook policies. Cases that examine the line between satire and hate speech, the spread of graphic content after tragic events, and whether manipulated content posted by public figures should be treated differently from other content are just some of those that may come before the board.

Over the coming months, we will lay out how we prioritize and select cases for review. Once a case has been chosen, it will be considered by a panel with a rotating set of members. All panel decisions will be reviewed by the entire board before they are finalized, and if a majority of members disagree with a decision, they can ask a new panel to hear the case again.

The rest of the board's membership was also announced; it includes a number of free speech and international human rights experts, among them John Samples, a vice president of the libertarian Cato Institute.

"Facebook is giving users a way to appeal its content moderation decisions," Samples said in a statement. "The board is charged with making sure such decisions agree with Facebook's values and community standards. Facebook has said 'voice' is its paramount value, so I envision that the board will advance the cause of free speech in the digital era."

Facebook has come under heavy criticism from nearly every point on the political spectrum. Many on the left are angry at Facebook for not moderating content more aggressively. Sen. Elizabeth Warren (D–Mass.), for instance, has complained about "misleading" political advertisements on the platform. Celebrities and many in the media want Facebook to prevent the spread of disinformation more broadly, and they are evidently worried that the board is oriented toward protecting controversial speech.

On the other hand, various right-wing commentators think Facebook's content moderation is already too aggressive. Recently, small steps taken by the platform to disallow event pages that deliberately spread harmful misinformation about COVID-19 provoked widespread cries of censorship.

No one seems to hate Facebook more than Sen. Josh Hawley (R–Mo.), who responded to news of the oversight committee by attacking it and renewing his call for the government to break up Facebook.

This is a blatant misrepresentation of the committee's aims, and one that in no way, shape, or form makes any sort of compelling argument for the government to break up Facebook. Indeed, the committee might cause Facebook to be less censorious, in a way that meets the approval of conservatives: After all, it includes a number of individuals whose unique backgrounds as advocates for civil and human rights might make them more inclined to permit controversial content than Facebook's own employees would be. It's hard to avoid the impression that Hawley objects to Facebook's size and influence, and would rail against the company no matter what it did.

The most sensible take on the oversight committee belongs to Jesse Blumenthal, a tech policy expert at Stand Together, who writes that he is "cautiously optimistic" about its potential to defend free expression. Nevertheless:

There are two interrelated issues: what rules do you set and how do you enforce them. The former is a deliberative choice, the latter is an impossible task that can be refined at the margins. Still, it is clear where Facebook stands on those questions. What is less clear is whether the new oversight board will embrace or resist this shift in values.

Indeed, only five of the initial board appointments come from the United States; the other 15 come from abroad. That is some cause for concern, because even the strongest free expression advocates abroad tend to place less emphasis on it than their American counterparts do.

In many discussions about the board, both Facebook staff and external activists have pressed the company on geographic representation. This priority is reflected in the bylaws and regularly taken as a given. But by emphasizing geographic diversity, the board risks shifting away from a uniquely American approach to free expression.

It's possible that by delegating the most difficult moderation decisions to a third party, Facebook will effectively end up permitting the proliferation of new norms that are actually less friendly to speech than what Zuckerberg himself has envisioned. (Facebook has pledged to abide by the board's decisions, though the bylaws give the company plenty of wiggle room.) I don't think this is the most likely possibility, but it could happen.

Given all the opprobrium the platform has earned for its moderation decisions—which are indeed occasionally flawed, but often not nearly as contemptible as critics claim—Zuckerberg can hardly be faulted for wanting to share this responsibility with another entity. As long as that entity isn't Congress, we should all be grateful.