Social Media

Facebook's Oversight Board Was Right To Ban Trump

It's a working model for non-state governance in cyberspace that is vastly preferable to government control of social media.



When Facebook's independent Oversight Board recently upheld the social media platform's ban on former President Donald Trump, the decision drew mostly negative reviews, with press outlets calling it "little more than judicial theater" or a "distraction" from more serious issues raised by Big Tech.

The Oversight Board comprises 20 individuals with backgrounds in tech, law, politics, and free speech activism. It's funded by an irrevocable trust set up by Facebook and has a legally binding final say over suspensions.

Its critics are wrong in at least two major ways.

First, the board made the right call in this case. It agreed that, by Facebook's own rules, the platform was justified in suspending Trump after he twice posted praise of violent rioters even as they raised hell in the Capitol. It also correctly found that Facebook violated its terms of service by making the suspension indefinite rather than imposing a defined penalty. The board has given Mark Zuckerberg and company up to six months to render a final decision on the former president.

Second, and more importantly, the board is a serious model for non-state governance in cyberspace. It's far from perfect, but it's preferable to rules being forced on the service by politicians. Though it has only been up and running since the start of this year, the board has already overturned Facebook's decisions twice as often as it has upheld them, a strong show of accountability and seriousness of purpose.

In one case, the board restored images of naked breasts in a post about breast cancer that an automated system had removed. The board chastised the platform for making the decision without "proper human oversight." In another, Facebook had removed a post that inaccurately quoted Nazi propaganda minister Joseph Goebbels and implicitly compared him to Trump. Because Goebbels is on Facebook's list of prohibited figures, the site took down the post even though it was critical of both Goebbels and Trump. "Any rules which restrict freedom of expression must be clear, precise and publicly accessible," reads the decision, which also faulted Facebook for failing to tell the user why his post had originally been taken down.

"I think there is actually a bit of a libertarian paradox here," John Samples, a vice president at the Cato Institute and a member of the Oversight Board, tells Reason in an interview. On the one hand, he says, there's an interest in no content moderation at all, leaving decisions about what appears on Facebook (or any other site) to end users, who can control what they engage with via filters and other tools. On the other hand, social media platforms are corporations trying to maximize shareholder value and grow their user bases by creating particular types of environments. A private company moderating content on its own platform, he says, is fully in keeping with libertarian principles.

Calls to take away the rights of platforms to police their own sites are growing across the political spectrum. Republicans in Florida are trying to restrict social media platforms from banning users while Democrats in Colorado are pushing bans on hate speech. Progressives like Sen. Elizabeth Warren (D–Mass.) and conservatives like Sen. Josh Hawley (R–Mo.) talk about treating social media platforms as common carriers, akin to telephone companies, that would be tightly regulated by the government.

Samples argued in a 2019 policy analysis that government regulation of social media is likely "unconstitutional" because it violates "the individual's right to free speech." While he believes that content moderation is inevitable because platforms want to create a particular sort of community, he also stresses that it is very hard to do consistently at scale. Facebook, he says, can't and shouldn't be all things to all people; the best outcomes will always come from having a variety of platforms doing different things in different ways, so that people can pick and choose among alternatives.

"You don't want one set of rules for the entire world necessarily, but you also need different platforms, different kinds of entities. I would like to see difference rather than sameness," Samples says.

Arguments for government regulation may well carry the day. If they do, we'll all be more limited in not just what we'll be able to say but where we can say it.

Produced by Noor Greene; written and narrated by Nick Gillespie; audio by Ian Keyser; additional graphics by Paul Detrick.

Photos: Photo 43805645 © Dolphfyn | Dreamstime.com, ID 124963884 © Mohamed Ahmed Soliman | Dreamstime.com, ID 29659925 © Photographer London | Dreamstime.com, ID 179599250 © Frédéric Legrand | Dreamstime.com, Bill Clark/CQ Roll Call/Newscom, Joel Marklund/ZUMA Press/Newscom, Lev Radin/ZUMA Press/Newscom, Jeremy Hogan/Polaris/Newscom, Toby Melville/Reuters/Newscom, Richard B. Levine/Newscom, Lev Radin/Pacific Press/Newscom, Douliery Olivier/ABACA/Newscom, Abaca Press/Douliery, Olivier/Abaca/Sipa USA/Newscom, Graeme Jennings/Pool via CNP/ZUMA Press/Newscom, Pool/CNP via INSTARimages/Cover Images/Newscom, Niall Carson/ZUMA Press/Newscom