On January 7, 2021, Facebook indefinitely suspended the account of President Donald Trump, saying he had violated the platform's "Community Standards" the day before by calling Capitol Hill rioters "great patriots" even as they were engaged in a violent demonstration. In another post that same day, Trump pushed "an unfounded narrative of electoral fraud and persistent calls to action" while pro-Trump rioters rampaged in the halls of Congress. Trump's account, which once boasted 35 million followers and was routinely one of the most popular pages on Facebook, has remained frozen since. (Trump was also banned from Facebook's sister app, Instagram.)
Facebook's management immediately referred its decision to the platform's Oversight Board, an independent group of 20 people who have the authority to overturn any suspension on the site. On May 5, the board ruled that Facebook applied its standards correctly by suspending Trump's account but that it erred in making the suspension indefinite. "It is not permissible for Facebook to keep a user off the platform for an undefined period, with no criteria for when or whether the account will be restored," reads the decision. "In applying this penalty, Facebook did not follow a clear, published procedure." The board gave Facebook up to six months either to explain why Trump will not be allowed back on the platform or to lay out the criteria by which he will be let back on.
John Samples is a political scientist and a vice president at the libertarian Cato Institute. He's also a member of Facebook's Oversight Board (go here for a list of all members). He tells Nick Gillespie how the group came to its conclusions and why he thinks the Oversight Board is an excellent example of private-sector governance of cyberspace. He also discusses his 2019 policy analysis "Why the Government Should Not Regulate Content Moderation of Social Media," casts a skeptical eye on attempts by liberal and conservative politicians to recast Facebook, Twitter, and YouTube as common carriers subject to intense governmental oversight, and explains why a single standard of content moderation is both unwise and impossible.
"Despite the fact that I'm on Facebook's Oversight Board, I worry about a single unified set of content moderation [principles]," says Samples, who argues that a multiplicity of platforms and approaches to speech makes more sense from a libertarian perspective. "You don't want one set of rules for the entire world, but you also may need different platforms, different kinds of entities providing these services. I wouldn't necessarily want to see what's good for Facebook spread everywhere in exactly the same form."