Facebook Has Every Right To Ban Louis Farrakhan and Alex Jones. But It's Still a Bad Idea.
Even more worrying: New Zealand's leading media outlets are self-censoring coverage of the Christchurch mass shooting.
Facebook announced yesterday that it was permanently banning Louis Farrakhan, Alex Jones, Infowars, Milo Yiannopoulos, Paul Nehlen, Laura Loomer, and Paul Joseph Watson from Facebook and its sister service, Instagram. Each of these accounts, the company says, violated Facebook's community standards governing "Dangerous Individuals and Organizations." Among other things, these
do not allow any organizations or individuals that proclaim a violent mission or are engaged in violence, from having a presence on Facebook. This includes organizations or individuals involved in the following:
- Terrorist activity
- Organized hate
- Mass or serial murder
- Human trafficking
- Organized violence or criminal activity
We also remove content that expresses support or praise for groups, leaders, or individuals involved in these activities.
Yesterday's action was partly intended to combat a "loophole" that allowed Infowars and Jones, who had already been banned by the service, to keep appearing on it. Unofficial fan pages widely shared material from Infowars and Jones. As Wired explains, Facebook plans to stop them with a tool
usually reserved for terrorist organizations and self-proclaimed hate groups. The company says that users will be prohibited from sharing Infowars videos, radio clips, articles, and other content from the site unless they are explicitly condemning the material. Fan pages or groups that reshare Infowars content in a sincere or appreciative manner won't be permitted.
As a private platform, Facebook is completely within its legal rights to ban whomever it chooses for pretty much whatever reason it wants. But whether it's wise to start banning awful, stupid expression that falls short of true threats and other illegal activity is a very different matter. I'd argue that it's not, as it feeds into the tendency to try to suppress beliefs that one considers contemptible, dangerous, or evil. Those are not sharply delimited categories, and the tendency will be for more and more material to be seen as worthy of being policed, regulated, and eliminated. That is what's happening on many college campuses, and the results are not encouraging for a society that believes in freedom of expression.
The better solution for social media platforms is to give users robust tools to filter and control content, and to foster an online culture that is broadly tolerant of diverse viewpoints and conducive to new forms of media literacy. The promise of social media platforms, which are distinct from more traditional forms of publishing or distribution (such as an edited outlet like Reason.com), is that they allow us not just to express ourselves almost effortlessly but to tailor our feeds to infinitely individualized tastes. We lose some of the utopian potential of the internet every time a new clampdown on content or process takes place.
Facebook's action comes in a broader context in which support for free expression is slipping on both governmental and private fronts. In the wake of the Christchurch mosque shooting in March, for instance, the New Zealand government banned not only certain types of guns but possession of the manifesto written by alleged shooter Brenton Tarrant and footage of his killing spree (which was livestreamed briefly on Facebook before being taken down). A 22-year-old man has been arrested for distributing video of the shooting and faces up to 14 years in jail. Even more worrying, writes Jack Shafer at Politico, New Zealand's five top media outlets have voluntarily agreed to censor their coverage of Tarrant's trial.
The news organizations vow to limit coverage of statements "that actively champion white supremacist or terrorist ideology," avoid quoting the accused killer's "manifesto," and suppress any "message, imagery, symbols" or hand signs like a Nazi salute made by the accused or his supporters in support of white supremacy. "Where the inclusion of such signals in any images is unavoidable, the relevant parts of the image shall be pixelated," the guidelines add.
Shafer argues that paternalism barely begins to describe what's going on here:
The Kiwi editors don't appear to trust their readers and viewers to handle the difficult and disturbing material that's sure to billow out of the Tarrant trial. They regard New Zealanders as children who must be sheltered from the heinous and despicable lest they become tainted with its influence. That's essentially the position New Zealand Chief Censor [David] Shanks took when he banned the manifesto in March, saying that documents like it were aimed at a "vulnerable and susceptible" audience and designed to incite them to perform similar crimes. "There is content in [the manifesto] that points to means by which you can conduct other terrorist atrocities…it could be seen as instructional," Shanks said.
Although the New Zealand media have promised to cover the trial thoroughly, Shafer is right to suspect that reporters working "from a crouch" won't do a particularly good job. He's also correct to worry that "the journalistic groupthink encouraged by the pact could result in undercoverage of extremist movements and leave readers unnecessarily vulnerable."
Something similar is at work with Facebook's ban. The company seems worried that its users are impressionable people liable to be gulled into believing nonsense. It also probably just doesn't want to be associated with offensive content of all kinds. But nobody should assume that Facebook supports, say, Alex Jones just because it allows him to use its service, any more than his phone company supports him because he makes calls on its network.
We live in a moment when the tech giants themselves are explicitly calling for regulation, both to stave off certain forms of government intervention and to lock in current market positions. Politicians as different as House Speaker Nancy Pelosi (D–Calif.) and Sen. Josh Hawley (R–Mo.) are threatening to regulate online speech and to end the internet as we've known it by gutting Section 230 of the Communications Decency Act. In Europe, copyright law is being marshaled to stifle the free exchange of ideas and content in order to protect legacy entertainment companies. Massively overhyped fears about the influence of Russian trolls during the 2016 election have given rise to story after story about how stupid Facebook users are, a prelude to empowering the very sort of gatekeepers we supposedly left behind when we migrated en masse into cyberspace in the mid-1990s. An era is passing away, not because of the foulness of online trolls but because of the institutional response they've inspired.