In a recent interview on the Recode podcast, Mark Zuckerberg defended his decision to allow Holocaust deniers and professional trolls such as InfoWars' Alex Jones to keep using Facebook. Noting his Jewish heritage, Zuckerberg said:
What we will do is we'll say, "Okay, you have your page, and if you're not trying to organize harm against someone, or attacking someone, then you can put up that content on your page, even if people might disagree with it or find it offensive."
He also said that Facebook doesn't "have a responsibility to make it widely distributed in News Feed," meaning that the platform would limit the reach of postings its administrators found offensive.
This isn't a question about First Amendment rights or government censorship. Facebook is a private company and can set the rules however it wants. But what's the better policy from a libertarian perspective, one that simultaneously champions free expression, calls out bullshit when it sees it, and is mindful that you might alienate your best customers by letting a few idiots foul your place of business?
In the latest Reason Podcast, Robby Soave argues that Zuckerberg made the right call and defends Facebook's policy. He's written that
Policing hate on a very large scale is quite difficult given the frequently subjective nature of offense; we risk de-platforming legitimate viewpoints that are unpopular but deserve to be heard; and ultimately, silencing hate is not the same thing as squelching it.
Mike Riggs takes a very different point of view. He argues that by allowing obviously false and controversial material to be posted, Facebook is degrading the experience for the 99.9 percent of its users who aren't trolls. The failure to chuck out such material may explain why Facebook is suffering a decline in daily users in the United States and Canada, its most profitable region.
Subscribe, rate, and review our podcast on iTunes. Listen at SoundCloud below:
Don't miss a single Reason Podcast! (Archive here.)