The Volokh Conspiracy

Mostly law professors | Sometimes contrarian | Often libertarian | Always independent

Free Speech

Mastodon's Content-Moderation Growing Pains

[I asked Prof. Alan Rozenshtein (University of Minnesota) to write a post about Mastodon and one particular recent controversy related to it, and he very kindly agreed. -EV]

Ever since Elon Musk purchased Twitter, Mastodon, a decentralized microblogging platform, has seen millions of new users. I've written elsewhere about the architecture that makes Mastodon unique—specifically, each Mastodon server (known as an "instance") can choose its own content moderation standards, blocking whatever content, users, or even other instances it wants. This leads to what I've called "content moderation subsidiarity" and allows users to tailor their experience while still generally being able to follow and be followed by users on other instances.
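To make the mechanism concrete, here is a minimal sketch (my illustration, not anything from Mastodon's own codebase) of how an instance's moderation choices can be viewed from the outside. It assumes Python with the requests library and relies on the domain-blocks endpoint that Mastodon servers running version 4.0 or later can expose; many instances keep that list private, in which case the request simply fails.

```python
# Illustrative sketch: read the moderation decisions an instance chooses to publish.
# Assumes Mastodon 4.0+ and a server that makes its domain blocks public;
# otherwise the endpoint returns an error (often 404).
import requests

def published_domain_blocks(instance: str) -> list[dict]:
    """Fetch the publicly listed domain blocks for a Mastodon instance."""
    resp = requests.get(
        f"https://{instance}/api/v1/instance/domain_blocks", timeout=10
    )
    resp.raise_for_status()
    # Each entry typically includes the blocked domain, a severity
    # ("silence" limits visibility; "suspend" cuts off federation), and a comment.
    return resp.json()

if __name__ == "__main__":
    for block in published_domain_blocks("mastodon.social"):
        print(block.get("severity"), block.get("domain"), block.get("comment") or "")
```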

Mastodon thus represents a novel solution to the "content moderator's trilemma": the challenge of simultaneously (1) running a social media platform with a giant and diverse user base, (2) using one set of content moderation standards, and (3) avoiding widespread user dissatisfaction with the content moderation users experience. Centralized platforms like Twitter respond to the trilemma by accepting that user dissatisfaction is inevitable; decentralized platforms like Mastodon try to satisfy users by giving up on centralized moderation standards.

But it will take time—months, maybe years—for the millions of new Mastodon users to find, and in some cases create, the instances that best suit their needs. In the meantime, we should expect a difficult period of growing pains.

An illustrative example is the ongoing controversy at journa.host, an instance set up for journalists. Mike Pesca, host of The Gist (and formerly at Slate) and a member of the journa.host instance, posted a link to a recent New York Times story about the potential negative effects of treating transgender children and teenagers with puberty blockers, describing the story as "careful, thorough reporting." Parker Molloy, another journa.host user and herself a transgender woman, responded by strongly criticizing the piece and calling Pesca a "bigot" and an "anti-trans ghoul."

Pesca defended his original post and criticized Molloy's actions, writing, "you as a member of the activist community attack the article [because] of bad bedfellows, insult me and mischaracterize the story as Republicans Versus Doctors. That's plainly inaccurate." Journa.host suspended Pesca for "demeaning the professionalism of a journalist, based on their identity" (which is why the links above are to Pesca's Twitter feed, where he posted screenshots of the exchange).

Journa.host also suspended Molloy for "attacking the integrity of another [user], who is a moderator on the server," apparently referring to Molloy's calling Evan Urquhart, himself a transgender journalist who has criticized the New York Times article, a "bootlicker." It's unclear whether Molloy's insults directed at Pesca also factored into her suspension.

Either way, she has since apologized both to Pesca and to journa.host's moderators. Molloy has moved to a new instance, while Pesca remains suspended. (Although Pesca has described being "suspended from Mastodon," he has been suspended only from journa.host. Conflating the Mastodon network with individual Mastodon instances is a common, and understandable, mistake among new Mastodon users.)

Journa.host has been criticized from all sides. Some (myself included) believe that journa.host should not have suspended Pesca for his comparatively restrained response to Molloy's attack. Others criticized journa.host for suspending Molloy and for not doing more to fight transphobia. Indeed, it has (unfairly, it seems to me) developed a reputation among some instances for being generally transphobic, to the point that several large instances have either limited or outright blocked their users' access to journa.host (a move Molloy has urged against).

The journa.host controversy illustrates how Mastodon's rapid growth is forcing it to confront one of its founding contradictions. On the one hand, simply by dint of Mastodon's decentralized nature and the reality that no content or user can be banned from the network entirely, it is categorically more protective of speech than is any other platform. In this sense, Mastodon, not Twitter, is (as Twitter once called itself) the "free speech wing of the free speech party."

On the other hand, much of the dominant culture of Mastodon—at least before the recent influx of Twitter users—has been supportive of fairly heavy moderation, especially of users and content viewed as far-right. As Mastodon's creator Eugen Rochko noted, "One of the things that gave impetus to the creation of Mastodon was a lack of moderation on Twitter against hate groups. The 'no nazis' rule of the original mastodon.social server … continues to serve as a major attraction of the project." The norm of aggressive moderation also reflects the fact that Mastodon, unlike most other social media platforms, originated not in the United States but in Europe (specifically Germany, where the Mastodon nonprofit is based), whose legal and cultural norms are more accepting of speech restrictions.

Another example of Mastodon trying to accomplish through norms what it has foreclosed as a matter of architecture is the "Mastodon server covenant," which was introduced in 2019. The covenant is a set of requirements that instances have to meet to be listed on the Mastodon organization's website.

These requirements include "active moderation against racism, sexism, homophobia and transphobia" such that users "have the confidence that they are joining a safe space, free from white supremacy, anti-semitism and transphobia of other platforms." But the covenant does not—indeed cannot—require any particular Mastodon instance to abide by these requirements. A non-compliant instance will not be listed on the Mastodon organization's website, but it remains a full-fledged member of the Mastodon network.

As Mastodon grows—and especially as it incorporates users who may be unsatisfied with its existing norms—this contradiction will remain a source of tension. In some cases, the network has managed to quarantine instances that deviate from the dominant norms, as happened when the alt-right platform Gab migrated to Mastodon. But enough new users might overwhelm Mastodon's dominant culture, especially if those users leave or refuse to join instances that they view as overly censorious.

The question, and we will simply have to wait for the answer, is whether the system achieves equilibrium around a large core of instances that are willing to communicate despite calls for defederation, or whether the network instead balkanizes into an ever-shifting circular firing squad of mutual blocking that is difficult to navigate and unrewarding to be a part of. If the latter, Mastodon may turn off too many users to ever truly be a viable Twitter competitor.
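For readers who like to see the dynamic spelled out, here is a toy model (my own crude illustration, with made-up instance names, not anything drawn from Mastodon's codebase): treat instances as nodes, treat "neither side suspends the other" as an edge, and ask whether the resulting graph has one large connected core or many small fragments. Real federation is messier (reachability is not transitive, and "limiting" is softer than suspension), but the sketch captures the core-versus-fragments question.

```python
# Toy model: instances are nodes; an edge means neither side has suspended the other.
# Connected components then approximate which instances can still reach one another.
from itertools import combinations

instances = {"a.example", "b.example", "c.example", "d.example"}  # hypothetical
blocks = {  # hypothetical suspension lists: who has cut off whom
    "a.example": {"d.example"},
    "b.example": set(),
    "c.example": set(),
    "d.example": {"a.example", "b.example", "c.example"},
}

def federated(x: str, y: str) -> bool:
    """Two instances exchange posts only if neither has suspended the other."""
    return y not in blocks[x] and x not in blocks[y]

def components() -> list[set[str]]:
    """Cluster instances into groups that remain mutually reachable."""
    parent = {i: i for i in instances}
    def find(i: str) -> str:
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path halving
            i = parent[i]
        return i
    for x, y in combinations(instances, 2):
        if federated(x, y):
            parent[find(x)] = find(y)  # merge the two clusters
    clusters: dict[str, set[str]] = {}
    for i in instances:
        clusters.setdefault(find(i), set()).add(i)
    return list(clusters.values())

print(components())  # one three-instance core plus an isolated d.example
```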

Either way, the fights will be messy and public. But I suspect that Churchill's observation about democracy is true also of content moderation on a global internet: the decentralized option may well be the worst—except for all the rest.