The Volokh Conspiracy



Social Media Platforms' Many Functions: Hosting, Recommendation, Conversation, and More


Another excerpt from my Social Media as Common Carriers? article (see also this thread).

[* * *]

Of course, like many other businesses, social media platforms (and other Big Tech infrastructure providers) have multiple functions.[100]

  1. So far, I've focused on their hosting function—their letting a user post material on what is seen as the user's own page, and delivering that material to people who deliberately visit that page or subscribe to its feed in reverse chronological order mode.[101] Some Big Tech companies are entirely or almost entirely about the hosting function (consider WordPress, or Amazon Web Services). Domain name registrars, such as GoDaddy, also offer something close to the hosting function, though they don't do the hosting themselves.
  2. Social media platforms also often provide what I call the "recommendation function," for instance when they include a certain account or post in a news feed that they curate, or in a list of "trending," "recommended," or "you might enjoy" items. Other Big Tech features, such as Google's search or Google News, are almost entirely about this function.
  3. And social media platforms also often provide what I call the "conversation function," when they allow users to comment on each other's posts.

It seems to me that the case for common carrier status is strongest (whether or not one thinks it's strong enough) as to the hosting function, which is close to what phone companies and UPS and FedEx do. The decision to remove an account or delete a post, and thus to interfere with the author's communication to those who deliberately subscribe to the account, is similar to a phone company's decision to cancel a phone line. It seems at least potentially reasonable to impose a common carrier requirement that prevents such decisions.[102]

And this is especially so when the hosted material is made visible just to people who seek it out, for instance when Twitter lets people's posts be seen by their followers, or Facebook lets people visit someone's web page, or YouTube lets people watch a video, or WordPress lets people visit a WordPress-hosted blog. I may think lots of material out there is "terrible,"[103] but that needn't interfere with my ability to visit good content, just as the existence of terrible books doesn't keep me from reading good ones.[104] (Indeed, Twitter, unlike Facebook, allows porn feeds, though it labels them "sensitive material." To my knowledge, Twitter users who don't want to see the porn generally don't run across it by accident, and they don't find the utility of Twitter diminished by the fact that some other people are using Twitter to view porn.)

On the other hand, the case for editorial discretion—including for a First Amendment right to exercise such discretion, which I discuss in the next Part—is strongest as to the recommendation function, which is close to what newspapers, broadcasters, and bookstores do.

What about platforms' exercising their conversation function—for instance, blocking certain comments posted to others' pages, or blocking users because they had repeatedly posted comments that violated platform rules?

This is particularly important when it comes to spam (generally defined as mass off-topic posts, usually involving commercial advertising but sometimes political advertising as well). Indeed, unchecked spam could make the comment sections of platforms virtually impossible to use, as readers would have to slog through many spam comments to find each real one.[105]

Yet it can also apply to other kinds of comments, such as personal insults (bigoted or otherwise) and vulgarities, whether addressed to the page owner or to other commenters. As people who have tried to manage conversations know, such messages can often ruin the conversation, as other readers and commenters decide just to stop participating rather than deal with the nastiness.[106]

Generally speaking, platforms give users the tools to deal with this problem: a user can block any commenter from commenting on the user's own page, for any reason, without special approval from the platform. Much of the conversation policing on social media is done through this largely uncontroversial mechanism. And of course platforms encourage people to use this mechanism, since it involves minimal work and decisionmaking for them—hence, for instance, the "What should I do if I'm being bullied, harassed or attacked by someone on Facebook?" page offers these recommendations as its first options:

Things you can do on Facebook

Facebook offers these tools to help you deal with bullying and harassment. Depending on the seriousness of the situation:

  • Unfriend the person. Only your Facebook friends can contact you through Facebook chat or post on your timeline.
  • Block the person. This will prevent the person from adding you as a friend and viewing things you share on your timeline.
  • Report the person or any abusive things they post.[107]

But sometimes unwanted commenters can get more insistent, for instance creating new accounts under new names, and then the social media platforms may block those people altogether (though even this is hardly fully reliable, precisely because people can create new accounts).

Note the differences between this sort of policing of comments (which I'll use as shorthand for restricting speech posted on others' pages without their explicitly seeking it out) and policing of hosted content (which I'll use as shorthand for restricting speech posted on a person's page or feed that will be visible to those who do affirmatively seek it out). When it comes to comments, you may indeed want to rely on the platform—much as you rely on a newspaper—to save you from information overload, especially spam overload but also trash-talking. And when it comes to comments, the platform may therefore valuably provide a coherent speech product: a curated conversation, with the useful contributions included and the toxic ones eliminated.

But typical readers who want to see a particular author's account aren't really seeking an edited version of that account from the platform. The readers generally know what they'd be getting from that account, and can just unsubscribe or stop visiting if they don't like it.

There might thus be reason to leave platforms free to moderate comments (rather than just authorizing users to do that for their own pages), even if one wants to stop platforms from deleting authors' pages or authors' posts from their pages. And platform authority to restrict comments is much less likely to affect public debate than would platform authority to delete pages or posts.

[100] This multiplicity of functions has long been recognized, even as to the early Internet. See, e.g., Eric Goldman (writing as Eric Schlachter), Cyberspace, the Free Market and the Free Marketplace of Ideas: Recognizing Legal Differences in Computer Bulletin Board Functions, 16 Hastings Comm/Ent L.J. 87 (1993).

[101] See Twitter, About Your Twitter Timeline, https://perma.cc/9FBC-AYLU; Twitter, Find Your Way Around Twitter, https://perma.cc/TEW3-ZKA7; Lucas Matney, Twitter Rolls Out 'Sparkle Button' to Let Users Hide the Algorithmic Feed, TechCrunch (Dec. 18, 2018, 9:03 AM), https://perma.cc/7VQ9-DJQ2.

[102] If a company has many competitors to whom one can easily switch without losing access to one's audience—for instance, a domain name registrar or a web hosting company—there may be less reason for common carrier treatment, since that company will have less power to use its deplatforming decisions to influence public debate. On the other hand, there is danger that many companies (even without antitrust law) will all be pressured to deplatform sufficiently unpopular views; and while switching might not be that hard for an established media operation, it might be much harder for ordinary citizens. See supra p. 13. Perhaps because of this, common carrier status isn't limited to monopolies, and applies, for instance, to cell phone companies (which famously compete with each other) and to UPS and FedEx (which compete with each other and with the U.S. Postal Service).

[103] [Add citation.]

[104] In the words of Theodore Sturgeon, "ninety percent of everything is crap," Oxford English Dictionary, https://perma.cc/XUR2-MGUR, whether it's books or web pages. So long as I don't need to read what I view as terrible, I don't think it should matter that others who seek it out (perhaps because they don't think it's so terrible) can read it on the same site.

[105] Cf. Genevieve Lakier, Corporate Constitutionalism and Freedom of Speech in the Digital Public Sphere (forthcoming 2021) (cautiously endorsing such content policing, on the grounds that it "enables … a marketplace of ideas for ideas about freedom of speech" and "enables users to experience, and in some cases, help to create, different kinds of speech communities, organized around different speech rules").

[106] See generally Eugene Volokh, Freedom of Speech in Cyberspace from the Listener's Perspective: Private Speech Restrictions, Libel, State Action, Harassment, and Sex, 1996 U. Chi. Legal F. 377, 398–401.

[107] Facebook, Help Center: What Should I Do If I'm Being Bullied, Harassed or Attacked by Someone on Facebook? (2021), https://perma.cc/33P7-N2EQ.