
The FTC's Probe Into 'Potentially Illegal' Content Moderation Is a Blatant Assault on the First Amendment

In the name of "restoring freedom of speech," FTC Chairman Andrew Ferguson wants to override the editorial judgments of social media platforms.

Jacob Sullum | 5.21.2025 3:15 PM

FTC Chairman Andrew Ferguson (FTC)

Today is the deadline for public comments regarding a "public inquiry" by the Federal Trade Commission (FTC) into the "potentially illegal" content moderation practices of social media platforms. As many of those comments note, that investigation impinges on the editorial discretion that the U.S. Supreme Court has repeatedly said is protected by the First Amendment.

"Tech firms should not be bullying their users," FTC Chairman Andrew Ferguson said when the agency launched its probe in February. "This inquiry will help the FTC better understand how these firms may have violated the law by silencing and intimidating Americans for speaking their minds."

Ferguson touts his investigation as a blow against "the tyranny of Big Tech" and "an important step forward in restoring free speech." His chief complaint is that "Big Tech censorship" discriminates against Republicans and conservatives. But even if that were true, there would be nothing inherently illegal about it.

The FTC suggests that social media companies may be engaging in "unfair or deceptive acts or practices," which are prohibited by Section 5 of the Federal Trade Commission Act. To substantiate that claim, the agency asked for examples of deviations from platforms' "policies" or other "public-facing representations" concerning "how they would regulate, censor, or moderate users' conduct." It wanted to know whether the platforms had applied those rules faithfully and consistently, whether they had revised their standards, and whether they had notified users of those changes.

If platforms fall short on any of those counts, the FTC implies, they are violating federal law. But that position contradicts both the agency's prior understanding of its statutory authority and the Supreme Court's understanding of the First Amendment.

The FTC's authority under Section 5 "does not, and constitutionally cannot, extend to penalizing social media platforms for how they choose to moderate user content," Ashkhen Kazaryan, a senior legal fellow at the Future of Free Speech, argues in a comment that the organization submitted on Tuesday. "Platforms' content moderation policies, even if controversial or unevenly enforced, do not fall within the scope of deception or unfairness as defined by longstanding FTC precedent or constitutional doctrine. Content moderation practices, whether they involve the removal of misinformation, the enforcement of hate speech policies, or the decision to abstain from moderating content users don't want to see, do not constitute the type of economic or tangible harm the unfairness standard was designed to address. While such policies may be the subject of vigorous public debate, they do not justify FTC intervention."

The FTC says "an act or practice is 'unfair' if it 'causes or is likely to cause substantial injury to consumers which is not reasonably avoidable by consumers themselves and not outweighed by countervailing benefits to consumers or to competition'" (emphasis in the original). "In most cases," the FTC explains, "a substantial injury involves monetary harm, as when sellers coerce consumers into purchasing unwanted goods or services or when consumers buy defective goods or services on credit but are unable to assert against the creditor claims or defenses arising from the transaction. Unwarranted health and safety risks may also support a finding of unfairness."

It is not obvious how that standard applies to, say, a Facebook user who complains that the platform erroneously or unfairly deemed one of his posts misleading. Nor does the FTC's long-established definition of "deception" easily fit the "Big Tech censorship" to which Ferguson objects.

The FTC says "deception" requires "a representation, omission or practice that is likely to mislead the consumer." It mentions several examples of "practices that have been found misleading or deceptive," including "false oral or written representations, misleading price claims, sales of hazardous or systematically defective products or services without adequate disclosures, failure to disclose information regarding pyramid sales, use of bait and switch techniques, failure to perform promised services, and failure to meet warranty obligations."

To justify FTC action, consumers must reasonably rely on a deceptive representation, omission, or practice, which must be "material," meaning it is "likely to affect the consumer's conduct or decision with regard to a product or service." In that situation, the FTC says, "consumer injury is likely, because consumers are likely to have chosen differently but for the deception."

This definition also poses puzzles in the context of social media moderation. Suppose a YouTube user complains that the platform has arbitrarily imposed age restrictions on access to his videos. If he knew that was going to happen, he says, he would have "chosen differently," meaning he would have picked a competing video platform instead of investing time and effort in building his YouTube channel.

Does that constitute the sort of "consumer injury" that the FTC Act was meant to address? It seems doubtful, especially since YouTube is free, so using it does not entail purchasing "a product or service."

The complaints generated by the FTC's "request for public comment" illustrate the problems with trying to treat content moderation decisions as violations of Section 5. "In 2020," says one, "I was posting about [Donald] Trump, memes and such. Also about the vaccines and CoVid being a money grab. I was put in Facebook jail and put on restriction several times for 'misinformation.' I quit Facebook because of this. I miss seeing my family and friends' life adventures but I will not be silenced because of lies."

Around the same time, another commenter reports, "I lost my Facebook AND Twitter accounts for supporting Donald Trump. I DID NOT [write] misleading, outrageous conspiracy-based posts, and didn't even post daily. I was just CANCELLED one day, with NO warnings or previous actions against me. My 79 year old mother, who has since passed, was treated the same."

We can be reasonably confident that Facebook and Twitter would have explained these decisions based on rationales other than outrage at expressions of support for Donald Trump. Does the FTC really plan to adjudicate such disputes, choosing between contending versions of what happened and deciding whether it contradicted the platforms' avowed policies?

Any attempt to police content moderation under this legal theory inevitably would interfere with decisions that the Supreme Court has said are constitutionally protected. Last July, the Court recognized that social media platforms, in deciding which speech to host and how to present it, are performing essentially the same function as newspapers that decide which articles to publish.

"Traditional publishers and editors," Justice Elena Kagan wrote in the majority opinion, "select and shape other parties' expression into their own curated speech products," and "we have repeatedly held that laws curtailing their editorial choices must meet the First Amendment's requirements." That principle, Kagan said, "does not change because the curated compilation has gone from the physical to the virtual world. In the latter, as in the former, government efforts to alter an edited compilation of third-party expression are subject to judicial review for compliance with the First Amendment."

That decision involved Florida and Texas laws that, like Ferguson's dubious assertion of regulatory authority, aimed to fight "Big Tech censorship" by restricting content moderation. "Texas does not like the way those platforms are selecting and moderating content, and wants them to create a different expressive product, communicating different values and priorities," Kagan observed. "But under the First Amendment, that is a preference Texas may not impose."

Ferguson is attempting something similar by suggesting that social media platforms may be engaging in "unfair or deceptive" trade practices when they "deny or degrade" users' "access to services" based on "the content of users' speech." In practice, ensuring "fair" treatment of users means overriding editorial decisions that the FTC deems opaque, unreasonable, inconsistent, or discriminatory.

Ferguson's avowed goal is to increase the diversity of opinions expressed on social media. Like Texas, he wants platforms to offer "a different expressive product" that better fits his personal preferences.

"Holding platforms liable under Section 5 for content moderation policies would necessarily intrude upon their editorial judgment," Kazaryan notes. "The First Amendment not only protects the right to speak but also the right not to speak and to curate content. The Supreme Court has never held that editorial discretion must be evenly or flawlessly applied to qualify for constitutional protection."

The FTC also suggests that content moderation practices "affect competition, may have resulted from a lack of competition, or may have been the product of anti-competitive conduct." But Kazaryan notes that platforms compete based on different approaches to moderation. "The existence of platforms such as Rumble, Mastodon, Substack, Truth Social, and Bluesky," he writes, "demonstrates that users have choices in moderation environments."

Those environments also evolve over time based on business judgments or changes in ownership. "Under its previous leadership, Twitter developed strict rules against misinformation and hate speech," Kazaryan notes. "Following Elon Musk's acquisition, the platform reassessed those policies and relaxed many of them, allowing for broader latitude in political and ideological speech. Some saw this as irresponsible. Others viewed it as a welcome rebalancing in favor of free expression. Both views are valid. But neither justifies government intervention. The fact that a private entity revised its speech rules to reflect the views of new ownership is not a violation of law; it is a demonstration of First Amendment rights in action."

Kazaryan also cites changes in moderation policies at Meta, which this year switched "from a top-down enforcement model to a new community fact-checking system that lets users add context to viral posts through crowd-sourced notes" on Facebook and Instagram. And he notes that YouTube has revised its "moderation policies on election and health information in light of shifting scientific consensus and public debate."

None of those changes "are inherently deceptive, unfair, or anticompetitive," Kazaryan writes. "A platform's decision to use a top-down moderation system or a community notes model is a design choice and an editorial judgment that the Supreme Court recognizes as protected by the First Amendment."

Kazaryan also questions the premise that social media are systematically biased against right-of-center views. "Conservative accounts, influencers, and news sources have reached massive audiences across all major social media platforms," he notes. "Data from the last several years shows how right-leaning voices have successfully promoted their perspectives online."

Kazaryan backs up that assessment with several pieces of evidence. In the final quarter of 2019, for example, Breitbart's Facebook page "racked up more likes, comments, and shares" than The New York Times, The Washington Post, The Wall Street Journal, and USA Today combined. Kazaryan adds that President Donald Trump's "own social media presence remains unmatched; his accounts across platforms like X (formerly Twitter), Facebook, and Truth Social collectively boast nearly 170 million followers, significantly outpacing his political rivals."

A 2020 Media Matters study, Kazaryan notes, "found that right-leaning pages garnered more total interactions than both left-leaning and non-aligned pages." A 2021 study published in the Proceedings of the National Academy of Sciences "revealed that Twitter's algorithmic amplification favored right-leaning news sources over left-leaning ones in six out of seven countries studied, including the United States." A 2024 Pew Research Center study of "news influencers" on Facebook, Instagram, TikTok, X, and YouTube found they were "more likely to identify with the political right than the left."

Even if you don't find this evidence persuasive, there is a fundamental contradiction between Ferguson's basic beef about "Big Tech censorship"—that "these firms" are "silencing and intimidating Americans for speaking their minds"—and the main legal theory he is floating. Ferguson thinks social media platforms should treat all users equally, without regard to the opinions they express. But his argument that they are guilty of "unfair or deceptive" trade practices hinges on the premise that they are surreptitiously suppressing politically or ideologically disfavored content while claiming to be evenhanded. If they openly discriminated against conservatives, there would be no grounds for FTC intervention under Section 5 even based on Ferguson's improbably broad reading of that provision.



Jacob Sullum is a senior editor at Reason.

