California Social Media Platform Reporting Mandate Likely Violates the First Amendment
The mandate required platforms, among other things, to report to the state "how the terms of service define and address (a) hate speech or racism; (b) extremism or radicalization; (c) disinformation or misinformation; (d) harassment; and (e) foreign political interference, as well as statistics on content that was flagged by the social media company as belonging to any of the categories."
From today's Ninth Circuit opinion in X Corp. v. Bonta, decided by Judge Milan Smith, joined by Judges Mark Bennett and Anthony Johnstone:
AB 587 … [among other things requires] that social media companies submit to the State a semiannual report detailing their TOS and content-moderation practices, including, if at all, how the terms of service define and address (a) hate speech or racism; (b) extremism or radicalization; (c) disinformation or misinformation; (d) harassment; and (e) foreign political interference, as well as statistics on content that was flagged by the social media company as belonging to any of the categories (TOS Report) …. [W]e refer to these … as the Content Category Report provisions.
X Corp. is likely to succeed in showing that the Content Category Report provisions facially violate the First Amendment….
[T]he Content Category Reports are not commercial speech. They require a company to recast its content-moderation practices in language prescribed by the State, implicitly opining on whether and how certain controversial categories of content should be moderated. As a result, few indicia of commercial speech are present in the Content Category Reports.
First, the Content Category Reports do not satisfy the "usual[ ] defin[ition]" of commercial speech—i.e., "speech that does no more than propose a commercial transaction." The State appears to concede as much in its answering brief.
To the extent our circuit has recognized exceptions to that general rule, those exceptions are limited and are inapplicable to the Content Category Reports here. For example, as identified by the First Amendment and Internet Law Scholars amici, we have characterized the following speech as commercial even if not a clear fit with the Supreme Court's above articulation: (i) targeted, individualized solicitations; (ii) contract negotiations; and (iii) retail product warnings. Though none of it directly or exclusively proposes a commercial transaction, all of this speech communicates the terms of an actual or potential transaction. But the Content Category Reports go further: they express a view about those terms by conveying whether a company believes certain categories should be defined and proscribed.
Second, the Content Category Reports fail to satisfy at least two of the three Bolger v. Youngs Drug Products (1983) factors. The compelled disclosures are not advertisements. And because the Content Category Reports do not merely disclose existing commercial speech, a social media company has no economic motivation in their content. The district court found the same. The State does not dispute the district court's finding on appeal. Although the Bolger factors are not dispositive, they are "important guideposts" to the analysis and, here, further support the conclusion that the compelled speech is non-commercial.
Third, while a social media platform's existing TOS and content moderation policies may be commercial speech, its opinions about and reasons for those policies are different in character and kind. The Content Category Report provisions would require a social media company to convey the company's policy views on intensely debated and politically fraught topics, including hate speech, racism, misinformation, and radicalization, and also convey how the company has applied its policies.
The State suggests that this requirement is subject to lower scrutiny because "it is only a transparency measure" about the product. But even if the Content Category Report provisions concern only transparency, the relevant question here is: transparency into what? Even a pure "transparency" measure, if it compels non-commercial speech, is subject to strict scrutiny. That is true of the Content Category Report provisions. Insight into whether a social media company considers, for example, (1) a post citing rhetoric from on-campus protests to constitute hate speech; (2) reports about a seized laptop to constitute foreign political interference; or (3) posts about election fraud to constitute misinformation is sensitive, constitutionally protected speech that the State could not otherwise compel a social media company to disclose without satisfying strict scrutiny.
The mere fact that those beliefs are memorialized in the company's content moderation policy does not, by itself, convert expression about those beliefs into commercial speech. As X Corp. argues in its reply brief, such a rule would be untenable. It would mean that basically any compelled disclosure by any business about its activities would be commercial and subject to a lower tier of scrutiny, no matter how political in nature. Protection under the First Amendment cannot be vitiated so easily.
{For substantially the same reason, the test for whether speech is commercial or non-commercial cannot turn on whether the speech is "directed to potential consumers and may presumably play a role in the decision of whether to use the platform," as the district court seemed to suggest. Consider, for example, a state law that compels a social media company to disclose the political affiliations of its managers. That information could conceivably "play a role in the [potential consumer's] decision of whether to use the platform"—i.e., if the consumer is concerned about the platform's content being politically skewed. It could not be that such a law compels only commercial speech subject to a lower tier of scrutiny.} …
[N]either the Fifth nor Eleventh Circuit [in the NetChoice cases] dealt with speech similar to the Content Category Reports. Unlike Texas HB 20 or Florida SB 7072, the Content Category Report provisions compel social media companies to report whether and how they believe particular, controversial categories of content should be defined and regulated on their platforms. Neither the Texas nor Florida provisions at issue in the NetChoice cases require a company to disclose the existence or substance of its policies addressing such categories. Though perhaps relevant to an analysis of [other sections], these cases are unhelpful on the issue of the Content Category Reports and offer no compelling reason to apply Zauderer.
For these reasons, we conclude that the Content Category Report provisions compel non-commercial speech. Because the provisions are content-based, which the State does not contest, they are subject to strict scrutiny….
At minimum, the Content Category Report provisions likely fail under strict scrutiny because they are not narrowly tailored. They are more extensive than necessary to serve the State's purported goal of "requiring social media companies to be transparent about their content-moderation policies and practices so that consumers can make informed decisions about where they consume and disseminate news and information." Consumers would still be meaningfully informed if, for example, a company disclosed whether it was moderating certain categories of speech without having to define those categories in a public report. Or, perhaps, a company could be compelled to disclose a sample of posts that have been removed without requiring the company to explain why or on what grounds.
{We do not opine on whether such laws would survive constitutional scrutiny. They are offered only to illustrate that the Content Category Report provisions are not narrowly tailored to the State's interest.}
In any event, the State does not attempt to argue that the law survives strict scrutiny. For the reasons above, X Corp. has shown a likelihood of success on the merits of its First Amendment claim as to [these provisions] ….
Joel L. Kurtzberg, Floyd Abrams, Jason D. Rozbruch, and Lisa J. Cole (Cahill Gordon & Reindel LLP) and William R. Warne and Meghan M. Baker (Downey Brand LLP) represent X. Note that Gene C. Schaerr (my colleague at Schaerr Jaffe LLP) filed an amicus brief in this case, on X's side, for Protect the First Foundation and for me, as amici.