The Volokh Conspiracy

Arkansas Social Media Age Verification Law Likely Violates First Amendment

So concludes a federal judge, issuing a preliminary injunction against enforcement of the law.

From Judge Timothy Brooks' opinion yesterday in NetChoice, LLC v. Griffin (W.D. Ark.):

This case presents a constitutional challenge to Arkansas Act 689 of 2023, the "Social Media Safety Act" …, a new law that aims to protect minors from harms associated with the use of social media platforms. Act 689 … requires social media companies to verify the age of all account holders who reside in Arkansas. Self-reporting one's age (a common industry practice) is not sufficient; Arkansans must submit age-verifying documentation before accessing a social media platform.

Under Act 689, a "social media company," as defined in the Act, must outsource the age-verification process to a third-party vendor. A prospective user of social media must first prove their age by uploading a specified form of identification, such as a driver's license, to the third-party vendor's website. A verified adult may obtain a social media account. Minors, however, will be denied an account and prohibited from accessing social media platforms, unless a parent provides express consent—which will require more proof to confirm the parent's age, identity, and relationship to the minor….

The court held that the law likely violated the First Amendment:

Deciding whether Act 689 is content-based or content-neutral turns on the reasons the State gives for adopting the Act. First, the State argues that the more time a minor spends on social media, the more likely it is that the minor will suffer negative mental-health outcomes, including depression and anxiety. Second, the State points out that adult sexual predators on social media seek out minors and victimize them in various ways. Therefore, to the State, a law limiting access to social media platforms based on the user's age would be content-neutral and require only intermediate scrutiny.

On the other hand, the State points to certain speech-related content on social media that it maintains is harmful for children to view. Some of this content is not constitutionally protected speech, while other content, though potentially damaging or distressing, especially to younger minors, is likely protected nonetheless. Examples of this type of speech include depictions and discussions of violence or self-harming, information about dieting, so-called "bullying" speech, or speech targeting a speaker's physical appearance, race or ethnicity, sexual orientation, or gender. If the State's purpose is to restrict access to constitutionally protected speech based on the State's belief that such speech is harmful to minors, then arguably Act 689 would be subject to strict scrutiny.

During the hearing, the State advocated for intermediate scrutiny and framed Act 689 as "a restriction on where minors can be," emphasizing it was "not a speech restriction" but "a location restriction." The State's briefing analogized Act 689 to a restriction on minors entering a bar or a casino. But this analogy is weak. After all, minors have no constitutional right to consume alcohol, and the primary purpose of a bar is to serve alcohol. By contrast, the primary purpose of a social media platform is to engage in speech, and the State stipulated that social media platforms contain vast amounts of constitutionally protected speech for both adults and minors. Furthermore, Act 689 imposes much broader "location restrictions" than a bar does….

Having considered both sides' positions on the level of constitutional scrutiny to be applied, the Court tends to agree with NetChoice that the restrictions in Act 689 are subject to strict scrutiny. However, the Court will not reach that conclusion definitively at this early stage in the proceedings and instead will apply intermediate scrutiny, as the State suggests. Under intermediate scrutiny, a law must be "narrowly tailored to serve a significant governmental interest[,]" which means it must advance that interest without "sweep[ing] too broadly" or chilling more constitutionally protected speech than is necessary, and it must not "raise serious doubts about whether the statute actually serves the state's purported interest" by "leav[ing] [out]" and failing to regulate "significant influences bearing on the interest."

Since Act 689 clearly serves an important governmental interest, the Court will address whether the Act burdens adults' and/or minors' access to protected speech and whether the Act is narrowly tailored to burden as little speech as possible while effectively serving the State's interest in protecting minors online.

Burdens on Adults' Access to Speech …

Requiring adult users to produce state-approved documentation to prove their age and/or submit to biometric age-verification testing imposes significant burdens on adult access to constitutionally protected speech and "discourage[s] users from accessing [the regulated] sites." Age-verification schemes like those contemplated by Act 689 "are not only an additional hassle," but "they also require that website visitors forgo the anonymity otherwise available on the internet." …

Burdens on Minors' Access to Speech

The Supreme Court instructs:

[M]inors are entitled to a significant measure of First Amendment protection, and only in relatively narrow and well-defined circumstances may government bar public dissemination of protected materials to them. No doubt a State possesses legitimate power to protect children from harm, but that does not include a free-floating power to restrict the ideas to which children may be exposed. Speech that is neither obscene as to youths nor subject to some other legitimate proscription cannot be suppressed solely to protect the young from ideas or images that a legislative body thinks unsuitable for them.

Neither the State's experts nor its secondary sources claim that the majority of content available on the social media platforms regulated by Act 689 is damaging, harmful, or obscene as to minors. And even though the State's goal of internet safety for minors is admirable, "the governmental interest in protecting children does not justify an unnecessarily broad suppression of speech addressed to adults."

Act 689 is Not Narrowly Tailored

The Court first considers the Supreme Court's narrow-tailoring analysis in Brown v. Entertainment Merchants Association, which involved a California law prohibiting the sale or rental of violent video games to minors. The state "claim[ed] that the Act [was] justified in aid of parental authority: By requiring that the purchase of violent video games [could] be made only by adults, the Act ensure[d] that parents [could] decide what games [were] appropriate." The Brown Court recognized that the state legislature's goal of "addressing a serious social problem," namely, minors' exposure to violent images, was "legitimate," but where First Amendment rights were involved, the Court cautioned that the state's objectives "must be pursued by means that are neither seriously underinclusive nor seriously overinclusive."

"As a means of protecting children from portrayals of violence, the legislation [was] seriously underinclusive, not only because it exclude[d] portrayals other than video games, but also because it permit[ted] a parental … veto." If the material was indeed "dangerous [and] mindaltering," the Court explained, it did not make sense to "leave [it] in the hands of children so long as one parent … says it's OK."  Equally, "as a means of assisting concerned parents," the Court held that the regulation was "seriously overinclusive because it abridge[d] the First Amendment rights of young people whose parents … think violent video games are a harmless pastime."  Put simply, the legislation was not narrowly tailored.

In the end, the Brown Court rejected the argument "that the state has the power to prevent children from hearing or saying anything without their parents' prior consent," for "[s]uch laws do not enforce parental authority over children's speech and religion; they impose governmental authority, subject only to a parental veto." "This is not the narrow tailoring to 'assisting parents' that restriction of First Amendment rights requires." The Court also expressed "doubts that punishing third parties for conveying protected speech to children just in case their parents disapprove of that speech is a proper governmental means of aiding parental authority." "Accepting that position would largely vitiate the rule that 'only in relatively narrow and well-defined circumstances may government bar public dissemination of protected materials to [minors].'"

The State regulation here, like the one in Brown, is not narrowly tailored to address the harms that the State contends are encountered by minors on social media. The State maintains that Act 689's exemptions are meant to precisely target the platforms that pose the greatest danger to minors online, but the data do not support that claim.

To begin with, the connection between these harms and "social media" is ill-defined by the data. It bears mentioning that the State's secondary sources refer to "social media" in a broad sense, though Act 689 regulates only some social media platforms and exempts many others. For example, YouTube is not regulated by Act 689, yet one of the State's exhibits discussing the dangers minors face on "social media" specifically cites YouTube as being "the most popular online activity among children aged 3–17" and notes that "[a]mong all types of online platforms, YouTube was the most widely used by children …."

Likewise, another State exhibit published by the FBI noted that "gaming sites or video chat applications that feel familiar and safe [to minors]" are common places where adult predators engage in financial "sextortion" of minors. However, Act 689 exempts these platforms from compliance. Mr. Allen, the State's expert, criticized the Act for being "very limited in terms of the numbers of organizations that are likely to be caught by it, possibly to the point where you can count them on your fingers…." He then stated that he did not "want to be unkind to the people who drafted [Act 689]," but at least some exempt platforms are ones that adult sexual predators commonly use to communicate with children, including Kik and Kik Messenger, Google Hangouts, and interactive gaming websites and platforms.

The Court asked the State's attorney why Act 689 targets only certain social media companies and not others, and he responded that the General Assembly crafted the Act's definitions and exemptions using the data reported in an article published by the National Center for Missing and Exploited Children ("NCMEC"). This article lists the names of dozens of popular platforms and notes the number of suspected incidents of child sexual exploitation that each self-reported over the past year. The State selected what it considered the most dangerous platforms for children—based on the NCMEC data—and listed those platforms in a table in its brief.

During the hearing, the Court observed that the data in the NCMEC article lacked context; the article listed raw numbers but did not account for the amount of online traffic and number of users present on each platform. The State's attorney readily agreed, noting that "Facebook probably has the most people on it, so it's going to have the most reports." But he still opined that the NCMEC data was a sound way to target the most dangerous social media platforms, so "the highest volume [of reports] is probably where the law would be concentrated."

Frankly, if the State claims Act 689's inclusions and exemptions come from the data in the NCMEC article, it appears the drafters of the Act did not read the article carefully. Act 689 regulates Facebook and Instagram, the platforms with the two highest numbers of reports. But the Act exempts Google, WhatsApp, Omegle, and Snapchat—the sites with the third-, fourth-, fifth-, and sixth-highest numbers of reports. Nextdoor is at the very bottom of NCMEC's list, with only one report of suspected child sexual exploitation all year, yet the State's attorney noted during the hearing that Nextdoor would be subject to regulation under Act 689.

None of the experts and sources cited by the State indicate that risks to minors are greater on platforms that generate more than $100 million annually. Instead, the research suggests that it is the amount of time that a minor spends unsupervised online and the content that he or she encounters there that matters. However, Act 689 does not address time spent on social media; it only deals with account creation. In other words, once a minor receives parental consent to have an account, Act 689 has no bearing on how much time the minor spends online. Using the State's analogy, if a social media platform is like a bar, Act 689 contemplates parents dropping their children off at the bar without ever having to pick them up again. The Act only requires parents to give express permission to create an account on a regulated social media platform once. After that, it does not require parents to utilize content filters or other controls or monitor their children's online experiences—something Mr. Allen believes is the real key to keeping minors safe and mentally well on social media.

The State's brief argues that "requiring a minor to have parental authorization to make a profile on a social media site … means that many minors will be protected from the well-documented mental health harms present on social media because their parents will have to be involved in their profile creation" and are therefore "more likely to be involved in their minor's online experience." But this is just an assumption on the State's part, and there is no evidence of record to show that a parent's involvement in account creation signals an intent to be involved in the child's online experiences thereafter….

Finally, the Court concludes that Act 689 is not narrowly tailored to target content harmful to minors. It simply impedes access to content writ large….

Age-verification requirements are more restrictive than policies enabling or encouraging users (or their parents) to control their own access to information, whether through user-installed devices and filters or affirmative requests to third-party companies. "Filters impose selective restrictions on speech at the receiving end, not universal restrictions at the source." Ashcroft v. ACLU (II) (2004). And "[u]nder a filtering regime, adults … may gain access to speech they have a right to see without having to identify themselves[.]" Similarly, the State could always "act to encourage the use of filters … by parents" to protect minors.

In sum, NetChoice is likely to succeed on the merits of the First Amendment claim it raises on behalf of Arkansas users of member platforms. The State's solution to the very real problems associated with minors' time spent online and access to harmful content on social media is not narrowly tailored. Act 689 is likely to unduly burden adult and minor access to constitutionally protected speech. If the legislature's goal in passing Act 689 was to protect minors from materials or interactions that could harm them online, there is no compelling evidence that the Act will be effective in achieving those goals.

And the court held that Act 689 was likely unconstitutionally vague:

A "social media company" is defined as "an online forum that a company makes available for an account holder" to "[c]reate a public profile, establish an account, or register as a user for the primary purpose of interacting socially with other profiles and accounts," "[u]pload or create posts or content," "[v]iew posts or content of other account holders," and "[i]nteract with other account holders or users, including without limitation establishing mutual connections through request and acceptance." But the statute neither defines "primary purpose"—a term critical to determining which entities fall within Act 689's scope—nor provides any guidelines about how to determine a forum's "primary purpose," leaving companies to choose between risking unpredictable and arbitrary enforcement (backed by civil penalties, attorneys' fees, and potential criminal sanctions) and trying to implement the Act's costly age-verification requirements. Such ambiguity renders a law unconstitutional….

The State argues that Act 689's definitions are clear and that "any person of ordinary intelligence can tell that [Act 689] regulates Meta, Twitter[,] and TikTok." But what about other platforms, like Snapchat? David Boyle, Snapchat's Senior Director of Products, stated in his Declaration that he was not sure whether his company would be regulated by Act 689. He initially suspected that Snapchat would be exempt until he read a news report quoting one of Act 689's co-sponsors who claimed Snapchat was specifically targeted for regulation.

During the evidentiary hearing, the Court asked the State's expert, Mr. Allen, whether he believed Snapchat met Act 689's definition of a regulated "social media company." He responded in the affirmative, explaining that Snapchat's "primary purpose" matched Act 689's definition of a "social media company" (provided it was true that Snapchat also met the Act's profitability requirements). When the Court asked the State's attorney the same question later in the hearing, he gave a contrary answer—which illustrates the ambiguous nature of key terms in Act 689. The State's attorney disagreed with Mr. Allen—his own witness—and said the State's official position was that Snapchat was not subject to regulation because of its "primary purpose."

Other provisions of Act 689 are similarly vague. The Act defines the phrase "social media platform" as an "internet-based service or application … [o]n which a substantial function of the service or application is to connect users in order to allow users to interact socially with each other within the service or application"; but the Act excludes services in which "the predominant or exclusive function is" "[d]irect messaging consisting of messages, photos, or videos" that are "[o]nly visible to the sender and the recipient or recipients" and "[a]re not posted publicly." Again, the statute does not define "substantial function" or "predominant … function," leaving companies to guess whether their online services are covered. Many services allow users to send direct, private messages consisting of texts, photos, or videos, but also offer other features that allow users to create content that anyone can view. Act 689 does not explain how platforms are to determine which function is "predominant," leaving those services to guess whether they are regulated.

Act 689 also fails to define what type of proof will be sufficient to demonstrate that a platform has obtained the "express consent of a parent or legal guardian." If a parent wants to give her child permission to create an account, but the parent and the child have different last names, it is not clear what, if anything, the social media company or third-party servicer must do to prove a parental relationship exists. And if a child is the product of divorced parents who disagree about parental permission, proof of express consent will be that much trickier to establish—especially without guidance from the State.

These ambiguities were highlighted by the State's own expert, who testified that "the biggest challenge … with parental consent is actually establishing the relationship, the parental relationship." Since the State offers no guidance about the sort of proof that will be required to show parental consent, it is likely that once Act 689 goes into effect, the companies will err on the side of caution and require detailed proof of the parental relationship. As a result, parents and guardians who otherwise would have freely given consent to open an account will be dissuaded by the red tape and refuse consent—which will unnecessarily burden minors' access to constitutionally protected speech.

Plaintiff is represented by Erin Murphy, James Xi, Joseph DeMott, and Paul Clement (Clement & Murphy, PLLC) and Katherine Church Campbell and Marshall S. Ney (Friday, Eldredge & Clark, LLP) (not to be confused with Marshal Ney).