Texas Law Mandating Age Verification for Sexually Themed Sites Violates First Amendment, a federal judge held today.
Some excerpts from today's long decision in Free Speech Coalition, Inc. v. Colmenero by Judge David Alan Ezra (W.D. Tex.):
This case concerns a law passed by the State of Texas that restricts access to pornographic websites by requiring digital age verification methods … about the alleged harms caused by pornography….
H.B. 1181 is set to take effect on September 1, 2023. H.B. 1181 contains two requirements, both of which are challenged in this litigation. First, the law requires websites to use "reasonable age verification methods … to verify that an individual attempting to access the material is 18 years of age or older." Second, the law requires adult content websites to post a warning about the purported harmful effects of pornography and a national helpline for people with mental health disorders.
The law defines "sexual material harmful to minors" as including any material that "(A) the average person applying contemporary community standards would find, taking the material as a whole, is designed to appeal to or pander to the prurient interest" to minors, (B) is patently offensive to minors, and (C) "taken as a whole, lacks serious literary, artistic, political, or scientific value for minors."
The law regulates a "commercial entity that knowingly and intentionally publishes or distributes material on an Internet website, including a social media platform, more than one-third of which is sexual material harmful to minors." H.B. 1181 requires these companies to "comply with a commercial age verification system that verifies age using: (A) government-issued identification; or (B) a commercially reasonable method that relies on public or private transactional data to verify the age of an individual." "Transactional data" refers to a "sequence of information that documents an exchange … used for the purpose of satisfying a request or event. The term includes records from mortgage, education, and employment entities." H.B. 1181 does not allow the companies or third-party verifiers to "retain any identifying information of the individual."
The court concluded that the age verification requirement was unconstitutional for various reasons, including that Ashcroft v. ACLU (II) (2004) and follow-on circuit court decisions struck down a similar federal law (COPA, the Child Online Protection Act). An excerpt that I thought was particularly closely connected to those precedents (though the court made other arguments as well):
To endure strict scrutiny, a statute must employ the least restrictive means of protecting minors. The government bears the burden to show that less restrictive means would not be as effective…. [T]he Court … finds that the age verification enforcement mechanism is overly restrictive….
As the district court found [in the COPA case], and the Supreme Court affirmed, "Blocking and filtering software is an alternative that is less restrictive than COPA, and, in addition, likely more effective as a means of restricting children's access to materials harmful to them." The Court elaborated that filtering software is less restrictive because "adults without children may gain access to speech they have a right to see without having to identify themselves or provide their credit card information. Even adults with children may obtain access to the same speech on the same terms simply by turning off the filter on their home computers."
Defendant argues that Ashcroft v. ACLU's analysis no longer applies because it was based on the evidentiary record made by the district court in 1999, which is not applicable to the instant case and of limited relevance to modern internet usage. As Defendant argues, H.B. 1181 uses more secure information, requires companies to delete their data, and is designed for convenience and privacy protection. The Court does not dispute that online interactions have changed since the Supreme Court's decisions in 1997 and 2004. But as determined by the facts on the record and presented at the hearing, age verification laws remain overly restrictive. Despite changes to the internet in the last two decades, the Court comes to the same conclusion regarding the efficacy and intrusiveness of age verification as the ACLU courts did in the early 2000s.
First, the restriction is constitutionally problematic because it deters adults' access to legal sexually explicit material, far beyond the interest of protecting minors. The Third Circuit's holding regarding COPA applies equally to H.B. 1181:
"[The law] will likely deter many adults from accessing restricted content because they are unwilling to provide identification information in order to gain access to content, especially where the information they wish to access is sensitive or controversial. People may fear to transmit their personal information, and may also fear that their personal, identifying information will be collected and stored in the records of various Web sites or providers of adult identification numbers."
Indeed, as the Third Circuit noted, the "Supreme Court has disapproved of content-based restrictions that require recipients to identify themselves affirmatively before being granted access to disfavored speech." The same is true here—adults must affirmatively identify themselves before accessing controversial material, chilling them from accessing that speech. Whatever changes have been made to the internet since 2004, these privacy concerns have not gone away, and indeed have amplified.
Privacy is an especially important concern under H.B. 1181, because the government is not required to delete data regarding access, and one of the two permissible mechanisms of age-verification is through government ID. People will be particularly concerned about accessing controversial speech when the state government can log and track that access. By verifying information through government identification, the law will allow the government to peer into the most intimate and personal aspects of people's lives. It runs the risk that the state can monitor when an adult views sexually explicit materials and what kind of websites they visit. In effect, the law risks forcing individuals to divulge specific details of their sexuality to the state government to gain access to certain speech. Such restrictions have a substantial chilling effect.
The deterrence is particularly acute because access to sexual material can reveal intimate desires and preferences. No more than two decades ago, Texas sought to criminalize two men seeking to have sex in the privacy of a bedroom. To this date, Texas has not repealed its law criminalizing sodomy. Given Texas's ongoing criminalization of homosexual intercourse, it is apparent that people who wish to view homosexual material will be profoundly chilled from doing so if they must first affirmatively identify themselves to the state.
Defendant contests this, arguing that the chilling effect will be limited by age verification's ease and deletion of information. This argument, however, assumes that consumers will (1) know that their data is required to be deleted and (2) trust that companies will actually delete it. Both premises are dubious, and so the speech will be chilled whether or not the deletion occurs. In short, it is the deterrence that creates the injury, not the actual retention. Moreover, while the commercial entities (e.g., Plaintiffs) are required to delete the data, that is not true for the data in transmission. In short, any intermediary between the commercial websites and the third-party verifiers will not be required to delete the identifying data.
Even beyond the capacity for state monitoring, the First Amendment injury is exacerbated by the risk of inadvertent disclosures, leaks, or hacks. Indeed, the State of Louisiana passed a highly similar bill to H.B. 1181 shortly before a vendor for its Office of Motor Vehicles was breached by a cyberattack. In a related challenge to a similar law, Louisiana argues that age-verification users were not identified, but this misses the point. The First Amendment injury does not just occur if the Texas or Louisiana DMV (or a third-party site) is breached. Rather, the injury occurs because individuals know the information is at risk. Private information, including online sexual activity, can be particularly valuable because users may be more willing to pay to keep that information private, compared to other identifying information. Kim Zetter, Hackers Finally Post Stolen Ashley Madison Data, Wired, Aug. 18, 2015, https://www.wired.com/2015/08/happened-hackers-posted-stolen-ashley-madison-data (discussing Ashley Madison data breach and hackers' threat to "release all customer records, including profiles with all the customers' secret sexual fantasies and matching credit card transactions, real names and addresses.").
It is the threat of a leak that causes the First Amendment injury, regardless of whether a leak ends up occurring….
Plaintiffs offer several alternatives that would target minor[s'] access to pornography with fewer burdens on adults' access to protected sexually explicit materials. First, the government could use internet service providers, or ISPs, to block adult content until the adults opt out of the block. This prevents the repeated submission of identifying information to a third party and, operating at a higher level, would not need to reveal the specific websites visited. If implemented at the device level, sexual information would be allowed for adults' devices but not for children when connected to home internet.
In addition, Plaintiffs propose parental controls on children's devices, many of which already exist and can be readily set up. This "content filtering" is effectively the modern version of "blocking and filtering software" that the Supreme Court proposed as a viable alternative in Ashcroft v. ACLU. Blocking and filtering software is less restrictive because adults may access information without having to identify themselves. And the Court agreed with the finding that "filters are more effective than age-verification requirements." …
Content-filtering also helps address the under-inclusivity issue. At the hearing, Defendant argued that if H.B. 1181 covered more websites, such as search engines, then Plaintiffs would instead argue that it is overbroad. The point is well-taken, but it misses a crucial aspect: the law would be overbroad because age verification is a broad method of enforcement. Under H.B. 1181, age verification works by requiring a user's age at a website's landing page. This forces Texas (and other states) to choose some broad threshold (e.g., one-third) for what percentage of a website must be sexual before requiring age verification. But this is not true for content filtering, which applies to the material on a webpage, not just the site as a whole. So users can browse Reddit, but will be screened from the sexual material within the site by the content filter. Similarly, a user can search Google, but not encounter pornographic images. This is the definition of tailoring: content filtering, as opposed to age verification, can more precisely screen out sexual content for minors without limiting access to other speech.
Content filtering is especially tailored because parents can choose the level of access. In other words, parents with an 8-year-old can filter out content inappropriate for an 8-year-old, while parents with a 17-year-old can filter out content inappropriate for a 17-year-old. Using age verification, a 17-year-old will be denied access to material simply because it might be inappropriate for a young minor. Content filtering, by contrast, allows for much more precise restrictions within age groups.
In general, content filtering also comports with the notion that parents, not the government, should make key decisions on how to raise their children. See United States v. Playboy Ent. Grp., Inc. (2000) ("A court should not assume a plausible, less restrictive alternative would be ineffective; and a court should not presume parents, given full information, will fail to act."). Likewise, even as it upheld obscenity laws [including bans on distributing harmful-to-minors material to minors in face-to-face transactions -EV], Ginsberg v. N.Y. (1968) affirmed that "constitutional interpretation has consistently recognized that the parents' claim to authority in their own household to direct the rearing of their children is basic in the structure of our society."
Content filtering allows parents to determine the level of access that their children should have, and it encourages those parents to have discussions with their children regarding safe online browsing. As the Principi article notes, it is this combination that is most effective for preventing unwanted exposure to online pornography. Age verification, by contrast, places little to no control in the hands of parents and caretakers. Thus, content filtering keeps the "parents' claim to authority in their own household to direct the rearing of their children." …
Again, changes to the internet since 2003 have made age verification more—not less—cumbersome than alternatives. Parental controls are commonplace on devices. They require little effort to set up and are far less restrictive because they do not target adults' devices.
Moreover, content filtering is likely to be more effective because it will place a more comprehensive ban on pornography compared to geography-based age restrictions, which can be circumvented through a virtual private network ("VPN") or a browser using Tor. Parental controls, by contrast, typically prevent VPNs (or Tor-capable browsers) from being installed on devices in the first place….
In addition, content filtering blocks out pornography from foreign websites, while age verification is only effective as far as the state's jurisdiction can reach. This is particularly troublesome for Texas because, based on the parties here alone, foreign websites constitute some of the largest online pornographic websites globally. If they are not subject to personal jurisdiction in the state, they will have no legal obligation to comply with H.B. 1181. Age verification is thus limited to Texas's jurisdictional reach. Content filtering, by contrast, works at the device level and does not depend on any material's country of origin.
Defendant disputes the effects of content filtering and argues that it is only as effective as the caretakers' ability to implement it. But even as Defendant's technical expert noted at the hearing, content filtering is designed to be easy for parents and caretakers to use in a family. The technical knowledge required to implement content filtering is quite basic, and usually requires only a few steps. And the legislature made no findings regarding difficulty of use when it passed the law.
At the hearing, Defendant's expert repeatedly emphasized that parents often fail to implement parental controls on minors' devices. But Defendant has not pointed to any measures Texas has taken to educate parents about content filtering. And more problematically, the argument disregards the steps Texas could take to ensure content filtering's use, including incentives for its use or civil penalties for parents or caretakers who refuse to implement the tool. Indeed, draft bills of H.B. 1181 included such a measure, but it was abandoned without discussion. In Ashcroft v. ACLU, the Supreme Court gave this precise argument "little weight," noting that the government has ample means of encouraging content filtering's use. In short, Texas cannot show that content filtering would be ineffective when it has detailed no efforts to promote its use.
There's also an interesting discussion of the disclaimer requirements (which were struck down as unconstitutional speech compulsions) and the First Amendment rights of foreign sites distributing material in the U.S., but I hope to include those in separate posts.
Plaintiffs are represented by Scott L. Cole, Michael T. Zeller, Derek L. Shaffer, Thomas Nolan, and Arian Koochesfahani (Quinn Emanuel Urquhart & Sullivan, LLP) and Jeffrey Keith Sandman (Webb Daniel Friedlander LLP).