Bluesky Blocks Mississippi Users
Age verification laws are already coming for Americans’ access to free speech.
The social media platform Bluesky is blocking users from Mississippi rather than comply with the state's age verification law.
"Unfortunately, Bluesky is unavailable in Mississippi right now, due to a new state law that requires age verification for all users," the company's official account posted last Friday. "While intended for child safety, we think this law poses broader challenges & creates significant barriers that limit free speech & harm smaller platforms like ours."
How To Violate Privacy While Creating a Censorship Regime
Mississippi's age-check law—like so many others being floated and passed in different states—requires social media companies and some other types of web platforms to check all user ages. "A digital service provider may not enter into an agreement with a person to create an account with a digital service unless the person has registered the person's age with the digital service provider," states House Bill 1126, passed last year.
The law applies to digital services that allow users "to socially interact with other users," create a profile, and "create or post content"—a category that covers not just such platforms as Bluesky, X, Instagram, TikTok, and Facebook but also message and chat forums, YouTube and similar video sites, and more.
Providers can verify ages by checking IDs, using biometric scans, or making other "commercially reasonable efforts," but the result is the same: a major invasion of all users' privacy and digital security. That's strike one.
Strike two: the exclusion of even older minors from participating in these forums without express parental consent.
It gets worse. Digital service providers must also "develop and implement a strategy to prevent the known minor's exposure to harmful material," which the law defines as anything "that promotes, glorifies or facilitates" suicide, self-harm, eating disorders, illegal drugs, stalking, bullying, harassment, grooming, trafficking, child pornography, sexual exploitation, violence, substance abuse, or "any other illegal activity."
Laws like this pass because lots of people casually think "Well, who would want minors to see such horrid stuff?" and move on. But if you stop to ponder how this requirement would work in practice, a lot of problems emerge.
Many of those categories are legal, First Amendment–protected speech, and preventing minors from seeing it often means preventing anyone from seeing it. What's more, defining and identifying which speech falls into this "harmful material" category would be hard enough for the most meticulous human moderators to do properly, let alone the kinds of algorithms necessary to do such content moderation at scale.
That means huge amounts of not just legal but actually beneficial—or at least morally and socially neutral—content will get caught up in the sweep.
We've seen this already with the implementation of the U.K.'s Online Safety Act, which serves some of the same functions as this Mississippi measure. Attempts to stop minors from seeing "violent" or "self-harm" material quickly turned into war news and psychological support groups getting restricted for minors—or anyone unwilling to show an ID.
That's strike three: Laws like these quickly become enormous censorship regimes.
Mississippi Law 'Likely Unconstitutional,' Can Still Be Enforced
Enforcement of Mississippi HB 1126 was put on hold by Judge Halil Suleyman Ozerden last year, and in June Ozerden enjoined enforcement of the law again (after his first preliminary injunction went to the appeals court and then back to him). But the state appealed, and in July the U.S. Court of Appeals for the Fifth Circuit said Mississippi could start enforcing the law as the legal challenge to it played out.
NetChoice—a tech trade group challenging the law—appealed to the U.S. Supreme Court, seeking an emergency injunction on enforcement. But the Supreme Court declined to intervene at this juncture.
The Court did not say why it would not halt enforcement as the challenge plays out, though Justice Brett Kavanaugh did suggest, in a short concurring opinion, that NetChoice "is likely to succeed on the merits—namely, that enforcement of the Mississippi law would likely violate its members' First Amendment rights under this Court's precedents."
"The Mississippi law is likely unconstitutional," wrote Kavanaugh. But "because NetChoice has not sufficiently demonstrated that the balance of harms and equities favors it at this time, I concur in the Court's denial of the application for interim relief."
With the ultimate fate of the law still pending, "we cannot justify building the expensive required infrastructure," Bluesky posted on Friday. "For now, we have made the difficult decision to block access in Mississippi."
A Two-Tiered Internet?
Bluesky's decision is reminiscent of how some porn companies—including Aylo, the company behind such platforms as Pornhub—have responded to state laws requiring age verification for online providers of adult content.
If this trend keeps up, the U.S. will soon see a two-tiered internet, where residents of some states have access to a relatively free and open online world and residents of other states have major restrictions on what they can see.
Of course, with so many states passing laws requiring various kinds of age verification and harm mitigation—and the federal government constantly threatening to do so—it could soon behoove tech companies to cave and simply verify everyone's ages, either blocking minor users entirely or treating all adults as if they were minors in terms of what they can see.
At least 20 states have laws requiring age verification for platforms where sexual content can be accessed—and while some of these are aimed specifically at porn websites, others apply more broadly (meaning they could require social media platforms or gaming marketplaces to verify ages merely because some small portion of their content might be adult-oriented). Meanwhile, more states are considering and passing laws like Mississippi's, which require age verification by social media platforms and other sorts of "digital service providers."
It's been absolutely predictable that the crusade to stop minors from seeing pornography would quickly spread into a censorship agenda that could span the whole internet and destroy online anonymity broadly. The cynics among us might even suggest that for some, this has been the endgame all along.
For now, measures like Mississippi's are going to come down hardest on smaller platforms without the resources to either implement compliance or fight off legal challenges.
"We think this law creates challenges that go beyond its child safety goals, and creates significant barriers that limit free speech and disproportionately harm smaller platforms and emerging technologies," states the Bluesky team in an August 22 blog post. "Age verification systems require substantial infrastructure and developer time investments, complex privacy protections, and ongoing compliance monitoring — costs that can easily overwhelm smaller providers. This dynamic entrenches existing big tech platforms while stifling the innovation and competition that benefits users."
More Sex & Tech News
Trump extends TikTok reprieve, again. On Friday, President Donald Trump suggested he'll keep extending the deadline for TikTok parent company ByteDance to sell off TikTok's U.S. operations or face a ban of the app. So far, Trump has extended the deadline three times, with the latest extension scheduled to expire on September 17. "We have American buyers," Trump said. "At the right time, when we're set, I'll do it. In the meantime, until the complexity of things work out, we just extend a little bit longer."
The University of Nevada, Las Vegas (UNLV) will be home to the Norma Jean Almodovar Papers. Almodovar—author of the 1993 book Cop to Call Girl: Why I Left the LAPD to Make an Honest Living as a Beverly Hills Prostitute—has for decades been a prolific advocate for sex worker rights and a fastidious documentarian of the toll that law enforcement's crusade against sex work has been taking. "I was always hopeful that the work of me and my organization would someday lead to changes in laws to criminalize police corruption and decriminalization of sex work," Almodovar told the university. Her papers "are one of the founding collections in UNLV's new Sexual Entertainment and Economies collecting initiative," the school reports.
Uber for robot cars to hit the Big Apple. Waymo now has permission to start testing its services in New York City.
Henry VIII's wives could have benefited from modern fertility tech. Ruxandra Teslo writes about the reproductive problems of Tudor England, arguing that reproductive health has been "one of the most important biological determinants of history after infectious disease."
'What, exactly, was Twitter?' The Paroxysms newsletter suggests that "what Twitter was, and what social media continues to be, is a redescription of the world in terms of networks and relationships, and a sense imparted on its users that we lack validity and authenticity unless we are plugged securely into the right channels."