Reason.com - Free Minds and Free Markets

Social Media

Landmark Social Media Addiction Trial Starts Today in California

In the first social media addiction case to reach a jury, K.G.M. claims TikTok, YouTube, and other platforms are responsible for her depression, anxiety, and poor self-esteem.

Elizabeth Nolan Brown | 1.26.2026 11:25 AM


A woman with her head in her hands, surrounded by social media apps. (Illustration: Midjourney)

A potential watershed moment for the tech industry starts this week, as the first of a series of major "social media addiction" trials gets underway.

The case involves a 19-year-old California woman, K.G.M., who claims she was addicted to social media as a minor. This habit, she asserts, caused low self-esteem, body dysmorphia, depression, anxiety, and suicidal thoughts.

K.G.M. sued Meta, ByteDance, and Google, arguing they had designed their products—including Facebook, Instagram, TikTok, and YouTube—to keep kids glued to them and that they were negligent about the harms this could cause. (Snapchat was originally named as a defendant too, but it settled with K.G.M. last week on undisclosed terms.)

You are reading Sex & Tech, from Elizabeth Nolan Brown. Get more of Elizabeth's sex, tech, bodily autonomy, law, and online culture coverage.


A First-of-Its-Kind Trial With Broad Implications

Thousands of plaintiffs have brought personal injury suits against Meta, Google, ByteDance, and Snap, alleging addiction and other mental health harms. Many of these suits were consolidated into one coordinated judicial proceeding, from which the judge selected three cases (involving pseudonymous plaintiffs K.G.M., R.K.C., and Moore) to go to trial. "The judgments in those first three cases will, at least in theory, form the basis for a settlement covering the remaining plaintiffs," reports Courthouse News.

Jury selection in K.G.M.'s case starts in California Superior Court in Los Angeles County on Tuesday.

"The bellwether trial marks the first time in the United States that a case alleging social media addiction will reach a jury," notes Clay Calvert, a senior fellow at the American Enterprise Institute.

The case has huge implications for free speech online and for tech companies more broadly.

Design or Content?

K.G.M. is suing on negligence and negligent failure-to-warn grounds. If she's successful, it could portend a tsunami of lawsuits against tech platforms from people claiming that they and/or their children spend too much time on social media and it's Big Tech's fault.

The case could also provide a playbook for plaintiffs to get around Section 230—the federal law that limits online entities' liability for third-party speech—and the First Amendment. K.G.M. argues that it's not the content on YouTube, TikTok, etc., that is addictive; it's their design features. In this way, her complaint centers the harm on actions taken by tech giants rather than by third parties.

This is far from the first time this approach has been tried. And judges have generally—though not always—rejected the argument.

In fact, the judge in this case—Carolyn Kuhl—has already thrown out some of K.G.M.'s claims on Section 230 grounds. These included claims against TikTok based on "challenge" videos, in which users posted themselves engaging in risky activity.

But "there is evidence in the record that K.G.M. was harmed by design features," Kuhl wrote in a November 2025 ruling. "The cause of K.G.M.'s harms is a disputed factual question that must be resolved by the jury."

The Heart of These Claims

I agree with YouTube attorney Brian Willen, who said at an October hearing: "Exposure to third-party content is at the heart of these claims."

The idea that this isn't about content falls apart pretty quickly upon examination—especially given some of K.G.M.'s statements. For instance: "I have gotten a lot of content promoting…like body checking, posts [of] what I eat in a day—just a cucumber—making people feel bad if they don't eat like that," she said in one video played for the court. Statements like that suggest it's the material itself—material protected by the First Amendment—that allegedly contributed to K.G.M.'s problems.

No matter what algorithms TikTok uses to generate people's feeds, whether Facebook employs endless scroll, or whether YouTube autoplays a next video, people would not be coming back to these platforms unless the content they featured—content provided by third parties—was compelling.

And this content could not contribute to negative self-image or other negative feelings based on design features alone. Even the filters that K.G.M. says create unrealistic expectations are simply a neutral photo-editing tool—one of many—that can provoke negative feelings only because of the way that users decide to employ it in creating their own content.

You cannot divorce the design of these platforms from the content that they serve.

Making Compelling Products Is Not Illegal

Certainly, these social platforms are designed to help draw eyeballs to third-party content. But if trying to attract people to your product is illegal, then the majority of U.S. companies are guilty.

Take streaming platforms. Like social media platforms, they also offer custom recommendations and endless scroll. Like YouTube, they often auto-play more content.

Or take major clothing retailers. They often send out notifications—in the form of text messages and emails—with product recommendations tailored to users' tastes, basing these recommendations on a user's past views and purchases. They use promos and other enticements to draw visitors to their websites, where they may showcase products via infinite scroll.

You seldom hear people blaming stores and streaming services for "addiction." We expect adults to exert willpower in these scenarios. And we expect parents to parent here, preventing their kids from binge-watching Netflix or compulsively shopping. The same goes for gaming platforms, TV networks, and so much more.

The fact that many people hold social media platforms to a special standard reflects the global moral panic around them (and ulterior motives, such as politicians' desire to control online speech) rather than especially nefarious intent or negligent actions from tech giants.

Just the Beginning

"Since 2022, school districts, dozens of state attorneys general and other families have filed a deluge of lawsuits against big tech platforms—many of which have been consolidated into a separate case before U.S. District Court for the Northern District of California," The Washington Post reports.

Trials for the school-district cases will start this summer in federal court, where they're slated to be heard by U.S. District Judge Yvonne Gonzalez Rogers. She is scheduled to hear summary judgment motions in those cases today.

Rogers is "also expected to hear cases brought by dozens of state attorneys general against Meta arguing the company violated consumer protection laws because they were deceptive about the safety of their products," per the Post.

Key Issues: Negligence, Causation

Some of the key issues in K.G.M.'s case involve "parceling out what (or how much, if any) harm may have been caused by which platform [and] whether warnings truly would have prevented the causation of alleged injuries," notes Calvert.

A jury will also have to decide "if her use of the apps was a substantial factor in her depression, compared with other causes such as the third-party content she viewed on the apps or aspects of her life offline," reports Reuters.

That last bit seems like a crucial component in all of this. If all that K.G.M. or the other plaintiffs had to prove was that tech platforms strove to attract users—including minors—and to increase their engagement, this would probably be a slam dunk for them. But not only must they show that these companies were negligent, and not only must they separate design effects from content effects, they must also prove that whatever ill effects came to the plaintiffs stemmed more from social media than from other factors, such as biology, school, or other stressors.

Decades of awareness-raising around depression have told us that it stems from a complicated interplay of environmental factors, biological factors, and more. So much of the panic around social media and mental health seems to ignore these lessons in favor of a simplistic cause-and-effect narrative in which evil corporations caused preventable tragedies.

"Research doesn't show that all digital media use is bad or addictive," Tamar Mendelson, a professor at Johns Hopkins Bloomberg School of Public Health, told the Post. One study "actually shows that the kids who seem to have the most mental health problems were actually ones who used almost no digital media, as well as the ones who used it excessively."

Surely, social media can be a crutch for people who are already struggling. It also seems likely that social media—along with artificial intelligence, porn, video games, and many other forms of media and technology—could exacerbate existing problems in some subset of the population. But the vast majority of people who use these products or consume this media, including the majority of teens, don't develop depression, starve themselves, or become suicidal. And a particular product or platform being bad news for some subset of people with pre-existing issues is worlds apart from the notion that these entities are universally harmful or that their creators were negligent for designing them this way.


TikTok Deal Reached At Last

Almost two years after the Senate passed a law banning TikTok if parent company ByteDance didn't divest of it, and a little more than a year after President Donald Trump first postponed shutting down TikTok to comply with that law, ByteDance has announced that it will sell the U.S. version of the app to U.S. investors.

"TikTok said on Thursday that its Chinese owner, ByteDance, had struck a deal with a group of non-Chinese investors to create a new U.S. TikTok, concluding a six-year legal saga," The New York Times reports:

Investors including the software giant Oracle; MGX, an Emirati investment firm; and Silver Lake, another investment firm, will own more than 80 percent of the new venture. That list also includes the personal investment entity for Michael Dell, the tech billionaire behind Dell Technologies, and other firms, TikTok said. Adam Presser, TikTok's former head of operations, will be the chief executive for the U.S. TikTok….

Under the new arrangement, Oracle, MGX and Silver Lake will each own 15 percent of TikTok's U.S. operations. ByteDance will own just under 20 percent.

While the deal avoids U.S. users seeing TikTok disappear, many are worried about the implications of all of this. We've got Congress demanding that a popular platform for online speech be sold by its foreign owners or face a ban, under the auspices of vague hand-waving about national security.

We've got the president postponing that ban for a year under unclear legal authority.

And we've ultimately got a deal that will result in several Trump allies taking partial ownership of the app.

"Several of the new investors have ties to Mr. Trump, raising concerns for some TikTok users that the app could start showing more content aligned with the president's views or the positions of the U.S. government," notes the Times.

"This deal…hands control over speech on TikTok to a new consortium of investors with their own motivations for shaping online discourse," commented Kate Ruane of the Center for Democracy and Technology on Bluesky.

Americans may breathe easier knowing TikTok isn't going dark, but questions remain. Cato's @jrhuddles breaks down what this deal means for users, data, and First Amendment rights.

— Cato Institute (@CatoInstitute) January 23, 2026


More Sex & Tech News 

Why liberals lost tech: It's not because the Biden administration swung too far left and so Silicon Valley had no choice but to swing right, suggests Jordan McGillis at The Argument. "For some right-leaning technologists, populism isn't a distasteful compromise: it's the whole point."

OneTaste sentencing scheduled for March: Sentencing for orgasmic meditation advocates Nicole Daedone and Rachel Cherwitz is scheduled for March 4, 2026. They face 15- to 20-year prison sentences each.

Is 2026 the year social media dies? Kate Lindsay of the newsletter Embedded thinks so. "Everyone wants off their phones," she writes, and while "this isn't a particularly new desire…something about this year feels different." Perhaps that's less about people suddenly wanting phone discipline and more about the fact that "we just left a year when social media got fundamentally less interesting. The only reason cutting down on screen time was ever hard was because we wanted to look at our phones. But over the past few months, as I methodically opened and closed my stable of apps, I slowly realized just how long it's been since I found what I was looking for."

New AI bills proliferate: "As lawmakers kick off the 2026 legislative session, a new and consequential phase in the conversation about free speech and artificial intelligence is already taking shape in statehouses across the country," writes John Coleman of the Foundation for Individual Rights and Expression. "Yet another crop of AI bills is set to dictate how people use machines to speak and communicate, raising fundamental constitutional questions about freedom of expression in this country."

Taylor Lorenz compares concern over smartphones to Satanic Panic, noting that "the pervasive false narrative that phones, apps, and screen time are responsible for rising anxiety, depression, and harm among teenagers" is being used "to justify mass surveillance laws, deplatforming marginalized people, and implementing policies that actually harm kids and reward big tech."

"She's Probably Not 'Delaying Marriage'": Cartoons Hate Her challenges the narrative that marriage rates are down and/or people are marrying at older ages because women are deliberately choosing to put off marriage until they're much older.

Microplastics pushback: "The microplastics panic is not the result of disputed science entering the public lexicon; it's the result of shoddy science journalism and a nascent anti-modernity vibe that is plagued by confirmation bias," writes Jerusalem Demsas.

Elizabeth Nolan Brown is a senior editor at Reason.

Social Media | Addiction | First Amendment | Children | Psychology/Psychiatry | Teenagers | Internet | Section 230 | Lawsuits | Free Speech | Facebook | TikTok | Google | California