Policing 'Thinspiration' Is Harder Than Lawmakers Think
And likely to backfire, too.
Whenever the subject of social media and teenagers comes up, it seems someone wants to talk about eating disorders. Conventional wisdom says Instagram and other highly visual platforms promote negative body image and push young people—mostly young women—to take dieting to extremes. Some politicians even want to hold tech companies legally liable when young users develop eating disorders.
Suggestions like these make me want to bang my head against a wall. They represent a dreadful misread of eating disorders and technology, as well as of the limits of content moderation and the functions that eating disorder communities and content serve.
I'm reminded of all this by an excellent post at Techdirt from TechFreedom legal fellow Santana Boulton. "Eating disorders are older than social media, and advocates who think platforms can moderate [eating disorder] content out of existence understand neither eating disorders nor content moderation," she writes.
Eating Disorders and the Kids Online Safety Act (KOSA)
The big reason we're discussing this issue right now is KOSA, the latest (and arguably most likely to succeed) effort in Congress to childproof the internet. KOSA was first introduced in 2022 and a new version is back this year. It comes from Sen. Richard Blumenthal (D–Conn.), one of the original sponsors and biggest supporters of the measure that became the Allow States and Victims to Fight Online Sex Trafficking Act (FOSTA).
Under KOSA, social media companies and other covered online platforms have a "duty of care" that requires them to "act in the best interests" of minor users. To do this, they must "take reasonable measures…to prevent and mitigate…harms" including "anxiety, depression, eating disorders, substance use disorders, and suicidal behaviors," as well as "addiction-like behaviors" and exposure to bullying, sexual exploitation, deceptive marketing practices, and more.
KOSA would grant the Federal Trade Commission (FTC) the power to enforce this "duty of care." Even if enforced in the most neutral of ways, this could subject tech companies to a lot of harassment unless they obsessively and excessively censor online content.
Of course, giving this power to the FTC all but ensures it will be used to further the political agendas of whatever administration is in power. Under a Biden or otherwise Democratic administration, we could see, for instance, a platform's failure to delete content that doesn't uphold progressive gender orthodoxies deemed a violation (on the theory that such content could promote mental health problems in transgender children). Under a Trump or otherwise GOP administration, we could see platforms censured for allowing too much LGBTQ content (grooming!), information about birth control or abortion, and so on.
KOSA is the epitome of the type of "online safety" bill that danah boyd—a tech and culture writer and researcher who has been exploring these issues since back when social media was still called "social networking"—describes as "pretend[ing] to be focused on helping young people when they're really anti-tech bills that are using children for political agendas in ways that will fundamentally hurt the most vulnerable young people out there."
"Efforts to drive design through moral panics and flawed deterministic logics can easily trigger a host of unintended consequences," writes boyd in a pre-print paper with María P. Angel. "When it comes to kids' safety, these consequences tend to be most acutely felt by those who are most vulnerable."
Which brings us back to eating disorders.
At best, laws like KOSA try to sweep eating disorders (and other mental health issues) under the rug. But they could also make things worse for teens struggling with anorexia or other forms of disordered eating.
Misunderstanding Eating Disorders
Many people believe that anorexia and other eating disorders are about thinness, which implies that they can be triggered by exposure to beauty standards prizing thinness and solved by reducing that exposure. From what I know about eating disorders and about media effects, essentially everything about this is wrong.
Yes, eating disorders often manifest as a desire to be thin. But this is often a symptom of deeper issues—anxiety, depression, obsessive-compulsive disorder, etc. Restricting calories, exercising compulsively, or bingeing and purging become ways to exert control.
This isn't to say that some teens exposed to a lot of images of ultra-thin people won't wind up feeling worse about their bodies. But otherwise healthy, well-adjusted people aren't developing anorexia from looking at Instagram posts. And pretending that they are actually minimizes the seriousness of eating disorders.
Could exposure to pro-anorexia ("pro-ana") content or idealized images of thinness on social media tip the scales for someone already suffering from mental health issues? It seems plausible. But in this scenario, Instagram (or Tumblr, or TikTok, or whatever) is just the most zeitgeisty manifestation of a much older phenomenon.
People said the same things about fashion magazines, Hollywood movies, Photoshopped advertisements, and all sorts of other depictions of thin beauty standards—and yet no one is suggesting we hold them legally liable when young people develop anorexia.
Misunderstanding Eating Disorder Communities
Besides, pro-ana content isn't anything new to Instagram and its contemporaries. Back when Instagram was just a twinkle in Kevin Systrom and Mike Krieger's eyes, you could find all sorts of "pro-ana" communities and eating disorder forums on LiveJournal—and I did. Which is how I know that these types of communities and content can be multifunctional, and sometimes serve as a force for good.
Yes, people in these communities posted "thinspiration" (photos of very thin people offered up for "inspiration") and offered encouragement and tips for extreme calorie restriction.
But people in many of them also offered support on the underlying issues that others struggled with. They encouraged people who wanted to recover, and warned those who didn't against going too far. They told people to seek help for suicidal ideation, and were there for each other when someone said they had no one who understood them.
Even when not overtly pro-recovery, these communities could sometimes have a deterrent effect, since they served as windows into the misery that people with serious eating disorders lived through. If anything, they de-glamorized life with an eating disorder, perhaps deterring relatively casual calorie counters (like my young adult self) from descending into more extreme behavior.
"The empirical literature has documented both the harm and potential benefit that these forums offer participants," noted a paper published in the Journal of Infant, Child, and Adolescent Psychotherapy in 2014.
Shutting down communities like this could actually do more harm than good.
And, as Boulton notes, people with eating disorders aren't one-dimensional and their social media accounts likely aren't either:
The account might also include that they're pro-recovery, post or do not post fatphobia, their favorite K-pop group, whether they go to the gym or not, what kind of eating disorder they have, and what kind of fashion they like.
Behind these accounts are individuals who are complex, imperfect, and hurting….Platforms cannot simply ban discussions about eating disorders without sweeping away plenty of honest conversations about mental health, which might also drive the conversation towards less helpful corners of the internet.
Banning accounts that hover between pro-eating disorder content and other things—be it recovery and mental health or things totally unrelated—could make the poster feel even more isolated and more likely to engage in harmful behaviors.
Misunderstanding the Internet
The idea that you can rid the web of eating disorder communities and content is silly.
Even if lawmakers manage to drive any mention of eating disorders from platforms like Instagram and TikTok, there will still be countless web forums, messaging platforms, and other venues where this sort of content and community can live. Only there, teens won't have algorithms steering them away from extreme content. They won't be exposed to counter-messages and reality checks from people outside these communities. And it's less likely that people who know them in real life will see their activity and somehow intervene.
The idea that it's easy to separate bad eating disorder content from good eating disorder content, or other types of food/beauty/health content, is silly, too.
Inevitably, attempts to more strictly police pro-eating disorder content, or content that could be viewed as triggering, would ensnare posts and accounts from people in eating disorder recovery. Besides, different people view the same content in different lights.
To some, memoir-ish accounts of eating disorders are cautionary tales; to others, they're crucial for feeling less alone and seeing a way out; and to still others, they're simply road maps.
Posts on intermittent fasting could be healthy tips for people with diabetes but bad news for people with anorexia.
"Content has a context," as Boulton writes. "A post that's so obviously eating disorder content on [a particular] account may not obviously [be] eating disorder content posted elsewhere."
The idea that tech companies could easily differentiate between content that "promotes" eating disorders and content that merely discusses them, or discusses other aspects of food and fitness, is ludicrous—even if they had the capacity to use human moderators for this rather than automated content filters. And the necessary reliance on such filters will only erase even more content.
Attempts to do this would almost certainly sweep up not only pro-recovery content but also a lot of harmless and/or unrelated stuff, from perfectly normal nutrition advice and fitness tips to posts from people proud of their (not-extreme) weight loss to pictures of people who just happen to be very thin.
"We could have everyone register their [Body Mass Indexes] and daily caloric intake with their username and password, to be sure that no unhealthy daily outfit pictures slip past the mods," quips Boulton, "but short of that appalling dystopia, we're out of luck."
Another important consideration here is that the content moderation this would take isn't likely to affect all communities equally. Men struggle with eating disorders and bad body image too, of course, but something tells me that male bodybuilding content is going to be a lot less subject to mistaken takedowns than women's pictures and content about their bodies, diets, and fitness routines. In a paper published in the journal Media, Culture & Society, Nicole Danielle Schott and Debra Langan suggest that policing eating disorder content leads to censorship of women's content more broadly.
A Demand-Side Problem
If KOSA or something like it became law, tech companies would have a huge incentive to suppress anything that could remotely be construed as promoting eating disorders. But there's scant evidence that this would actually lead to fewer eating disorders.
As Mike Masnick noted at Techdirt earlier this year, the issue of eating disorder content is "a demand side problem from the kids, not a supply side problem from the sites."
This is, of course, true of so much of the online content that people find problematic. But because lawmakers have every incentive to Do! Something! (and face almost no consequences when it doesn't work or backfires), we see them again and again approach demand-side issues by simply asking tech companies to hide evidence that those issues exist.