Is It TikTok's Fault When Children Die Making Dangerous Videos?
A lawsuit alleges that the social media giant "tries to conceal the dangerous and addictive nature of its product, lulling users and parents into a false sense of security."

Earlier this month, the respective parents of two girls filed suit against the social media platform TikTok and its parent company, ByteDance. According to the suit, the girls died while imitating behavior from videos that TikTok knew were dangerous and that it should have taken down. The suit asks for a jury trial and unspecified monetary damages.
One of the largest social media outlets on the planet, TikTok features short videos created and submitted by users. As with many platforms, it uses an algorithm to determine what users see. Based on individual users' demographics and the videos they engage with, the site curates content to display on their "For You Page" (FYP).
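As a rough illustration of how engagement-driven curation works in general, consider the toy ranking sketch below. TikTok's actual system is proprietary; every field, weight, and function name here is an assumption made up for the example.

```python
# Toy sketch of engagement-based feed ranking, for illustration only.
# This is not TikTok's code; all fields and weights are invented.
from dataclasses import dataclass

@dataclass
class Video:
    video_id: str
    hashtag: str
    likes: int
    shares: int
    completion_rate: float  # fraction of the clip an average viewer finishes

def score(video: Video, tag_affinity: dict[str, float]) -> float:
    """Weight raw engagement by how often this user has previously
    engaged with the video's hashtag."""
    engagement = video.likes + 2 * video.shares + 100 * video.completion_rate
    return engagement * tag_affinity.get(video.hashtag, 0.1)

def build_fyp(candidates: list[Video],
              tag_affinity: dict[str, float],
              n: int = 10) -> list[Video]:
    """Return the n highest-scoring candidates as the user's feed."""
    return sorted(candidates, key=lambda v: score(v, tag_affinity), reverse=True)[:n]
```

The design point the toy captures is the feedback loop: the feed weights content by what a user has already engaged with, so watching one video from a trend tends to surface more of the same.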
A signature feature of the platform is "challenges," in which users film themselves doing certain tasks or activities and then encourage others to do the same. Typically, participants tag their videos with a hashtag unique to the challenge, making them easy to catalog.
In this case, the parents allege that their children, an 8-year-old girl and a 9-year-old girl, each died while taking part in the "blackout challenge," in which participants film themselves holding their breath or asphyxiating themselves until they pass out. In fact, in just over 18 months, at least seven children have died after apparently attempting the challenge.
The story is truly tragic. But it's not clear that TikTok is uniquely responsible, nor that it should be held legally liable.
The parents are being represented in part by attorneys from the Social Media Victims Law Center (SMVLC), which bills itself as "a legal resource for parents of children harmed by social media addiction and abuse." The lawsuit accuses TikTok of inadequate parental controls and of doing too little to prevent the proliferation of dangerous content. It alleges, "TikTok actively tries to conceal the dangerous and addictive nature of its product, lulling users and parents into a false sense of security."
The parents refer to the platform's algorithm as "dangerously defective," "direct[ing users] to harmful content" in a "manipulative and coercive manner." Specifically, they say the algorithm "directed exceedingly and unacceptably dangerous challenges and videos" to each child's FYP.
Of course, children imitating dangerous behavior did not originate with TikTok. In the early 2000s, after the premiere of MTV's Jackass, multiple teenagers died reenacting stunts from the show. Even the activity at issue is not new: In 2008, the Centers for Disease Control and Prevention (CDC) warned that at least 82 children had died in a little over a decade from what it called "the choking game."
For its part, TikTok claims that the blackout challenge "long predates our platform and has never been a TikTok trend." Searching for the "blackout challenge" hashtag is now effectively banned; the query instead redirects users to a page about the potential risks of online challenges. The platform has also removed challenge videos it deemed objectionable in the past, including some depicting activities that weren't dangerous.
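To make that redirect mechanism concrete, here is a minimal sketch of how a hashtag blocklist with a safety redirect could work. Everything in it (the hashtag entries, the routes, the function name) is invented for illustration; it is not TikTok's code.

```python
# Minimal sketch of a hashtag search blocklist with a safety redirect.
# All names and routes are hypothetical, invented for this example.
BLOCKED_HASHTAGS = {"blackoutchallenge"}   # assumed entry
SAFETY_PAGE = "/safety/online-challenges"  # hypothetical route

def route_hashtag_search(query: str) -> str:
    """Send blocked hashtag searches to a safety page instead of results."""
    tag = query.lstrip("#").replace(" ", "").lower()
    if tag in BLOCKED_HASHTAGS:
        return SAFETY_PAGE
    return f"/search?tag={tag}"
```

The catch, as the study quoted below notes, is that posters evade exact-match blocklists with misspellings and variant tags, so a static set like this is easy to bypass.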
It could be that TikTok did, in fact, know about dangerous content and intentionally allow it to spread. Or it could be that, as TechDirt's Mike Masnick says, "Content moderation at scale is impossible to do well." TikTok has over one billion monthly active users, and more than half have uploaded their own videos; it is not feasible for the company to screen all of that content. A study released last month on "the lifespans of TikTok challenges" concluded, "TikTok removes challenges reported as dangerous and has increased safety controls. However, considering the huge number of users and challenges created every day on this social platform, as well as the usage of some tricks exploited by the authors of dangerous challenges to bypass controls, the risk that dangerous challenges are accessible is real."
Even if TikTok were directing dangerous content to people's FYPs, it's not clear that the platform could be found legally liable at all. Section 230 of the Communications Decency Act protects online services from legal liability for content posted by their users. While TikTok may host the videos, the law states that it cannot "be treated as the publisher or speaker" of any content provided by a user.
The lawsuit tries to circumvent this restriction by claiming that, by offering images, memes, and licensed music that users can incorporate into their videos, "TikTok becomes a co-publisher of such content." This rests on a common misreading of Section 230: In fact, the law draws no distinction between a "publisher" and a "platform." And a platform does not become a content creator simply because it supplies users with the tools to make videos.
Every child injured or killed imitating a viral video is a tragic and senseless loss. Their parents are certainly suffering unimaginable grief. But it's not clear that TikTok could, or should, be held liable.
"Is It TikTok's Fault When Children Die Making Dangerous Videos?"
No, Trump's.
I make a lot of money on the side putting plastic bags over suspects' heads to get them to confess to crimes after fainting from oxygen deprivation a few times. My clients include caudillos from leading MAGA-allied juntas all over Latin America. You too can collect a few grocery bags, get grants, and go into business as a government contractor.
>>>>>>>>>> #getconfessions
(c) Protection for "Good Samaritan" blocking and screening of offensive material
(1) Treatment of publisher or speaker
No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.
I think what's still being tested in the courts is whether the "platform" can be held liable based on what it does with the material published by another information content provider, which in this case we presume to be the users of the interactive service.
So for instance, if the CDC posts a video of Rochelle Walensky saying that the vaccine stops transmission, and YouTube then promotes that video while de-ranking or outright banning anyone who says "that's not true," is YouTube potentially liable for the damage that medical misinformation caused?
They don't start breathing again once they pass out?
There are probably several regulars here with experience choking out children without killing them. Maybe they could make an instructional video.
No, it's not, and I say that as someone who despises TikTok.
The issue is two generations that are perpetually online, and desperately seeking validation and updoots through virtual sources that they can't get in their otherwise empty real lives. If some of them die because they do stupid shit while chasing those sweet dopamine hits, that's actually a bonus for the human race overall.
That's my take on it - rather than being punished for children's deaths, TikTok should be granted a bonus for each dumbass they remove from the gene pool.
Frankly, not enough people die eating spoonfuls of cinnamon, shoving Tide Pods up their asses, or doing whatever else is on the intertubes these days. Now excuse me, those goddamned kids are on my lawn again.
None of those things seem that dangerous.
Is it normal to let 8 year olds browse social media completely unsupervised?
Parents now blame social media when bad things happen.
Maybe they should monitor their young kids, or not give them devices that can access social media until they are older.
They used to blame television, movies, rock 'n' roll music, and peer pressure. Now they blame social media. Everything but themselves.
I blame Canada.
Darwin smiles.
TikTok performs an important service by weeding out the herd. Darwin Awards all around!
"addictive" = desirable
What they're saying is, nobody should produce a product or service that anybody wants.
Outlaw cell phones for anyone under 21
Common sense phone control.
I shall renew my call for "common sense posting control".
Social media users should be limited to one account, and only allowed to post once every 30 days.
They would be allowed to log in and look once per day, for a maximum of one hour.
Truth in advertising would require the Social Media Victims Law Center (SMVLC) to bill itself as 'a legal resource for parents who want to make money by blaming others for their own bad parenting.'
Should TikTok be liable for people who lose while competing for Darwin Awards?
Do they lose?
The federal government has the same policy as TikTok. The Volstead Act was written with help from chemists who knew that the lethal dose of water is usually 2 gallons; that law allowed about a shot of alcohol per 2 gallons. Like flagpole-sitting and dance marathons, water-drinking contests became a fad, but killed several people. Way more are still killed by the mandated policy of adding poison to alcohol and then not taxing it, perhaps as a death trap for libertarians.
How have we ever become so inane?
Time to thin the herd by putting free fentanyl, Tide Pods, and ammunition on the counters at every convenience store.
Within a couple of months, we'll be living large...
If the content is inherently dangerous, then maybe the video hosting site should be held liable to some degree. A blackout challenge feels only a little removed from video instructions on how to build bombs.
Dangerous stunts: Like texting your girlfriend just after she leaves the house, "She just left, come over"?
I am tired of blaming everything other than the individual for their own actions when they get hurt.
Today it's TikTok.
Yesterday it was Instagram (for body image issues).
Does anyone remember all the people who hurt themselves while playing Pokémon Go?
How about people hurting themselves while taking selfies?