Judge Dismisses Lawsuit Against TikTok Over Child's Death in Blackout Challenge
In his dismissal order, the judge cited Section 230, the law protecting websites from liability for user-generated content.

In December 2021, Tawainna Anderson experienced every parent's worst nightmare: She found her 10-year-old daughter, Nylah, hanging by a purse strap in a bedroom closet. Anderson called 911 and attempted CPR until the ambulance arrived, but after several days in intensive care, her daughter was pronounced dead.
Based on a search of her devices, police later determined the girl was attempting a viral challenge from TikTok, the social media video platform with over a billion monthly active users, 28 percent of whom are under the age of 20. TikTok challenges involve users recording themselves performing a particular act or feat and calling on other users to do it too. The videos typically feature a common hashtag to make them easy to search for. TikTok users also have a personalized "For You Page" (FYP), where the app curates suggested videos based on each user's viewing habits.
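The curation loop described above (hashtags for search, viewing history for the FYP) can be sketched as a minimal content-based filter. Everything here is an illustrative assumption, not TikTok's actual algorithm: the function names, the data shape, and the scoring rule (count how often a candidate video's hashtags appear in the user's watch history) are invented for the sketch.

```python
from collections import Counter

def recommend(watch_history, candidates, k=3):
    """Rank candidate videos by overlap between their hashtags and
    the hashtags of videos the user has already watched."""
    seen = Counter(tag for video in watch_history for tag in video["tags"])
    def score(video):
        # A video scores higher the more its tags match the user's history
        return sum(seen[tag] for tag in video["tags"])
    return sorted(candidates, key=score, reverse=True)[:k]

history = [{"id": "v1", "tags": ["dance", "challenge"]}]
pool = [{"id": "v2", "tags": ["cooking"]},
        {"id": "v3", "tags": ["dance", "challenge"]}]
print([v["id"] for v in recommend(history, pool, k=1)])  # → ['v3']
```

Real systems weight many more signals (watch time, rewatches, shares), which is why a feed like the FYP can surface content a user never searched for.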
The so-called "blackout challenge" involves participants asphyxiating themselves with household objects until they pass out; according to the Centers for Disease Control and Prevention (CDC), a version of the challenge existed in some form at least as far back as 1995. Nylah watched blackout challenge videos before apparently attempting it herself. Since 2021, at least eight children have died after allegedly attempting the challenge.
Earlier this year, Anderson sued TikTok and its parent company, ByteDance, for wrongful death, negligence, and strict products liability. The suit claims the "predatory and manipulative app and algorithm…pushed exceedingly and unacceptably dangerous challenges and videos to Nylah's FYP, thus encouraging her to engage and participate."
Ultimately, it contends, TikTok's "algorithm determined that the deadly Blackout
Challenge was well-tailored and likely to be of interest to 10-year-old Nylah Anderson, and she died as a result."
In July, Reason reported on a similar lawsuit:
Even if TikTok were directing dangerous content to people's FYPs, it's not clear whether the platform could be found legally liable at all. Section 230 of the Communications Decency Act protects online services from legal liability for content posted by its users. While TikTok may host the videos, the law states that it cannot "be treated as the publisher or speaker" of any content provided by a user.
Just as with that suit, Anderson's lawsuit tried to circumvent Section 230's liability protections by stipulating that it "does not seek to hold [defendants] liable as the speaker or publisher of third-party content and instead intends to hold [them] responsible for their own independent conduct as the designers, programmers, manufacturers, sellers, and/or distributors."
This week, Judge Paul Diamond of the U.S. District Court for the Eastern District of Pennsylvania dismissed the case, citing Section 230 as the justification. Diamond decided that Anderson "cannot defeat Section 230 immunity…by creatively labeling her claims." He determined, "Although Anderson recasts her content claims by attacking Defendants' 'deliberate action' taken through their algorithm…courts have repeatedly held that such algorithms are 'not content in and of themselves.'"
Citing previous case law, Diamond contends that "Congress conferred this immunity 'to maintain the robust nature of Internet communication and, accordingly, to keep government interference in the medium to a minimum'…. It recognized that because of the 'staggering' amount of information communicated through interactive computer services, providers cannot prescreen each message they republish…. Accordingly, Congress conferred immunity on providers to encourage them not to restrict unduly the number and nature of their postings."
Indeed, given the sheer number of users, it would be impossible for TikTok to screen every single video for objectionable content. (The platform encourages content creators to post 1–4 times per day for maximum reach.) A study published this year concluded that while "TikTok removes challenges reported as dangerous and has increased safety controls," the sheer volume of content posted every day makes the task difficult.
Nylah Anderson's death is a tragedy, and her mother's pain is immeasurable. But it's far from evident that TikTok is uniquely liable for what happened, nor is it clear that it should bear any legal responsibility.
All this criticism of TikTok is rooted in Sinophobia. We saw the same phenomenon in 2020 when racists blamed China for a virus that was 100% the fault of Trump and DeSantis.
#LibertariansAgainstSinophobia
#(HateRussiaInstead)
Citing previous case law, Diamond contends that "Congress conferred this immunity 'to maintain the robust nature of Internet communication and, accordingly, to keep government interference in the medium to a minimum'….
Which is why the behemoth Communications Decency Act was required... just to get section 230.
Yeah, it's a total nonsense misreading of the Communications Decency Act legislative history to pretend that Congress was trying to keep government interference on the Internet to a minimum. The whole thing was the government trying to interfere, and enable others to interfere to achieve the aims of the government. Section 230 wasn't included to entrench Cubby v. CompuServe by statute; it was included to reverse Stratton Oakmont v. Prodigy.
Still, when it comes to Section 230's text, rather than history, the judge got it right.
Create your own stupid human trick.
I've been thinking TikTok needs a "Bear Spray Challenge".
FYI, there are some legal minds (or, as the journlisming class likes to say, "experts say") who hold that while Section 230 does provide the platform immunity for the content posted, it does NOT provide immunity based on what the platform does with that content. As YouTube discovered when Section 230 did NOT provide them with immunity: if they promote certain content, they can run afoul of the law. Or at least be subject to civil suits.
Yes. I don't know how this should be handled, but I think this argument is more colorable than holding them accountable for things uploaded to the website.
The real issue is that this recommendation system is the entire business model, and so it cuts to the quick of what they do.
It is a hard question though, I just think it's reasonable that this is a different question than 230 protections.
If they had a friend who kept encouraging them to do this exact challenge, and they finally did it would that friend have been held accountable absent malice? I actually don't know.
It is a hard question though,
Disagree. Remember, we're in the process of finding Alex Jones guilty to the tune of (*checks news feed*) $2.75 t-r-i-l-l-i-o-n dollars(!) for a speech issue that didn't directly or indirectly kill anyone.
It directly hurt my feelings. Where my check at?
I imagine that's going to get thrown out, because it is retarded. We'll see what the actual damages are. But if not, I would agree it's ridiculous precedent.
Uh, the jury already decided in favor of $965M in damages... in CT. I could be wrong, but I believe a jury in TX decided in favor of $50M.
The actual damages are zero. They never were asked to prove damages beyond he made them cry.
The case is more effed up than that. They had 5 weeks of trial before Alex Jones showed up, apparently.
The most important damage. Except maybe feet on Pelosi's desk.
I can remember 2 people charged and convicted for convincing someone to commit suicide through repeated communications.
right, I don't have a feeling on this case one way or another. It probably is a section 230 case, fair and square. But there have been questions raised that if Shrike posts child pornography to youtube, and youtube runs ads on that content and makes money with it while promoting it to teenagers... they're on very dangerous ground.
There's also just the simple fact that if people get irritated enough about shit like this, and the tone that a lot of tech has been taking, that eventually it will just get slapped down anyway.
The real question to me isn't running ads against content though, it's that this was definitely recommended to the user directly. This is like the "You May Also Enjoy" sidebar on Youtube, or the autoplay next in Youtube. At that point, you are being shown something you wouldn't otherwise know of, and there is some question of whether that counts as promotion. I think that is not an absurd argument.
It's more relevant because this type of thing (https://en.wikipedia.org/wiki/Recommender_system) wasn't super common when 230 was written. And it is quite different from the commenter commenting on a message board protection that 230 is built around.
I really don't know though. I can see this both ways.
It’s more relevant because this type of thing (https://en.wikipedia.org/wiki/Recommender_system) wasn’t super common when 230 was written. And it is quite different from the commenter commenting on a message board protection that 230 is built around.
Exactly. If I write a long, lurid comment in the form of a short story containing child pornographic content, with pictures, and Reason says, "Hey! We'd like to publish this as a feature article!" and without changing a single word or pixel of my comment does so, I don't think they get section 230 protections. But I'm no lawyer (or Doctor).
It’s more relevant because this type of thing (https://en.wikipedia.org/wiki/Recommender_system) wasn’t super common when 230 was written. And it is quite different from the commenter commenting on a message board protection that 230 is built around.
I really don’t know though. I can see this both ways.
Thing is, before S230, you could clearly have it both ways, as demonstrated by Compuserve and Prodigy, and S230 was specifically written so that users and content contributors cannot. Not every judge parsing recommender algorithms is going to get every call correct, but it’s better than the “All recommender systems and, indeed any publishing system, implemented in good faith will always work as intended.” edict written into law by two congressidiots.
Again, no self-respecting software engineer would in good conscience write a destroyBaghdad() function. They’d write a destroyCity() function and pass Baghdad as an argument.
If I've subscribed to something it's on me.
If the locus of people I follow logically expands to something, that's on me to be discriminating.
If a platform shows me naturally popular or most followed items, that's neutral and no harm, no foul.
If a platform actively pushes something at me how is that not on them to at least screen it for appropriateness.
I cannot tell the specifics here other than the tragic and that we are nowhere near common sense for a free society.
If a platform shows me naturally popular or most followed items, that’s neutral and no harm, no foul.
Define 'naturally popular'. I'm sure someone like Elon Musk (or any other investor or business associate) would be rightly and righteously interested to know exactly how many bots, operated by whom, push 'naturally popular' one way or the other.
I find it easy. Fine them 100 billion. TikTok is a CCP keylogger and should be treated as the spyware it is. The US should go full lawfare on them.
I believe we have a new damages precedent of $2.75 Trillion, so, let's just go ahead and solve a reasonable chunk of the national debt while we're at this... TikTok owes this poor woman $10 and the US Government $5 Trillion.
Given the reliance on hash tags, it would seem “easy” to allow users to flag them for dangerous material. And then review those that reach a threshold.
But that doesn’t solve the more native clicking algorithms.
Short answer is no tik tok for kids.
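The flag-and-threshold idea a few comments up can be sketched in a few lines. The threshold value, the class name, and the hashtags are all illustrative assumptions; counting distinct flaggers rather than raw flag events is one design choice that blunts both single-user spam and trivial brigading, though it hardly solves the abuse problem the reply below raises.

```python
from collections import defaultdict

class FlagQueue:
    """Toy moderation queue: track which distinct users flagged each
    hashtag and surface tags whose flagger count meets a threshold."""
    def __init__(self, threshold=5):
        self.threshold = threshold
        self.flaggers = defaultdict(set)  # hashtag -> set of user ids

    def flag(self, hashtag, user_id):
        self.flaggers[hashtag].add(user_id)

    def needs_review(self):
        # Only hashtags flagged by enough *different* users escalate
        return [tag for tag, users in self.flaggers.items()
                if len(users) >= self.threshold]

q = FlagQueue(threshold=2)
q.flag("#riskystunt", "user_a")
q.flag("#riskystunt", "user_a")   # repeat flags from one user don't count twice
q.flag("#riskystunt", "user_b")
q.flag("#harmless", "user_c")
print(q.needs_review())  # → ['#riskystunt']
```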
>allow users to flag them for dangerous material.
Yeah, that totes won't get abused at all. Not one bit.
Your short answer is the right answer. What parent lets their 10 year old watch that trash unsupervised? In the analog age, as an analogy, I'd let my kid walk home from school alone, but I sure wouldn't have him wandering around the red light district.
The internet is not for children.
Short answer is no tik tok for kids.
Liable or not, no TikTok for anyone. It's Chinese spyware exploiting our own government to encourage kids to kill themselves and adults to act like 'burn shit down', anti-social, über dysmorphic psychotics.
She found her 10-year-old daughter, Nylah, hanging by a purse strap in a bedroom closet.
So, anybody want to have a crack at clearing up the stolen base for me here?
I understand that the attorneys may not have asserted ByteDance's culpability under '98 COPPA, but that doesn't make the judge's assertions about S230 correct and, more critically, did someone download and activate the TikTok account for these kids, or did it come pre-installed and activated on whatever device they were using?
Whoops, was not meant in reply.
Either way, seems fairly clear/unequivocal that ByteDance failed to include instruction on "when and how to seek verifiable consent from a parent or guardian".
including children outside the U.S. if the website or service is U.S.-based.
Not a problem for Tik Tok I'd guess.
I'm not saying you're wrong, but that's unbelievably fucked up if true. Corporations, owned and/or operated by adversarial governments, get 1A protections *from American citizens by our own government* and laws that our own corporations are subject to.
It's more like the anti-Jones Act/Pro-Chinese Corporatism Law of the internet.
Absent a human making a decision as to what to promote, it would seem to fall entirely into the no-liability zone of Cubby v. CompuServe, as a matter of automatic distribution rather than editorial choice to publish. Just because the algorithm distributes the content non-identically to users doesn't change that the distributor didn't know and had no reason to know what the content was.
So, we'll see what the Supreme Court says in Gonzalez v. Google, the YouTube case it just accepted this month, but I expect that regardless of Section 230, established First Amendment case law (like Cubby) will hold them non-liable.
I think you've got a Hihnfection. This is as ass-backwards a reading of the plain facts of the article and the case law as his assertion that Scalia was making gun control arguments in Heller.
Hmm? What, exactly, am I getting wrong about my reading of Cubby v. CompuServe?
I mean, here's a direct quote from that decision --
That's based on the Supreme Court precedent in Smith v. California (1959), where it was held that a bookstore owner could not be convicted of obscenity unless it could be shown he knew of the contents of the obscene book he was selling.
That is, the existing First Amendment precedents, preceding the existence of, and thus independent of, Section 230, establish that knowledge of contents is a prerequisite for holding a distributor liable.
Now, it's of course possible that the Supreme Court will not apply that reasoning to Gonzalez v. Google. Cubby isn't a Supreme Court precedent, and it's three decades old, based on six-decades-old Supreme Court precedent. And there are plenty of factual differences along the line from a bookstore owner being held criminally liable for carrying obscene books (Smith), to a computer information service not being held responsible for content posted to an unmoderated forum (CompuServe), to a major website being held civilly liable for algorithmic promotion of terrorist content (Google).
But the way I'd bet is that if the Supreme Court decides Section 230 doesn't protect Google (in the YouTube case, and thus TikTok in this case), they'll declare the First Amendment does.
What, exactly, am I getting wrong about my reading of Cubby v. CompuServe?
You mean other than reading words and intent that plainly aren't even in your own source material?
Ctrl+f "absent": 0 results
Ctrl+f "human": 0 results
Ctrl+f "making": 1 result
Ctrl+f "decision": 0 results
Weird how you start off with "human making a decision" when your source makes no such mention. Almost like you're deliberately reading the decision out of context with an intent in mind. I mean you not-even-subtly conflate the liability for (e.g.) "making a decision to murder", in your words, and "must have knowledge of a murder", in the Smith decision.
With such an obviously inaccurate reading of your own source material and an obviously false conflation between any and all forms of direct liability, indirect liability, and non-liability, why should anyone regard you as anything other than an evil fucking asshat actively seeking to disinform people?
Night of the Living Hihndead?
Since autoerotic asphyxiation is a thing going back at least several thousand years (and so is the history of accidentally dying from it), I fail to see how this is either avoidable or anyone's fault.
Probably the 10-year old was exploring her possibilities in the transgender realm. This is the proper basis for blaming TikTok.
That's tightening a belt around your dick until it falls off.
Indeed, given the sheer number of users, it would be impossible for TikTok to screen every single video for objectionable content.
The real question is: do platforms have responsibility for actively suggesting videos to people? At what point does the distinction between platform and publisher get crossed when content is being actively pushed toward users?
I do not know the answer here, but I do think this is a more complicated question than most of the concerns around 230.
>>a version of the challenge existed in some form at least as far back as
Michael Hutchence
I think that's INXS-ive.
still love them so much. best hippie band of the '80s
What other hippie bands were there? Edie Brickell??
I guess you could count the B-52s.
lol my first apartment in Dallas had been previously occupado by Edie Brickell … and there really weren’t any new hippie bands in the 80s the whole love idea had kinda died in America … maybe REM but before Green? the inxs lyrics spoke to me
EDIT: also I love the B-52s. she came from Planet Claire.
True Fact - I am MoDean.
goin' to the store for hot dogs and wine!
their songs were perfect for our garage band.
maybe REM
Someone had to light the Gen X signal...
>>Gen X signal
wearing Dead Kennedys in my office chair if that helps.
B-52s were more a pre hippie band. Sort of campy garage-surf rock
10000 maniacs.
'Beautiful Girl' was Mrs. Casual (and I)'s wedding song. I refrained from suggesting 'Need You Tonight' in front of my FIL.
lovely song for a wedding. well played.
Without knowing everyone's race, I can't confirm whether Tawainna Anderson was blond or not.
wish the poll results of "who got this?" were more readily available.
Every single one of us?
Skin color is the most important thing
Huh...
1. Would
2. Nuh Uh
— Lying Jeffy
3. She should sue over the pledge
She should sue over the pledge
If only more grad students had the means, fiscal and intellectual, to hold their education programs responsible for the success of their degree(s). If your degree kept you so busy making money hand over fist, you'd have no time, or grounds, to sue.
Sign the pledge. Finish school. Get a job. Violate the pledge at every turn. Counsel everyone, Trump supporter or not, that they are not victims and they’re better than what Antioch and systems like theirs try to paint them as. Make the University choose to, in a court of law, disclose that, in their psychology program, they encourage the neutering of children and forcing black people into victim mindsets.
The video demonstrates hard work in at least a modicum of good faith. As little as I value psychology as a science and/or profession, it would be a shame to throw away hard work and definitively sink those costs when you’re still the one in charge of how they pay out.
Looks official mother fuckers!
https://www.npr.org/2022/10/27/1132153277/elon-musk-takes-control-of-twitter-and-immediately-ousts-top-executives
Wonder if average Twitter links per day in the roundup will decrease? Anyone willing to count the number last 30 days, I’ll count the next 30.
Or maybe a 10 day grace each way? A lot of liberal tears the next 10 days before they bail?
No Lou Reed memorial post? Or was Nick too much in mourning?
To me, this is a simple drafting error by plaintiffs’ attorney.
What they should have done is to sue TikTok and "John Doe," defined as the person who posted the challenge. Then they should demand that TikTok identify that person in order to receive the protection of Sec. 230. They do, that person is served, and made to pay, because he is the author of the material that killed plaintiffs' daughter.
If TikTok doesn't know the poster's identity then I'd hold them liable.