Supreme Court To Hear 2 Cases About Social Media Moderation and Liability for Terrorism
Does Section 230 shield YouTube from lawsuits about recommendations? Can Twitter be forced to pay damages over the terrorists it hasn’t banned?

The Supreme Court is back in session this morning and has agreed to hear nine new cases, two of which relate to the extent to which online platforms can be held liable for terrorist recruitment efforts.
One of the cases will directly address the extent of the protections afforded by Section 230 of the Communications Decency Act of 1996. In Gonzalez v. Google, Reynaldo Gonzalez sued the company under Section 2333 of the federal Anti-Terrorism Act, claiming that YouTube's algorithms helped the Islamic State group radicalize and recruit terrorists through videos and that this led to the death of his daughter, Nohemi, in an ISIS attack on a Parisian bistro in 2015. Gonzalez argues that Google (which owns YouTube) could be held liable for damages under the act.
Under Section 230, Google is generally not legally liable for the content that third parties post online through its platforms. Lower courts have ruled against Gonzalez thus far. But the question presented by Gonzalez is whether Section 230 protects Google when YouTube's algorithmic program makes "targeted recommendations of information provided by another information content provider." In other words, when YouTube recommends videos to people who use the platform without those users actively searching for them, can Google then become liable for the content of said videos?
In the second case, Twitter v. Taamneh, the Court will consider, under the same section of the Anti-Terrorism Act, whether Twitter can be found to be aiding and abetting terrorists (and, as with the first case, be held liable for damages in civil court) because its service is used by terrorists, even though Twitter forbids such use and actively removes accounts of terrorists when they're found. The plaintiffs in this case are relatives of Nawras Alassaf, who was killed in a terrorist attack by an ISIS member at a nightclub in Istanbul, Turkey, in 2017. According to Twitter's petition, the terrorist responsible for the attack wasn't even using its service. The plaintiffs insist that because other terrorists have been found to be using Twitter, it can nevertheless be held liable for not taking enough "proactive" action to stop terrorists from accessing the platform.
Twitter v. Taamneh is not a Section 230–related case on the surface—though Twitter did try unsuccessfully to invoke its protections.
The U.S. 9th Circuit Court of Appeals ruled in January on both the Google and the Twitter cases in the same decision. The court decided that Google was shielded by Section 230 but that Twitter could be held financially liable for terrorists using its platform, even if the actual terrorist involved wasn't using Twitter. This decision linked the two cases together, which can explain why the Court took up both of them.
Some media coverage of the Supreme Court granting the cases today notes that Justice Clarence Thomas has been vocal that the Court will eventually have to examine the power Big Tech companies have over who has access to their platforms and whether Section 230 places any limits on what content or users companies may remove.
It doesn't seem like these are necessarily the types of cases Thomas meant, though the rulings may still be significant. Thomas and many other conservatives are worried about the power Twitter, Facebook, and other social media companies have to "deplatform" users based on their viewpoints. But in both of these cases, the two companies face lawsuits because they apparently didn't censor enough; both of these cases are about who Google and Twitter didn't deplatform.
Section 230 helps protect online speech. If Google loses its case, the probable outcome will be a significant reduction of video recommendations on YouTube, making it harder for people to find videos related to what they're viewing and harder for content providers to reach viewers. If Twitter loses its case, it will most likely result in even more account bans at even the slightest suggestion of anything violent or inappropriate.
Seeing as YouTube edits content, no they are not protected.
They are also thieves when they "demonitize" channels, as they still collect money from the advertisements; they just don't split it with the creator.
That is straight up fraud. If they want to demonitize, they should not be able to sell advertising at all.
"Publisher. Platform. Pick one." ... 'Cause Power Pig said so!
Your large and ugly punishment boner is showing!!! Be decent, and COVER UP, will ya?!?!?
If you want to love animals, pamper your pets. If you love to eat meat, eat meat. Pick one, ONLY one!
You either love animals, or you eat meat… You can NOT do both! All pet owners who eat meat? Their pets will be slaughtered and their pet-meat distributed to the poor! Because I and 51% of the voters said so! And because we are power pigs, and LOOOOOVE to punish people!
Dumb, bad analogy aside...
Honest question. If YouTube actively removes slanderous content about liberal public figures, but will not censor slanderous content about a conservative figure, why should they not be held accountable? By choosing to actively curate the content, how are they not endorsing what is allowed?
The word you want is "demonetize", but I like the concept of "demonitize", especially with Halloween coming up.
What's Hypmotize! chopped liver?
only on Valentime's Day.
A song? https://youtu.be/YOpYLKhUVxw
Twitter claims the power to ban anyone for any reason. If it knew about terrorists operating on its platform and did nothing to stop it, damn straight they should be liable for that. If they want the freedom to control their platform, they get the responsibility that comes with that.
Briggs Cunningham logic: Guns sellers knew that their guns would be used for murders. Therefor, let's feed the lawsuit lottery and the greedy lawyers, and pay $50,000 per gun, and $20 per round of ammo, to feed all of Briggs Cunningham's power lust, PLUS the lawyers!!!!
Buying a gun is not a crime. Terrorism is. If I walk into a gun shop and say "I need a shotgun for that armed robbery this afternoon, can you give me one" and the gun shop sells it to me knowing I am buying it to commit a crime, yes they are liable. If Twitter knows someone is using its platform to commit a crime and does nothing to stop it, they should be liable.
You are just retarded. All you do is shit all over the threads with your insanity.
"If Twitter knows someone is using its platform to commit a crime and does nothing to stop it, they should be liable."
The sellers of ink and paper KNOW damned well that THEIR products, too... Their "algorithms" that they promote, whereby people can and do pass data used to commit crimes... THEY, too, by the UTTER power-pig logic of Briggs Cunning-Cunt-ham, should be PUNISHED!!! As should farmers and grocers who KNOW that the foods that they produce and sell, will power the bodies of MURDERERS!!!! Power pigs have never known limits to their power-lusts!!!!
Buying a gun is not a crime and gun sellers generally don't demand the power to ban their customers "for any reason".
Come on, Sqrlsy - you can do better than that.
Narrator
"No he can't, he is brain damaged and a certifiable lunatic"
If power-pig logic can be applied to "topic A", then it will be applied to "topic B" as well. If you willy-nilly pussy-grab others, prepare to be pussy-grabbed right back!
The sellers of Q-tips WARN me NOT to use them to clean my ear canal... Since they have done that, can we now freely micro-manage them right back? Since they have tried to exert "editorial control" on our uses of Q-tips, we can pussy-grab the sellers of Q-tips at will? I told the clerk what I planned to do with that Q-tip, and she sold it to me anyway!!! SUE-SUE-SUE, ka-ching!!!!
If it knew about terrorists operating on its platform and did nothing to stop it, damn straight they should be liable for that.
How do you define terrorist? One man's terrorist is another man's freedom fighter.
That is like Shreek level stupid. If you are plotting violence on the platform and the platform lets it happen, it should be responsible.
All right-thinking folks know that ALL antifa sympathizers are a bunch of terrorists and that the Trumpanzees gone Apeshit on 6 January were "freedom fighters"! "Team D" = terrorists and "Team R" = "freedom fighters"!
Does that include freedom fighters plotting violence to Stop The Steal?
Leave the FBI out of this.
😉
I know you think that’s a gotcha, but one of the places where some of those idiots were planning stuff actually went to the FBI multiple times prior to J6.
Twitter and Facebook? Not so much.
Is it really "plotting violence" when everyone agrees that the people you're talking about are destroying the country?
Started drinking early did you?
You assumed he stopped?
JesseBahnFuhrer... Have you stopped beating your wife yet? Yes or no!!!
Oh that's easy. If it's brown Muslim people, they are terrorists. If they are white people wearing MAGA hats shouting "hang Mike Pence", they are freedom fighters.
“Hang Mike Pence”!!! Yes, this!!!
PS, MAGA = Making Attorneys Get Attorneys!!!
https://www.businessinsider.com/trump-lawyers-ethics-complaints-joke-maga-making-attorneys-get-attorneys-2022-9
With more than 40 Trump lawyers singled out for ethics complaints and even more facing charges, legal experts joke MAGA now stands for 'Making Attorneys Get Attorneys'
If they’re just shouting it, then they’re just citizens exercising their first amendment.
How do you define terrorist? One man’s terrorist is another man’s freedom fighter.
Good point. The media often refers to Palestinian terrorists as militants or simply Palestinians. The feds define a terrorist as someone using deadly force for political goals or to change social norms.
According to Twitter's petition, the terrorist responsible for the attack wasn't even using its service. The plaintiffs insist that because other terrorists have been found to be using Twitter, it can nevertheless be held liable for not taking enough "proactive" action to stop terrorists from accessing the platform.
...
But in both of these cases, the two companies face lawsuits because they apparently didn't censor enough; both of these cases are about who Google and Twitter didn't deplatform.
Cubby v. Compuserve. That is all.
A 1991 court decision is supposed to control the question about the constitutionality of a law passed in 1996?
Or are you trying to remind everyone that Cubby v Compuserve (along with Stratton Oakmont v Prodigy Services Co) is why Sec 230 was passed in the first place?
It can be two (or more) things. Primarily, Scott says “the two companies face lawsuits because they apparently didn’t censor enough”, S230 was passed for this purpose, specifically to obviate Cubby v. Compuserve. Further, Cubby, Gonzalez, Twitter, and Oakmont *could* have nothing to do with one another or *could* have precedent arbitrarily set and overruled between the four of them. The reason they are all definitively tied together is section 230.
If the terrorist in question didn't use Twitter, the first judge to see this case should have dismissed it.
If the question were about Twitter's culpability in the crime, sure. If it were about Twitter's responsibility to act as a good faith censor for speech on its platform... IDK. Now, I wonder where anyone would get the idea that platform providers are obligated to moderate their platforms in the public's best interest...
WTF is wrong with you Google and Twitter?
You built all these nice internet neighborhoods and now these ghetto-ass conservatives want to force their way in and bring your property values down?
Why can't you join the GOP wokes and allow the riff-raff in?
Twitter is literally 80% bots and another 10% garbage like you. Shreek, you are a pedophile, likely can't live within 1000 yards of a school. You have never lived in a nice neighborhood in your life.
Sbp loves Twitter because they have a ton of child porn on the platform.
Note to new people on the comments, spb posted links to child porn in the past. He is a terrible person and should be in prison.
"...can’t live within 1000 yards of a school"
Or a Chucky Cheese.
It's okay Disney will hire him
But the question presented by Gonzalez is whether Section 230 protects Google when YouTube's algorithmic program makes "targeted recommendations of information provided by another information content provider."
FYI, this is the subtle distinction that the Bros at *checks notes* tech-dirt never seem to understand. The promotion or "de-promotion" of a user-generated post on a platform relates to what the platform DID WITH THAT CONTENT vs the content sitting, neutrally somewhere on a forum. When the content is promoted and put in front of others' eyes, then that is argued that the platform is responsible, in some part for how the information is viewed.
Twitter v. Taamneh is not a Section 230–related case on the surface—though Twitter did try unsuccessfully to invoke its protections.
And this is another case of how the tech platforms just stand up and screech "section 230!!11! Section 230!!1!" anytime they find themselves in court.
But in both of these cases, the two companies face lawsuits because they apparently didn't censor enough; both of these cases are about who Google and Twitter didn't deplatform.
Right, which should make these cases of particular interest to liberals and progressives who have almost UNIVERSALLY been calling for the tech companies to censor more.
There is a big difference imo between an algorithm that causes harm (just saying this is what is alleged) and one that simply sorts (edits) content. The algorithm itself is clearly not user-generated content.
I'm pretty sure the SC will understand the difference here even if you don't. Maybe not.
Algorithm is shorthand for a human created rule set.
It's a form of editing, pure and simple.
Let's say I have a website called BillBook. And let's also say I ban every reported post talking about hydroxychloroquine as a treatment for COVID, but I leave up multiple reported threads planning to beat SQRLSY with baseball bats. It seems to me to defy credibility to claim that somehow I'm not participating in speech. The fact that I've exercised editorial control over the content of the site and used that editorial control to remove the hydroxychloroquine posts constitutes at least an implicit endorsement of the content I've chosen to keep up. It's clear I had the option of taking down the threads planning to beat SQRLSY with baseball bats, but decided that was worthwhile speech to keep up and publicize on my website.
So they have to censor everything, or nothing; but if they censor some things they are responsible for everything.
Possibly. Or there can be certain content-neutral rules (e.g. "Don't use the forum to plan violence") that apply universally. But, that's not what we face with social media giants. Here, they are accused of knowingly recommending a video recruiting people to engage in violence. And they, themselves, refer to what they do as "curation". At some point, I think it is fair to say that a curator is a participant in the expression.
I would prefer the rules just be specific. And if they choose to ignore one set of posts that are against the rules and selectively enforce the others, the rules become illegal. That is the problem. The selective enforcement of vague rules.
"The selective enforcement of vague rules."
THIS is why I encourage power pigs to contemplate the below!
OPEN QUESTIONS FOR ALL ENEMIES OF SECTION 230
The day after tomorrow, you get a jury summons. You will be asked to rule in the following case: A poster posted the following to social media: “Government Almighty LOVES US ALL, FAR more than we can EVER know!”
This attracted protests from liberals, who thought that they may have detected hints of sarcasm, which was hurtful, and invalidated the personhoods of a few Sensitive Souls. It ALSO attracted protests from conservatives, who were miffed that this was a PARTIAL truth only (thereby being at least partially a lie), with the REAL, full TRUTH AND ONLY THE TRUTH being, “Government Almighty of Der TrumpfenFuhrer ONLY, LOVES US ALL, FAR more than we can EVER know! Thou shalt have NO Government Almighty without Der TrumpfenFuhrer, for Our TrumpfenFuhrer is a jealous Government Almighty!”
Ministry of Truth, and Ministry of Hurt Baby Feelings, officials were consulted. Now there are charges!
QUESTIONS FOR YOU THE JUROR:
“Government Almighty LOVES US ALL”, true or false?
“Government Almighty LOVES US ALL”, hurtful sarcasm or not?
Will you be utterly delighted to serve on this jury? Keep in mind that OJ Simpson got an 11-month criminal trial! And a 4-month civil trial!
It never stops being a fucking retarded question that has nothing to do with the actual discussion being had.
So then, you look forward to being asked fucking retarded questions as a juror? And having your tax money wasted? Well, I guess you WOULD... Since YOU are fucking retarded!!!
Awww, is SQRLSY butt hurt because no one will play with him?
DesigNate is butt hurt and severely brain hurt because I kick the asses of the evil and stupid assholes who cannot and will not see their own blindness and power-lust.
I wonder if this is the time to *checks notes* re-re-re-re-remind Reason that the tech companies have been begging to have section 230 "updated" for some years now? You know, so they don't have the responsibility of having to monitor and ban conservatives and other assorted terrorists themselves. They can just make Zuckerberg's quiet admission (that I'm not sure if Reason even covered) official policy. "Hey, the FBI told us to!"
If we’re checking notes and re-re-re-re-reminding Reason about things, can we add to the notes that if various public school systems are banning Beloved in their elementary schools at the behest of the community, then Congress is rather unequivocally banning Conservative opinions and inconvenient facts, regardless of the various business decisions taking place in the back of Ali Baba’s Cave.
like just being terrorists on twitter or using twitter to terrorize?
There's no such thing as make-believe anymore. Whatever you say you are, you really are.
Thomas will back Gonzalez and Taamneh because Constitutional protections only apply to social media platforms in place at the time the BoR came into effect. Or some such spurious or preposterous rationalisation.
"Electrons change the fundamental nature of speech in a way that one of our country's most accomplished black men just wouldn't understand." - SRG
"even though Twitter forbids such use"
Many tech companies "forbid" children under 13 knowing that they will lie about their age.
What a shock that the entity most responsible has deep pockets.
Wait, am I reading this correctly? The terrorist who killed the plaintiffs' relatives did not use Twitter, but because other terrorists not involved in that killing did use Twitter, Twitter is somehow liable for what the first terrorist did? Whatever happened to the concept of causation?
Whatever happened to the concept of causation?
Section 230. Big Tech is not culpable for speech on their platforms based on actual crime or involvement or any action or even lack of action on their part. They're in charge of being good faith Samaritans censoring offensive material on the internet because Congress deigned it so.
Oh! So the theory is, “being exempted from liability for taking action implicitly makes you liable for failing to take action, even if the harm I suffer has no causal relationship to your failure to take action?”
That’s certainly creative.
I hope Twitter and Google aren't held liable for the terrorist recruiting, but their cases are hurt by all the times they banned other content.