Reason.com - Free Minds and Free Markets

Supreme Court To Hear 2 Cases About Social Media Moderation and Liability for Terrorism

Does Section 230 shield YouTube from lawsuits about recommendations? Can Twitter be forced to pay damages over the terrorists it hasn’t banned?

Scott Shackford | 10.3.2022 2:00 PM

(Image: YouTube recommendations. Sebastian Czapnik / Dreamstime.com)

The Supreme Court is back in session this morning and has agreed to hear nine new cases, two of which concern the extent to which online platforms can be held liable for terrorist recruitment efforts.

One of the cases will directly address the scope of Section 230 of the Communications Decency Act of 1996. In Gonzalez v. Google, Reynaldo Gonzalez sued the company under Section 2333 of the federal Anti-Terrorism Act, claiming that YouTube's algorithms helped the Islamic State group radicalize and recruit terrorists through videos, and that this led to the death of his daughter, Nohemi, in an ISIS attack on a Parisian bistro in 2015. Gonzalez argues that Google (which owns YouTube) can be held liable for damages under the act.

Under Section 230, Google is generally not legally liable for content that third parties post on its platforms. Lower courts have ruled against Gonzalez thus far. But the question presented by Gonzalez is whether Section 230 protects Google when YouTube's algorithm makes "targeted recommendations of information provided by another information content provider." In other words, when YouTube recommends videos to users who have not actively searched for them, can Google become liable for the content of those videos?

In the second case, Twitter v. Taamneh, the Court will consider, under the same section of the Anti-Terrorism Act, whether Twitter can be found to have aided and abetted terrorists (and, as in Gonzalez, be held liable for damages in civil court) because terrorists use its service, even though Twitter forbids such use and removes terrorists' accounts when it finds them. The plaintiffs are relatives of Nawras Alassaf, who was killed in a 2017 terrorist attack by an ISIS member at a nightclub in Istanbul, Turkey. According to Twitter's petition, the terrorist responsible for the attack wasn't even using its service. The plaintiffs insist that because other terrorists have been found using Twitter, the company can nevertheless be held liable for not taking enough "proactive" action to keep terrorists off the platform.

Twitter v. Taamneh is not a Section 230–related case on the surface—though Twitter did try unsuccessfully to invoke its protections.

The U.S. Court of Appeals for the 9th Circuit ruled on both the Google and the Twitter cases in a single decision in January. The court held that Google was shielded by Section 230 but that Twitter could be held financially liable for terrorists using its platform, even though the actual attacker wasn't using Twitter. That shared decision linked the two cases together, which may explain why the Court took up both of them.

Some media coverage of the Supreme Court granting the cases today notes that Justice Clarence Thomas has been vocal that the Court will eventually have to examine the power Big Tech companies hold over who has access to their platforms and whether Section 230 places any limits on what content or users companies may remove.

It doesn't seem like these are necessarily the types of cases Thomas meant, though the rulings may still be significant. Thomas and many other conservatives worry about the power Twitter, Facebook, and other social media companies have to "deplatform" users based on their viewpoints. But in both of these cases, the companies face lawsuits because they apparently didn't censor enough; both cases are about whom Google and Twitter didn't deplatform.

Section 230 helps protect online speech. If Google loses its case, the probable outcome is a significant reduction in video recommendations on YouTube, making it harder for people to find videos related to what they're viewing and harder for content providers to reach viewers. If Twitter loses its case, the most likely result is even more account bans at even the slightest suggestion of anything violent or inappropriate.


Scott Shackford is a policy research editor at Reason Foundation.


Comments (66)

  1. rev-arthur-l-kuckland   3 years ago

    Seeing as YouTube edits content, no they are not protected.
    They are also thieves when they "demonitize" channels, as they still collect money from the advertisements; they just don't split it with the creator.

    1. Briggs Cunningham   3 years ago

      They are also thieves when they "demonitize" channels, as they still collect money from the advertisements; they just don't split it with the creator.

      That is straight up fraud. If they want to demonitize, they should not be able to sell advertising at all.

    2. SQRLSY One   3 years ago

      "Publisher. Platform. Pick one." ... 'Cause Power Pig said so!
      Your large and ugly punishment boner is showing!!! Be decent, and COVER UP, will ya?!?!?

      If you want to love animals, pamper your pets. If you love to eat meat, eat meat. Pick one, ONLY one!

      You either love animals, or you eat meat… You can NOT do both! All pet owners who eat meat? Their pets will be slaughtered and their pet-meat distributed to the poor! Because I and 51% of the voters said so! And because we are power pigs, and LOOOOOVE to punish people!


      2. Junkmailfolder   3 years ago

        Dumb, bad analogy aside...

        Honest question. If YouTube actively removes slanderous content about liberal public figures, but will not censor slanderous content about a conservative figure, why should they not be held accountable? By choosing to actively curate the content, how are they not endorsing what is allowed?

    3. Vernon Depner   3 years ago

      The word you want is "demonetize", but I like the concept of "demonitize", especially with Halloween coming up.

      1. Diane Reynolds (Paul.)   3 years ago

        What's Hypmotize! chopped liver?

        1. Dillinger   3 years ago

          only on Valentime's Day.

        2. MK Ultra   3 years ago

          A song? https://youtu.be/YOpYLKhUVxw

  2. Briggs Cunningham   3 years ago

    Twitter claims the power to ban anyone for any reason. If it knew about terrorists operating on its platform and did nothing to stop it, damn straight they should be liable for that. If they want the freedom to control their platform, they get the responsibility that comes with that.

    1. SQRLSY One   3 years ago

      Briggs Cunningham logic: Guns sellers knew that their guns would be used for murders. Therefor, let's feed the lawsuit lottery and the greedy lawyers, and pay $50,000 per gun, and $20 per round of ammo, to feed all of Briggs Cunningham's power lust, PLUS the lawyers!!!!

      1. Briggs Cunningham   3 years ago

        Buying a gun is not a crime. Terrorism is. If I walk into a gun shop and say "I need a shotgun for that armed robbery this afternoon, can you give me one" and the gun shop sells it to me knowing I am buying it to commit a crime, yes they are liable. If Twitter knows someone is using its platform to commit a crime and does nothing to stop it, they should be liable.

        You are just retarded. All you do is shit all over the threads with your insanity.

        1. SQRLSY One   3 years ago

          "If Twitter knows someone is using its platform to commit a crime and does nothing to stop it, they should be liable."

          The sellers of ink and paper KNOW damned well that THEIR products, too... Their "algorithms" that they promote, whereby people can and do pass data used to commit crimes... THEY, too, by the UTTER power-pig logic of Briggs Cunning-Cunt-ham, should be PUNISHED!!! As should farmers and grocers who KNOW that the foods that they produce and sell, will power the bodies of MURDERERS!!!! Power pigs have never known limits to their power-lusts!!!!

      2. Rossami   3 years ago

        Buying a gun is not a crime and gun sellers generally don't demand the power to ban their customers "for any reason".

        Come on, Sqrlsy - you can do better than that.

        1. Briggs Cunningham   3 years ago

          Narrator

          "No he can't, he is brain damaged and a certifiable lunatic"

        2. SQRLSY One   3 years ago

          If power-pig logic can be applied to "topic A", then it will be applied to "topic B" as well. If you willy-nilly pussy-grab others, prepare to be pussy-grabbed right back!

          The sellers of Q-tips WARN me NOT to use them to clean my ear canal... Since they have done that, can we now freely micro-manage them right back? Since they have tried to exert "editorial control" on our uses of Q-tips, we can pussy-grab the sellers of Q-tips at will? I told the clerk what I planned to do with that Q-tip, and she sold it to me anyway!!! SUE-SUE-SUE, ka-ching!!!!

    2. sarcasmic   3 years ago

      If it knew about terrorists operating on its platform and did nothing to stop it, damn straight they should be liable for that.

      How do you define terrorist? One man's terrorist is another man's freedom fighter.

      1. Briggs Cunningham   3 years ago

        That is like Shreek level stupid. If you are plotting violence on the platform and the platform lets it happen, it should be responsible.

        1. SQRLSY One   3 years ago

          All right-thinking folks know that ALL antifa sympathizers are a bunch of terrorists and that the Trumpanzees gone Apeshit on 6 January were "freedom fighters"! "Team D" = terrorists and "Team R" = "freedom fighters"!

        2. sarcasmic   3 years ago

          Does that include freedom fighters plotting violence to Stop The Steal?

          1. VULGAR MADMAN   3 years ago

            Leave the FBI out of this.

            1. Ersatz   3 years ago

              😉

          2. DesigNate   3 years ago

            I know you think that’s a gotcha, but one of the places where some of those idiots were planning stuff actually went to the FBI multiple times prior to J6.

            Twitter and Facebook? Not so much.

        3. sarcasmic   3 years ago

          Is it really "plotting violence" when everyone agrees that the people you're talking about are destroying the country?

          1. VULGAR MADMAN   3 years ago

            Started drinking early did you?

            1. JesseAz   3 years ago

              You assumed he stopped?

              1. SQRLSY One   3 years ago

                JesseBahnFuhrer... Have you stopped beating your wife yet? Yes or no!!!

      2. chemjeff radical individualist   3 years ago

        Oh that's easy. If it's brown Muslim people, they are terrorists. If they are white people wearing MAGA hats shouting "hang Mike Pence", they are freedom fighters.

        1. SQRLSY One   3 years ago

          “Hang Mike Pence”!!! Yes, this!!!

          PS, MAGA = Making Attorneys Get Attorneys!!!

          https://www.businessinsider.com/trump-lawyers-ethics-complaints-joke-maga-making-attorneys-get-attorneys-2022-9

          With more than 40 Trump lawyers singled out for ethics complaints and even more facing charges, legal experts joke MAGA now stands for 'Making Attorneys Get Attorneys'

        2. DesigNate   3 years ago

          If they’re just shouting it, then they’re just citizens exercising their first amendment.

      3. USA_Jew   3 years ago

        How do you define terrorist? One man’s terrorist is another man’s freedom fighter.
        Good point. The media often refers to Palestinian terrorists as militants or simply Palestinians. The feds define a terrorist as someone using deadly force for political goals or to change social norms.

  3. mad.casual   3 years ago

    According to Twitter's petition, the terrorist responsible for the attack wasn't even using its service. The plaintiffs insist that because other terrorists have been found to be using Twitter, it can nevertheless be held liable for not taking enough "proactive" action to stop terrorists from accessing the platform.
    ...
    But in both of these cases, the two companies face lawsuits because they apparently didn't censor enough; both of these cases are about who Google and Twitter didn't deplatform.

    Cubby v. Compuserve. That is all.

    1. Rossami   3 years ago

      A 1991 court decision is supposed to control the question about the constitutionality of a law passed in 1996?

      Or are you trying to remind everyone that Cubby v Compuserve (along with Stratton Oakmont v Prodigy Services Co) is why Sec 230 was passed in the first place?

      1. mad.casual   3 years ago (edited)

        It can be two (or more) things. Primarily, Scott says “the two companies face lawsuits because they apparently didn’t censor enough”, S230 was passed for this purpose, specifically to obviate Cubby v. Compuserve. Further, Cubby, Gonzalez, Twitter, and Oakmont *could* have nothing to do with one another or *could* have precedent arbitrarily set and overruled between the four of them. The reason they are all definitively tied together is section 230.

    2. CE   3 years ago

      If the terrorist in question didn't use Twitter, the first judge to see this case should have dismissed it.

      1. mad.casual   3 years ago

        If the question were about Twitter's culpability in the crime, sure. If it were about Twitter's responsibility to act as a good faith censor for speech on its platform... IDK. Now, I wonder where anyone would get the idea that platform providers are obligated to moderate their platforms in the public's best interest...

  4. Sarah Palin's Buttplug 2   3 years ago

    WTF is wrong with you Google and Twitter?

    You built all these nice internet neighborhoods and now these ghetto-ass conservatives want to force their way in and bring your property values down?

    Why can't you join the GOP wokes and allow the riff-raff in?

    1. Briggs Cunningham   3 years ago

      Twitter is literally 80% bots and another 10% garbage like you. Shreek, you are a pedophile, likely can't live within 1000 yards of a school. You have never lived in a nice neighborhood in your life.

      1. rev-arthur-l-kuckland   3 years ago (edited)

        Sbp loves Twitter because they have a ton of child porn on the platform.

        Note to new people on the comments, spb posted links to child porn in the past. He is a terrible person and should be in prison.

      2. Quo Usque Tandem   3 years ago

        "...can’t live within 1000 yards of a school"

        Or a Chucky Cheese.

        1. rev-arthur-l-kuckland   3 years ago

          It's okay Disney will hire him

  5. Diane Reynolds (Paul.)   3 years ago

    But the question presented by Gonzalez is whether Section 230 protects Google when YouTube's algorithmic program makes "targeted recommendations of information provided by another information content provider."

    FYI, this is the subtle distinction that the Bros at *checks notes* tech-dirt never seem to understand. The promotion or "de-promotion" of a user-generated post on a platform relates to what the platform DID WITH THAT CONTENT vs the content sitting neutrally somewhere on a forum. When the content is promoted and put in front of others' eyes, then it is arguable that the platform is responsible, in some part, for how the information is viewed.

    Twitter v. Taamneh is not a Section 230–related case on the surface—though Twitter did try unsuccessfully to invoke its protections.

    And this is another case of how the tech platforms just stand up and screech "section 230!!11! Section 230!!1!" anytime they find themselves in court.

    But in both of these cases, the two companies face lawsuits because they apparently didn't censor enough; both of these cases are about who Google and Twitter didn't deplatform.

    Right, which should make these cases of particular interest to liberals and progressives who have almost UNIVERSALLY been calling for the tech companies to censor more.

    1. JFree   3 years ago

      There is a big difference imo between an algorithm that causes harm (just saying this is what is alleged) and one that simply sorts (edits) content. The algorithm itself is clearly not user-generated content.

      I'm pretty sure the SC will understand the difference here even if you don't. Maybe not.

      1. ThomasD   3 years ago

        Algorithm is shorthand for a human created rule set.

        It's a form of editing, pure and simple.

  6. Bill Dalasio   3 years ago

    Let's say I have a website called BillBook. And let's also say I ban every reported post talking about hydroxychloroquine as a treatment for COVID, but I leave up multiple reported threads planning to beat SQRSY with baseball bats. It seems to me to defy credibility to claim that somehow I'm not participating in speech. The fact that I've exercised editorial control over the content of the site and used that editorial control to remove the hydroxychloroquine posts constitutes at least an implicit endorsement of the content I've chosen to keep up. It's clear I had the option of taking down the threads planning to beat SQRLSY with baseball bats, but decided that was worthwhile speech to keep up and publicize on my website.

    1. Quo Usque Tandem   3 years ago

      So they have to censor everything, or nothing; but if they censor some things they are responsible for everything.

      1. Bill Dalasio   3 years ago

        Possibly. Or there can be certain content-neutral rules (e.g. "Don't use the forum to plan violence") that apply universally. But, that's not what we face with social media giants. Here, they are accused of knowingly recommending a video recruiting people to engage in violence. And they, themselves, refer to what they do as "curation". At some point, I think it is fair to say that a curator is a participant in the expression.

      2. JesseAz   3 years ago

        I would prefer the rules just be specific. And if they choose to ignore one set of posts that are against the rules and selectively enforce the others, the rules become illegal. That is the problem. The selective enforcement of vague rules.

        1. SQRLSY One   3 years ago

          "The selective enforcement of vague rules."

          THIS is why I encourage power pigs to contemplate the below!

          OPEN QUESTIONS FOR ALL ENEMIES OF SECTION 230

          The day after tomorrow, you get a jury summons. You will be asked to rule in the following case: A poster posted the following to social media: “Government Almighty LOVES US ALL, FAR more than we can EVER know!”

          This attracted protests from liberals, who thought that they may have detected hints of sarcasm, which was hurtful, and invalidated the personhoods of a few Sensitive Souls. It ALSO attracted protests from conservatives, who were miffed that this was a PARTIAL truth only (thereby being at least partially a lie), with the REAL, full TRUTH AND ONLY THE TRUTH being, “Government Almighty of Der TrumpfenFuhrer ONLY, LOVES US ALL, FAR more than we can EVER know! Thou shalt have NO Government Almighty without Der TrumpfenFuhrer, for Our TrumpfenFuhrer is a jealous Government Almighty!”

          Ministry of Truth, and Ministry of Hurt Baby Feelings, officials were consulted. Now there are charges!

          QUESTIONS FOR YOU THE JUROR:

          “Government Almighty LOVES US ALL”, true or false?

          “Government Almighty LOVES US ALL”, hurtful sarcasm or not?

          Will you be utterly delighted to serve on this jury? Keep in mind that OJ Simpson got an 11-month criminal trial! And a 4-month civil trial!

          1. DesigNate   3 years ago

            It never stops being a fucking retarded question that has nothing to do with the actual discussion being had.

            1. SQRLSY One   3 years ago

              So then, you look forward to being asked fucking retarded questions as a juror? And having your tax money wasted? Well, I guess you WOULD... Since YOU are fucking retarded!!!

              1. DesigNate   3 years ago

                Awww, is SQRLSY butt hurt because no one will play with him?

                1. SQRLSY One   3 years ago

                  DesigNate is butt hurt and severely brain hurt because I kick the asses of the evil and stupid assholes who cannot and will not see their own blindness and power-lust.

  7. Diane Reynolds (Paul.)   3 years ago

    I wonder if this is the time to *checks notes* re-re-re-re-remind Reason that the tech companies have been begging to have section 230 "updated" for some years now? You know, so they don't have the responsibility of having to monitor and ban conservatives and other assorted terrorists themselves. They can just make Zuckerberg's quiet admission (that I'm not sure if Reason even covered) official policy. "Hey, the FBI told us to!"

    1. mad.casual   3 years ago (edited)

      If we’re checking notes and re-re-re-re-reminding Reason about things, can we add to the notes that if various public school systems are banning Beloved in their elementary schools at the behest of the community, then Congress is rather unequivocally banning Conservative opinions and inconvenient facts, regardless of the various business decisions taking place in the back of Ali Baba’s Cave.

  8. Dillinger   3 years ago

    like just being terrorists on twitter or using twitter to terrorist?

    1. Vernon Depner   3 years ago

      There's no such thing as make-believe anymore. Whatever you say you are, you really are.

  9. SRG   3 years ago

    Thomas will back Gonzales and Taamneh because Constitutional protections only apply to social media platforms in place at the time the BoR came into effect. Or some such spurious or preposterous rationalisation.

    1. mad.casual   3 years ago

      "Electrons change the fundamental nature of speech in a way that one of our country's most accomplished black men just wouldn't understand." - SRG

  10. John F. Carr   3 years ago

    "even though Twitter forbids such use"

    Many tech companies "forbid" children under 13 knowing that they will lie about their age.

  11. Brian   3 years ago

    What a shock that the entity most responsible has deep pockets.

  12. Number 2   3 years ago (edited)

    Wait, am reading this correctly? The terrorist who killed the plaintiffs’ relatives did not use Twitter, but because other terrorists not involved in that killing did use Twitter, Twitter is somehow liable for what the first terrorist did? Whatever happened to the concept of causation?

    1. mad.casual   3 years ago

      Whatever happened to the concept of causation?

      Section 230. Big Tech is not culpable for speech on their platforms based on actual crime or involvement or any action or even lack of action on their part. They're in charge of being good faith Samaritans censoring offensive material on the internet because Congress deigned it so.

      1. Number 2   3 years ago

        Oh! So the theory is, “being exempted from liability for taking action implicitly makes you liable for failing to take action, even if the harm I suffer has no causal relationship to your failure to take action?”

        That’s certainly creative.

  13. USA_Jew   3 years ago

    I hope Twitter and Google aren't held liable for the terrorist recruiting, but their cases are hurt by all the times they banned other content.


