
Australia's Highest Court Holds Media Outlets Financially Liable for Trolls and Shitposters

Here’s why Section 230 is so important.

Scott Shackford | 9.9.2021 12:10 PM

(Photo: Sedovukr / Dreamstime.com)

Australia's highest court has upheld a controversial and potentially destructive ruling that media outlets are legally liable for defamatory statements posted by online commenters on Facebook, a decision that could result in massive amounts of online censorship out of fear of lawsuits.

The case revolves around a 2016 television program on Australia's ABC TV (no relation to America's ABC network) about the mistreatment of youths in Australia's jail system. Footage of Dylan Voller in a restraining chair was part of the coverage. When media outlets covered the program and posted links to that coverage on Facebook, users left comments about Voller, prompting him to sue the media outlets. Voller claimed the comments were defamatory and argued that the outlets themselves were responsible for publishing them.

The media outlets countered that, no, they were not the publishers of third-party comments on Facebook and were not responsible for what those commenters said. The outlets have been appealing to the courts to toss out the lawsuits, and they've been losing.

Reason first reported on this case in 2020 as it worked its way up through the courts, and Wednesday's decision from the High Court of Australia is the final stop: The country's top justices determined that media outlets are, indeed, publishers of the comments that users post on Facebook under the stories the outlets link to.

The logic here is absolutely terrible and destructive. Facebook has control over the tools for managing comments on media pages. The media outlets themselves do not, and they can't "turn off" commenting on their Facebook pages. They do have the power to delete comments after the fact or use filtering tools that target keywords (to stop people from making profane or obscene comments) and can block individual users from the page.

Using these tools to try to prevent defamatory comments requires constant monitoring of the media outlet's Facebook page and would demand that moderators be so agile as to remove potentially defamatory content the moment it appears, before anybody else could see it. Nevertheless, the justices concluded that this is enough control over the comments for media outlets to be considered publishers. Two of the justices were very blunt that simply participating on Facebook made Fairfax Media Publications a publisher of the comments:

In sum, each appellant intentionally took a platform provided by another entity, Facebook, created and administered a public Facebook page, and posted content on that page. The creation of the public Facebook page, and the posting of content on that page, encouraged and facilitated publication of comments from third parties. The appellants were thereby publishers of the third-party comments.

Except, of course, that the extent to which media outlets may monitor or control third-party comments is completely in the hands of Facebook. Facebook decides how much a media outlet can block comments or users, and whether people can comment at all.

Not all of the judges agreed. Justice James Edelman dissented from the other judges and noted the potentially dire consequences of this decision:

Merely allowing third-party access to one's Facebook page is, of itself, insufficient to justify a factual conclusion that the Facebook page owner participated in the publication of all the third-party comments posted thereafter. Were it not so, all Facebook page owners, whether public or private, would be publishers of third-party comments posted on their Facebook pages, even those which were unwanted, unsolicited and entirely unpredicted. Indeed, it might extend to cases where a Facebook page is hacked and then has posted on it entirely unwelcome, uninvited and vile defamatory comments, whether by the hacker or in response to a post made by the hacker. It might also render Facebook itself, at common law, the publisher of all posts made on Facebook.

It is easy to assume, as these other justices apparently have, that such a decision could not possibly cause a disastrous amount of online censorship because media outlets should know when a controversial story might lead to defamatory comments. The judges actually note this in the ruling. They seem to think that this is only an issue with certain types of stories and that the appearance of defamatory comments can be predicted in advance.

This is complete rubbish, and anybody with any experience on social media already knows it. Trolls, scammers, and spammers range far and wide (that's the point of them), and it's incredibly naive to think that a story with no controversial elements can't end up with third parties posting defamatory nonsense under it.

Edelman has the right of it here, and it's why Section 230 of the U.S. Communications Decency Act, which generally protects websites and social media platforms (and you) from liability for comments published by others, is so important. It's not just to protect media outlets from being held liable for comments from trolls. It's to allow social media participation to even happen at all. Some large media outlets or companies might be able to afford around-the-clock moderation to attempt to catch problems. But even if they could, let's be clear that they're going to avoid as much risk as possible and delete any comment that has a whiff of controversy. Why would they allow it to stand if it could get them sued?

But smaller companies and outlets—and there's no reason to think this ruling applies only to media outlets—will either have to hope Facebook gives them better tools to control who posts on their page or just not have social media presences at all.

It's a terrible ruling. The only positive here is that the courts have not yet ruled on whether the contested comments are actually defamatory. Subsequent rulings may clear Fairfax. But if courts decide that these random comments were defamatory, the downstream consequences will be disastrous for free speech. Expect to see commenting start to disappear across media platforms.



Scott Shackford is a policy research editor at Reason Foundation.
