Muscular Content Moderation Is a Really Bad Idea That's Impossible To Implement Well

Be afraid as more journalists and politicians start calling for stronger policing of online speech.

The calls for more and tougher social media content moderation—including lifetime bans against malefactors—keep getting louder and louder, especially from politicians. Here's a new story from The Atlantic's Alexis C. Madrigal, who recounts how social-media providers increasingly began to call their services platforms to avoid responsibility for the content that users post while profiting off hate speech and other forms of reprehensible expression.

That's a very debatable history of social media, but let's leave that aside for the time being. Madrigal's conclusion is a good summary of what is fast becoming the new conventional wisdom among the smart set, including legislators. Put simply, the new consensus holds that no social media company should ever make money or grow its reach by allowing speech or content that we all agree is patently false, dangerous, or offensive.

Facebook drew on that sense of being "just a platform" after conservatives challenged what they saw as the company's liberal bias in mid-2016. Zuckerberg began to use—at least in public—the line that Facebook was "a platform for all ideas."

But that prompted many people to ask: What about awful, hateful ideas? Why, exactly, should Facebook host them, algorithmically serve them up, or lead users to groups filled with them?

These companies are continuing to make their platform arguments, but every day brings more conflicts that they seem unprepared to resolve. The platform defense used to shut down the why questions: Why should YouTube host conspiracy content? Why should Facebook host provably false information? Facebook, YouTube, and their kin keep trying to answer, We're platforms! But activists and legislators are now saying, So what? "I think they have proven—by not taking down something they know is false—that they were willing enablers of the Russian interference in our election," Nancy Pelosi said in the wake of the altered-video fracas. [Facebook refused to take down a video edited to make the Speaker of the House appear drunk.]

The corollary to maximalist content moderation is that it will be easy and simple to implement because, well, aren't these online types all super-geniuses? But implementation is proving to be extremely difficult and stupid, for all sorts of reasons and with all sorts of unintentionally comic results. Just yesterday, for instance, Twitter suspended the account of David Neiwert, a progressive journalist and the author of Alt-America: The Rise of the Radical Right in the Age of Trump, a critical book on the topic. His crime? Festooning his Twitter homepage with his book's cover image, which includes "Ku Klux Klan hoods atop a series of stars from the American flag." Better yet, Twitter froze the parody account The Tweet of God over one of its messages.

Twitter said the tweet violated its rules against "hateful conduct."

The account has since been restored, and it's worth noting that God is keeping the jokes coming.

Similar issues inevitably arise whenever social media platforms (that word!) try to create more robust versions of content moderation that inevitably rely on automated processes that can't detect intent, much less irony or humor. Last week, for instance, YouTube announced that it was changing its terms of service and would now ban "videos alleging that a group is superior in order to justify discrimination, segregation or exclusion." As Robby Soave noted, among the first batch of people blocked by YouTube was former Reason intern Ford Fischer, whose channel News2Share documents and exposes extremism.

This way madness lies. But it's not simply an implementation problem. While the doctrinaire libertarian in me is quick to point out that Facebook, Twitter, et al. have every legal right to run their platforms however they want to, the free-speech enthusiast in me is equally quick to say not so fast. Contra The Atlantic's Madrigal, it's not always easy to define "awful, hateful ideas" or to agree on which videos present insane conspiracies versus unconventional ideas. I suspect very few people really want Rep. Nancy Pelosi (D–Calif.)—or President Donald Trump, or any politician—to decide what is false and thus censorable.

But we shouldn't be so quick to allow private companies to take on that role either.

Is Facebook really a better place now that Alex Jones and Louis Farrakhan are gone? Probably, but given all the gray areas that come with defining who and what is ban-worthy (should, say, radical diet groups get bounced?), the superior solution is minimal content moderation, restricted to true threats and clear criminal behavior such as fraud, and improved tools that allow users to tailor what they see and experience. Certainly, Facebook is better for leaving up the altered Nancy Pelosi video while calling attention to its doctored state and allowing a "deepfake" video of Mark Zuckerberg to remain on the platform (that word again).

We are plainly asking too much of social media right now, even as we are blaming it for problems that long predate its rise. Rather than trying to force it to respond to all our contradictory demands, it's far better for us collectively and individually to develop new forms of media literacy that allow us to separate the wheat from the chaff. Individual definitions will vary and many heated conversations will follow, which is exactly as it should be in a complex world that resists even our best-faith attempts to regiment it to our most-deeply held beliefs.

Editor's Note: We invite comments and request that they be civil and on-topic. We do not moderate or assume any responsibility for comments, which are owned by the readers who post them. Comments do not represent the views of Reason.com or Reason Foundation. We reserve the right to delete any comment for any reason at any time. Report abuses.

  3. not always easy to define “awful, hateful ideas,”

    It is always easy, clear, and simple.
    If it advocates individual freedom, it is good.
    If it advocates government control of individual action, it is bad.
    Government control of social media is bad.
    Q.E.D.

    1. Those are examples of good and bad things, not definitions.

      There are good and bad things outside of the realm of government action vs. individual freedom.

      I do agree with your argument and conclusion, though.

      1. Everyone knows that sometimes the government does need to “control individual action,” especially when it involves so-called internet “speech” that could damage a reputation. Here at NYU we had to call in the police, and then work very closely with prosecutors, to put a stop to Gmail “parodies” that were disrupting our tranquility. Unfortunately, since the laws were a bit hazy, we were forced to spend nine years litigating the charges, and many of them were declared “unconstitutional” on a variety of foolish grounds, but at least we got some of them to stick, and we certainly succeeded in reasserting our authority and control, as well as basic discipline here on campus. Hopefully these results will help Mr. Nunes with his lawsuit, although some of us here feel it would have been better if he could have found a legal pretext to press criminal charges the way we did. At any rate, more government regulation is certainly called for, just as so many journalists have been arguing. See the documentation of our nation’s leading criminal “satire” case at:

        https://raphaelgolbtrial.wordpress.com/

        1. At one time, I thought you were sarcastic. I’ve since learned you’re serious.
          Fuck off, slaver.

          1. It’s good to see there are those who understand how serious this matter really is. We don’t encourage “sarcasm” here at NYU, as it can easily lead to reprehensible mockery and other inappropriate forms of “speech,” including even illegal “parody.” We have enough prurient behavior as it is in our classrooms, without having to deal with conduct that requires us to seek assistance from law enforcement authorities.

  4. “it’s not always easy to define ‘awful, hateful ideas’”

    One man’s awful, hateful ideas is another man’s comedy.

    1. …and another man’s political party platform.

    2. And in many cases, one man’s awful, hateful ideas are the same man’s comedy.

  5. Those _______ censors can go and ____ my ____ and then ____ themselves right up the ___. I’m tired of these _____________ telling me what I can and cannot say.

    1. That makes me think of the Family Guy episode.

      It’s the Bleep Van Bleep show starring Bleep Van Bleep.

      1. Conan’s unnecessary censorship skits were always hilarious too. I may watch some on the YouTube platform tonight.

    2. That makes me think of the Family Guy episode.

      It’s the Bleep Van Bleep show starring Bleep Van Bleep.

    3. I love mad libs!
      adjective: fat
      verb: run
      noun: vegetable
      verb: kiss
      noun: box
      noun (plural): burgers
      “Those fat censors can go and run my vegetable and then kiss themselves up the box. I’m tired of these burgers telling me what I can and cannot say.”

      1. Reason comments madlibs is a fantastic idea, so tremendous, really great.

  6. How ironic that one of the current great opponents of a robust freedom of the press is the legacy press, because the proliferation of outlets for people outside of the credentialed press to exercise their freedom of expression means that the legacy media cannot easily act as gatekeepers over the content of information accessible to the public. The rabble are choosing to pay attention to the wrong things and interpreting them in ways they are not supposed to, and journalists, as a class, are out of sorts about how to get things back to how they were.

  7. The calls for more and tougher social media content moderation—including lifetime bans against malefactors—keep getting louder and louder, especially from politicians.

    From the links in this article to what my own lying eyes are telling me, it seems to be getting louder from the media, and politicians are just ticking up their rhetoric in response to the full-court press the… er, press is currently engaging in:

    Beyond the Maza-Crowder dispute, just in the past week, researchers demonstrated that YouTube’s algorithm seemed to lead users to sexualized videos of children, and The New York Times ran a front-page story about how a young man was radicalized by right-wing videos on the site.

    1. Didn’t the “young man radicalized by YouTube videos” turn out to be a pretty standard conservative?

      1. I haven’t seen the report, but I have seen reports of the reports, and the way the reports of the reports reported the report, it made me highly skeptical of the original report.

      2. Yes. Reason writers proved they were pretty gullible with that one.

  8. Google CEO: YouTube Will Begin Targeting ‘Content Which Doesn’t Exactly Violate Policies’
    Google, said Pichai, “rank[s] content based on quality.” They plan to apply that same approach to YouTube. “And so we are bringing that same notion and approach to YouTube so that we can rank higher quality stuff better and really prevent borderline content,” he said.

    Pichai then offered a definition of what he means by “borderline content.” “Content which doesn’t exactly violate policies, which need to be removed, but which can still cause harm,” he said, in language echoing YouTube’s statement to Crowder days later.

  9. try to create more robust versions of content moderation that inevitably rely on automated processes that can’t detect intent, much less irony or humor.

    And yet the singularity is JUST around the corner!

    1. Of course even I admit that we ARE being controlled by our robot overlords, our overlords just happen to have all the cognitive ability of a toaster oven. And the results are about what you’d expect.

      1. our overlords just happen to have all the cognitive ability of a toaster oven.

        They keep complaining that this place is getting hotter?

    2. Everything is just around the corner on a universal timescale.

      Except heat death. That’s still really far off.

  10. “videos alleging that a group is superior in order to justify discrimination, segregation or exclusion.” As Robby Soave noted, among the first batch of people blocked by YouTube was former Reason intern Ford Fischer, whose channel News2Share documents and exposes extremism.

    While this is important to point out, repeatedly crying that “they’re catching the good guys too” isn’t always helpful. The problem we have now is that mundane content, or content that’s critical of certain ideas, is now considered “extremism” by the mainstream when it isn’t. It’s not about making mistakes and catching “the wrong people”; it’s that the “wrong people” get defined as such by a bunch of intersectional twinks at Buzzfeed. That’s the problem.

    1. A little of column A, a little of column B?

  11. Is Facebook really a better place now that Alex Jones and Louis Farrakhan are gone? Probably, but given all the gray areas that come with defining who and what is ban-worthy (should, say, radical diet groups get bounced?),

    No, they’re worse places, demonstrably so, which is why they’re already starting to bleed users and competitors are starting to pop up, despite the media attacking those platforms as hate platforms. I’m guessing Ford Fischer doesn’t think YouTube is a better place after he got demonetized.

  12. “Muscular Content Moderation” is nothing more than censorship which is a good idea if we are ever going to achieve our coveted goal of a socialist slave state.

    1. I thought it was a dogwhistle for Tony

  13. I’m starting to learn how incestuous the whole social media silicon valley “community” is. Everyone is either invested with everyone else or their company owns part of the other company or the CFO of one is married to the COO of the other, etc. This could easily explain why they all seem to do the same things at the same time.

  14. “Last week, for instance, YouTube announced that it was changing its terms of service and would now ban “videos alleging that a group is superior in order to justify discrimination, segregation or exclusion.””

    Marxism uses the asserted moral and revolutionary superiority of the working class to justify discrimination against, and segregation and exclusion of, the bourgeoisie. Can we expect that YouTube will now delete all socialist, communist, and all other variety of Marxist videos?

  15. I have never been to Facebook, instagram, or twitter.

    I use YouTube often for stuff I like you can’t get on TV: music, classic comedy, or other entertainment.

    Found some great Abbott and Costello skits there the other night.

    I’ll show ya funny.

    https://m.youtube.com/watch?v=9o1SAS8KyMs

    The rest of y’all figure this out. Just don’t take this away.

    Oh and I love tech. Really do and you young idiots spoil it I will not forgive you. The takeaway is I can sit with my pad thing and watch something from the past before my aging ass was born almost free.

    Watch whatever you want. Talk with whomever you want. Say whatever you want. You have no idea how much freedom that is. Do not fuck this up.

  16. So YouTube is fine if I go around arguing some group is inferior as long as I don’t think they should face discrimination, segregation, or exclusion? Nifty.

  17. Meetup is a social media website that handles monitoring effectively by making people pay a fee if they want the ability to start a group. This ensures that the people in charge of each group are more committed to maintaining it well. It also allows Meetup to run without advertisements. All the problems of Facebook are missing at Meetup. People who complain about Facebook can just switch to Meetup if they don’t mind paying for leadership roles and interacting with people who make a point of leaving the house on occasion.

  18. It would be difficult to implement well even if FB, Twitter, Google, and so forth INTENDED to implement it well.

    As it is, all they intend is political censorship, so implementing it well isn’t even a goal here.

  20. I’m surprised how little many advocates of censorship have thought about it.

    One seeming goal, of some of them, is to present barriers to entry.

    It’s worth thinking about ways to avoid this silliness. Maybe form our own discussion groups, robust like the original design of the ARPA/Internet.
