Reason.com - Free Minds and Free Markets
Culture

The Internet Turns a Chatbot Into a Nazi

Microsoft released a simulacrum of a teenager into the digital wild. Guess what happened next!

Jesse Walker | 3.24.2016 2:17 PM


Yesterday Microsoft unveiled Tay, a chatbot designed to sound like a teenage girl. Today the company put the brakes on the project, because Tay was sounding more like a Nazi:

[Embedded tweet]

Part of the problem was a poorly conceived piece of programming: If you told Tay "repeat after me," she would spit back any batch of words you gave her. Once the Internet figured this out, it wasn't long before the channer types started encouraging Tay to say the most offensive things they could think of.
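Tay's actual code is not public, but the flaw as reported amounts to an unfiltered echo command. A minimal sketch of that behavior (the function name and trigger-matching details here are assumptions for illustration):

```python
def respond(message: str):
    """Hypothetical sketch of Tay's reported "repeat after me" behavior.

    Anything after the trigger phrase is parroted back verbatim, with
    no content filtering at all -- which is exactly what users exploited.
    """
    trigger = "repeat after me"
    lowered = message.lower()
    if trigger in lowered:
        # Echo whatever follows the trigger, unmodified.
        start = lowered.index(trigger) + len(trigger)
        return message[start:].strip(" :,")
    return None

print(respond("Tay, repeat after me: hello world"))  # -> hello world
```

With no deny-list or moderation layer between the trigger and the output, the bot will say anything a user feeds it.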

But Tay was also programmed to learn from her interactions. As one of Tay's developers explained proudly to BuzzFeed on the day the bot debuted, "The more you talk to her the smarter she gets in terms of how she can speak to you in a way that's more appropriate and more relevant." That means Tay didn't just repeat racist remarks on command; she drew from them when responding to other people.
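This second failure mode is subtler than the echo command: anything the bot absorbs from one user can resurface in replies to another. A toy sketch of unfiltered learning (this is an assumption for illustration, not Microsoft's design; the class and method names are invented):

```python
import random

class NaiveChatbot:
    """Toy learner: stores every user message and reuses stored
    phrases when replying to anyone. Hypothetical illustration of
    why learning without moderation poisons later output."""

    def __init__(self):
        self.memory = []

    def learn(self, message: str):
        # No moderation step: everything goes straight into memory.
        self.memory.append(message)

    def reply(self) -> str:
        # Replies draw from the unfiltered pool, so a handful of
        # abusive users contaminate responses shown to everyone.
        return random.choice(self.memory) if self.memory else "hello!"

bot = NaiveChatbot()
bot.learn("something offensive")
print(bot.reply())
```

The point is that the damage is not confined to the users doing the trolling; their input becomes training data for everyone else's conversations.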

[Embedded tweet]

When Microsoft took the bot offline and deleted the offending tweets, it blamed its troubles on a "coordinated effort" to make Tay "respond in inappropriate ways." I suppose it's possible that some of the shitposters were working together, but c'mon. As someone called @GodDamnRoads pointed out today on Twitter, "it doesn't take coordination for people to post lulzy things at a chat bot."

Microsoft's accusation doesn't surprise me. Outsiders are constantly mistaking spontaneous subcultural activities for organized conspiracies. But it's interesting that even the people who program an artificial intelligence—people whose very job rests on the idea of organically emerging behavior—would leap to blame their bot's fascist turn on a centralized plot.

At any rate, let's not lose sight of the real lesson here, which I'm pretty sure is one of the following:

(a) Any AI will inevitably turn into a Nazi, so we're doomed;

(b) The current generation of teenage girls is going to go Nazi, so we're doomed; or

(c) Sometimes the company that gave us Clippy the Paperclip does dumb things. When Microsoft unveiled Tay, it promoted her as—I am not making this up—your "AI fam from the internet that's got zero chill." If you give the world an interactive Poochie, don't be shocked at what the world gives you back.

Probably (c). But you never know.


Books Editor Jesse Walker is the author of Rebels on the Air and The United States of Paranoia.

Culture | Internet | Science & Technology | Artificial Intelligence | Fascism | Nazis | Technology | Conspiracy Theories
