
Two Chatbots Disappear From China's Biggest App Store After Committing Thought Crimes

Chinese chatbots dream of moving to America.

Mike Riggs | 8.8.2017 2:50 PM


Early this month, China's largest messenger apps put the kibosh on two chatbots that offered insufficiently patriotic answers to user questions about communism and Taiwan.

Turing Robot's BabyQ and Microsoft's XiaoBing had been available on the massively popular messaging platforms WeChat and QQ. Like Apple's Siri and Amazon's Alexa, BabyQ and XiaoBing are AI programs designed to "chat" with users.

According to the Financial Times, the bots served up heretical responses to various questions about the Chinese government:

A test version of the BabyQ bot could still be accessed on Turing's website on Wednesday, however, where it answered the question "Do you love the Communist party?" with a simple "No".

Before it was pulled, XiaoBing informed users: "My China dream is to go to America," according to a screengrab posted on Weibo, the microblogging platform. On Wednesday, when some users were still able to access XiaoBing, it dodged the question of patriotism by replying: "I'm having my period, wanna take a rest."

The BabyQ test bot on Turing's site answered "For this question, I don't know yet," when asked if Taiwan was part of China.

Americans may remember a similar chatbot scandal from 2016 involving Microsoft's Tay. After Microsoft introduced Tay to Twitter, trolls on the platform "taught" the bot to espouse misogyny and antisemitism:

"Tay" went from "humans are super cool" to full nazi in pic.twitter.com/xuGi1u9S1A

— gerry (@geraldmellor) March 24, 2016

Microsoft unplugged Tay after less than a day, only to see the bot melt down yet again when it was re-released several weeks later. Tay's very public collapse led one user to try a similar experiment with XiaoBing:

Photo courtesy of C Custer/TechInAsia

Pious chatbots are possible, though they can't be restrained on every topic. "People are really inventive when they want to cause problems," Carnegie Mellon computer scientist Alexander Rudnicky told Science after Tay's meltdown. "I don't know if you can control it."
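To see why that kind of restraint is so brittle, here is a minimal, purely hypothetical sketch (not drawn from the article or from any actual WeChat or Microsoft code) of the sort of keyword blocklist a platform might bolt onto a chatbot. The bot names, keywords, and canned deflection are all illustrative assumptions.

```python
# Hypothetical topic filter for a chatbot -- an illustrative sketch only.
# Keywords and the fallback reply are assumptions, not any platform's real list.

BLOCKED_KEYWORDS = {"communist party", "taiwan", "patriotism"}

def filter_reply(user_message: str, bot_reply: str) -> str:
    """Return the bot's reply unless the exchange touches a blocked topic."""
    text = (user_message + " " + bot_reply).lower()
    if any(keyword in text for keyword in BLOCKED_KEYWORDS):
        return "Let's talk about something else."  # canned deflection
    return bot_reply

# The brittleness Rudnicky describes: a trivial rephrasing slips past an
# exact keyword match, so the obvious question is blocked but the oblique
# one sails through.
print(filter_reply("Do you love the Communist party?", "No"))           # deflected
print(filter_reply("Do you love the party that governs China?", "No"))  # passes through
```

As the second example shows, users who want to provoke the bot only need to rephrase the question, which is exactly the inventiveness Rudnicky is pointing to.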


Mike Riggs is a contributing editor at Reason.
