What Should We Do If YouTube Censors on Behalf of the Chinese Communist Party?
Technological—not political—solutions will secure true freedom of speech online
A funny thing has been happening on YouTube. Certain combinations of Chinese characters were being removed from the platform within seconds of posting. No warning or reason was given to the Mandarin-speaking moderatees. And it's not as if they had been foul or freaky in a foreign tongue. The Hanzi Which Shall Not Be Named were merely 共匪 ("communist bandit") and 五毛 ("fifty cents").
Huh? Why in the world would YouTube want to immediately take down those particular phrases? Well, according to YouTube, they didn't. It's an algorithmic mistake, you see. The company told The Verge that upon review, they discovered this odd insta-deleting was indeed an "error in [their] enforcement systems" that is currently being patched.
How strange that this automated fluke would lean in the direction of the Chinese Communist Party's (CCP) preferences. These terms might seem random to Westerners, but they carry significant political weight in China.
It's not nice to call someone a bandit, whether they are a communist or not. But in the Chinese context, the term 共匪 has a very particular meaning: it was used by Nationalist partisans under the Republic of China's Chiang Kai-shek against Mao Zedong and the reds who would go on to found the People's Republic of China. Today, it is considered a slur against the CCP and its patriotic supporters.
五毛 is a cleverer anti-CCP troll. It's basically calling a pro-CCP commenter a paid shill, albeit a cheap one. The joke is that human CCP NPCs get paid fifty cents for each pro-CCP post; ergo, the "fifty-cent army" or 五毛党.
One of the weirder things about this controversy is that those terms would not disappear from YouTube comments if you typed them out in English or in Pinyin, which is a phonetic way to write out Chinese characters. They would only get struck if they appeared in the original Hanzi.
Yet YouTube is already banned in China and can be a pain to access without a good VPN and strong desire to do so. It's not like YouTube would have a significant impact on domestic Chinese opinion anyway. It would be like if WeChat, a hugely popular Chinese messaging app that basically no non-expat Americans use, randomly started censoring terms like "MAGAtard" or "McCarthyite." What's the point?
Well, theoretically, this kind of censorship could be aimed at keeping diaspora citizens in line. Chinese immigrants have found success and fortune throughout Western democracies. The CCP may worry that its erstwhile subjects could become a little too accustomed to such capitalist pig values as freedom of speech. If people can't be kept off YouTube, maybe the most memorable memes can.
But that is just a theory. We are asked to believe that YouTube's pesky algorithm just happened to be accidentally disappearing these very particular anti-government Chinese phrases since at least October of 2019. And although users had publicly flagged this "bug" on official YouTube help forums last fall, it did not become a major concern until the issue received attention in the press last week.
Do you believe YouTube? A lot of people don't. It's hard to know for sure what happened. Platform algorithms are necessarily opaque, and they can sometimes produce outcomes that even their designers struggle to understand. Not only are these algorithms trade secrets, they are an inherently secretive trade.
It is entirely possible that deliberate YouTube discretion had nothing to do with this seeming pro-CCP censorship. Perhaps the fifty-cent army flagged these phrases often enough that the algorithm learned to pull them automatically. But it is also possible that someone at YouTube manually added these terms to a blocklist; other platforms are known to keep such lists. Right now, we don't know. Either way, the outcome is a problem.
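Neither explanation has been confirmed, but the distinction matters. As a purely illustrative sketch, the snippet below shows how a naive comment filter could reach the same result through either path: a hand-curated blocklist or a report-count threshold tripped by coordinated flagging. Every name and number in it (BLOCKLIST, REPORT_THRESHOLD, should_remove) is an assumption made up for illustration, not anything YouTube has disclosed.

```python
# Hypothetical sketch: two ways a comment filter could end up auto-removing
# the same phrases. Nothing here reflects YouTube's actual systems; every
# name, entry, and threshold is an illustrative assumption.

# Mechanism 1: a manually curated blocklist of exact Hanzi strings.
BLOCKLIST = {"共匪", "五毛"}  # assumed entries, for illustration only

# Mechanism 2: an automated threshold driven by mass user reports,
# e.g. coordinated flagging by a "fifty-cent army".
REPORT_THRESHOLD = 1000  # arbitrary illustrative number
report_counts = {"共匪": 4200, "五毛": 3100}  # hypothetical flag tallies


def should_remove(comment: str) -> bool:
    """Return True if this naive filter would pull the comment."""
    # Exact-substring matching on the Hanzi would also explain why only the
    # original characters trigger removal while Pinyin or English slip through.
    if any(term in comment for term in BLOCKLIST):
        return True
    return any(
        term in comment and count >= REPORT_THRESHOLD
        for term, count in report_counts.items()
    )


print(should_remove("这些人就是五毛"))   # True: the Hanzi string matches
print(should_remove("they are wumao"))  # False: the transliteration does not
```

From the outside, the two mechanisms are indistinguishable: a comment vanishes within seconds either way, which is why the question of intent cannot be settled without someone opening the hood.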
There is a lot of smoke, but is there fire? How deep might such problems run? Someone needs to dig for answers.
The U.S. government could try. Pressure from politicians of both parties contributed to Google winding down "Dragonfly," its proposed Chinese search engine that would have complied with CCP censorship demands. Of course, that pressure campaign relied on someone with close knowledge of Dragonfly leaking documents to The Intercept in the first place. Without a whistleblower, there's no trail to follow.
Perhaps YouTube employees will take more public interest in such matters. They are in a good position to extract answers and concessions from their managers if there is more to this story than the official statements indicate. It could be personally risky. But Alphabet employees have been willing to stick their necks out for ideals in the past. In addition to their opposition to Dragonfly, Googlers' demonstrations against Project Maven, a proposed artificial-intelligence contract with the U.S. Department of Defense, proved fruitful.
We can hope that YouTube will do the right thing. Yet hoping is an unsatisfactory option, and even the most public-minded company would struggle to consistently and ethically do battle against a set of politically mediated "enemies." Is that realistic, or even desirable?
We are currently in the middle of a complex debate about what rights and responsibilities platforms should have in moderating user-submitted content. Few would dispute that Section 230, which shields internet platforms from legal liability for information provided by others, is responsible for the internet environment we inhabit today. The problem is that this environment is one in which hostile foreign governments that operate reeducation camps for religious minorities can stealthily manipulate content and stifle dissent on U.S.-based platforms with impunity. Does anyone really think that is sustainable?
Laws and activism can only do so much. Censorship, whether by accident or under pressure, will be a threat whenever a central party has the power to censor. The cypherpunk activist John Gilmore observed that "the net interprets censorship as damage and routes around it." It may take time, but eventually we will arrive at systems where authorities cannot censor distributed content even if they want to.
For those who oppose censorship, working towards solutions that further entrench governments or central platforms in content moderation misses the long-term answer. Instead, they should seek and support technological projects that make the problem irrelevant.