Social Media

Why Did YouTube Remove This Reason Video?

Our coverage of biohackers working on a DIY vaccine last year was solid reporting on an important subject. If YouTube insists on banning journalism like this, what's next?


On Monday, YouTube sent Reason an automated takedown notification for a March 13, 2020, video titled "Biohackers Are on a Secret Hunt for the Coronavirus Vaccine." The message said our video violated the company's spam, deceptive practices, and scams policy.

YouTube denied Reason's appeal, informing us that the video violates the company's "medical misinformation policy."

Did this 16-month-old video really promote "medical misinformation"?

Speaking as the journalist who produced it: absolutely not. While YouTube, as a private company, is within its rights to decide what to carry, the decision to remove this video illustrates a disturbing, censorial trend that has accelerated in the age of COVID.

For years, I have covered biotechnology's potential to improve our well-being and to liberate us from the constraints of our own biology. My reports have explored everything from lab-grown meat that could profoundly improve the global food supply to mail-order CRISPR kits that hand the keys to the genome to any biohacker bold enough to grab them.

The latter project introduced me to the world of biohackers: a grassroots movement of professional scientists, students, and hobbyists who contend that cutting-edge biology can occur outside large government and corporate laboratories.

"Are people dying and suffering needlessly because of all these committees and all these rules?" Josiah Zayner, founder of the biotech company The Odin, explained to Reason in 2016. "And what if people just say, 'Fuck you, I'm going to do it anyway'? And what if people start getting cured?"

My interest in this defiant subculture led me to the story of a rogue biologist raising money to create a "knock-off" version of one of the various mRNA or DNA coronavirus vaccines in development in early March 2020. I contacted this individual, whose credentials and initial seed funding I was able to verify, and Reason agreed to his condition of anonymity because of his fear of retribution from the Food and Drug Administration (FDA).

"If someone is trying to develop and distribute an unapproved medicine, [the FDA] will come down hard, and they have," he told Reason in March 2020. "It's a severe risk to our livelihoods outside of this project if we were to be deanonymized."

The result was my video, which points out that synthetic DNA and mRNA vaccines were, at the time, brand new territory—technologies that most Americans had never heard of, much less considered a possible solution to a pandemic. As I reported, my biohacker source "acknowledges that most people won't be willing to inject a non-FDA-approved vaccine but believes that if the pandemic gets bad enough they could fill a gap between the time the government approved an official vaccine and the time that vaccine is shown in trials to be relatively safe and able to generate antibodies in the blood."

As it turned out, DIY vaccines haven't played a major role in conquering the pandemic, in large part because of the remarkable speed with which the official vaccines were developed and deployed. But the FDA's emergency authorization for those vaccines doesn't suddenly transform my video into "medical misinformation." It remains a snapshot of an early longshot effort to prepare for a frightening and uncertain pandemic, and it continues to raise questions about whether the future of science needs to be as centrally managed as it has been for the past century. It's a factual report, not misinformation.

Incidentally, Zayner has publicly demonstrated that it may indeed be possible to create a generic COVID-19 vaccine outside of a corporate laboratory, having published results showing elevated levels of COVID-19 antibodies after injecting himself with a DIY vaccine.

YouTube probably flagged Reason's video as part of its effort to combat misinformation about the COVID-19 vaccines, which is indeed rampant. This exemplifies the problem with broad, automated content moderation: It has a chilling effect, imposing a de facto prohibition on legitimate discussion of certain topics. Stanford technologist Daphne Keller has documented this chilling effect in other realms, such as the documentation of war atrocities, calling the imprecision of mass automated moderation the "dolphin in the net" effect.

Biohackers like Zayner have been complaining about the problem for a long time. He has seen his company's products removed from Facebook and his livestreams from YouTube. This is bad for free inquiry and for the future of science, as are efforts to suppress discussion of any COVID-19 prevention or treatment besides those explicitly authorized by the FDA.

Meanwhile, misinformation hasn't just emanated from dark corners of the internet. It has traveled from the top down, from politicians—starting with the president—and public health officials, such as Anthony Fauci, who has said that his early stance against face masks wasn't really about their efficacy but about preserving supply for health care workers. Major media outlets preemptively labeled the hypothesis—currently being investigated by the federal government—that COVID-19 escaped from a virology laboratory in Wuhan a debunked conspiracy theory. The World Health Organization initially denied the efficacy of masks, and it became clear early on that it might be privileging its leadership's relationship with the Chinese government over scientific concerns. Yet YouTube CEO Susan Wojcicki told CNN in April 2020 that "anything that goes against [World Health Organization] recommendations would be a violation of our policy."

A major lesson of the pandemic should be that there's no guarantee political leaders and public health officials are going to deliver us accurate facts. We badly need skepticism and an unfettered marketplace of ideas to challenge conventional wisdom, now more than ever.

It remains essential to defend YouTube's right to make poorly reasoned and executed content moderation decisions; any government regulation of speech on social media is likely to backfire and hamper the free exchange of ideas. But it's also essential to recognize and critique censorious overreach if we want the market to respond to such errors. And a healthy market response is exactly what we need when the boundaries of acceptable discourse are being hemmed in by large companies susceptible to political pressure.