Yesterday representatives of Facebook, Google, and Twitter testified before a Senate subcommittee about online "Russian disinformation," sounding a note of alarm that echoed legislators' concerns and thereby grossly exaggerated the threat. "When it comes to the 2016 election," said Facebook General Counsel Colin Stretch, "the foreign interference we saw is reprehensible and outrageous and opened a new battleground for our company, our industry, and our society. That foreign actors, hiding behind fake accounts, abused our platform and other internet services to try to sow division and discord—and to try to undermine our election process—is an assault on democracy, and it violates all of our values."
The idea that Russian ads on Facebook, Russian tweets on Twitter, and Russian videos on YouTube "undermine our election process" and constitute "an assault on democracy" (let alone that such propaganda "violates all of our values") is hard to take seriously given what we know about the nature and scale of this operation. Social media platforms have every right to insist that users follow their terms of service, which in Facebook's case ban phony source descriptions (falsely identifying a Russian's posts as an American's, for example). But the expectation that Facebook, Twitter, and Google will police political discourse to minimize "Russian influence" is not just impractical but, if backed by the threat of legislation, contrary to the First Amendment.
It is important to keep in mind that we are not talking about direct interference with the election process (by hacking computers that tally votes, say). We are talking about efforts to persuade people—or, as seems to have been more common, reinforce their pre-existing opinions—through words and images. Although some of these messages can fairly be described as "disinformation" (such as a fake letter posted on Twitter supposedly documenting a $150 million contribution to Hillary Clinton's campaign by the conservative Bradley Foundation), some (such as reports about police shootings) were entirely accurate, while others were expressions of opinion on subjects such as racism, LGBT issues, immigration, and gun rights.
Except for the fact that the messages appear to have been sponsored by the Russian government, there was nothing especially sinister or insidious about them. Cases of broken English and awkward phrasing aside, they were indistinguishable from the mixture of facts, lies, and blather that constitutes good, old-fashioned, American-produced political discourse. So when The New York Times reports that Russian-sponsored "political ads and other content" (including videos from the government-sponsored news outlet RT) "were meant to sow discord or chaos," it is either being hysterical or ascribing absurdly unrealistic ambitions to the Russian government. Our social and political order is not one viral video or inflammatory tweet away from catastrophic collapse.
Russian participation in the online U.S. political debate looks even less scary when you consider how tiny its footprint seems to be. On Monday a Times headline proclaimed that "Russian Influence Reached 126 Million Through Facebook Alone," which sounds impressive unless you realize that an ad can "reach" people without being noticed or read, let alone changing anyone's mind. What the headline really meant, as the Times explained in its report on yesterday's hearing, is that "more than 126 million users potentially saw inflammatory political ads bought by a Kremlin-linked company, the Internet Research Agency." Yes, and since the post you are reading is available on the internet, it potentially could be seen by all 3.6 billion of the world's internet users.
No doubt Vladimir Putin would love to determine the outcome of presidential elections by spending $100,000 on Facebook ads. It is far less clear that he (or anyone else) has the power to do that, no matter how big the ad buy. As Brian Doherty noted here on Monday, the evidence suggests that politicians and campaign "reformers" greatly overestimate the effectiveness of political advertising.
According to Facebook, the ads bought by the Internet Research Agency represented "four-thousandths of one percent (0.004%) of content in News Feed, or approximately 1 out of 23,000 pieces of content." The Times concedes that "Russia-linked posts represented a minuscule amount of content compared with the billions of posts that flow through users' News Feeds every day." Between 2015 and 2017, the paper notes, "people in the United States saw more than 11 trillion posts from pages on Facebook."
The Russian contribution on other platforms looks similarly unimpressive. Twitter Acting General Counsel Sean Edgett testified that "the 1.4 million election-related Tweets that we identified through our retrospective review as generated by Russian-linked, automated accounts constituted less than three-quarters of a percent (0.74%) of the overall election-related Tweets on Twitter at the time." The Times admits that tweets by Russian operatives posing as Americans "may have added only modestly to the din of genuine American voices in the pre-election melee," and "many of the suspect posts were not widely shared."
Still, the paper insists, the tweets "helped fuel a fire of anger and suspicion in a polarized country." As Ed Krayewski noted a few weeks ago, the headline over another Times story breathlessly claimed that "Russia Harvested American Rage to Reshape U.S. Politics." Talk about disinformation.
Richard Salgado, Google's senior counsel on law enforcement and information security, testified that the company found 18 YouTube channels offering about 1,100 videos with political content that were "uploaded by individuals who we suspect are associated with this [Russian] effort." The videos, which totaled 43 hours on a platform where 400 hours of content are uploaded every minute and more than 1 billion hours are watched every day, "mostly had low view counts," with less than 3 percent attracting more than 5,000 views.
"While this is a relatively small amount of content," Salgado hastened to add, "we understand that any misuse of our platforms for this purpose is a serious challenge to the integrity of our democracy." I assume Salgado was trying to placate senators outraged (or pretending to be outraged) by "Russian interference in our electoral process." But if our democracy cannot survive another 43 hours of political videos on YouTube, it is already doomed.