The Washington Post Says Democracy Demands Less Freedom of Speech
The paper worries that "social media companies are receding from their role as watchdogs against political misinformation."
Donald Trump was back on X, the social media platform formerly known as Twitter, last night for the first time since he got the boot in 2021 following the riot by his supporters at the U.S. Capitol. Trump posted his mug shot, taken at Atlanta's jail this week when he was booked on the charges laid out in his Georgia indictment, which stem from his efforts to overturn the 2020 presidential election results in that state. He included a caption that described the indictment as "ELECTION INTERFERENCE" and urged his followers to "NEVER SURRENDER!"
After taking over the platform that was then known as Twitter last year, Elon Musk, an avowed "free speech absolutist," reinstated Trump's account. But this is the first time that Trump, who started a competing platform that is still known as Truth Social, has made use of Musk's permission. The Washington Post, in a news story published this morning, portrays Musk's decision and the attitude underlying it as part of a worrisome trend that threatens "democracy" by allowing "political misinformation" to proliferate on social media. The piece nicely illustrates the confusion, obfuscation, and hypocrisy that characterize mainstream press coverage of that subject.
As is typical of this journalistic genre, Post reporters Naomi Nix and Sarah Ellison never address the question of what counts as "misinformation," a highly contested category. Nor do they grapple with the content moderation problem of how to deal with politicians who say things of public interest that are arguably or demonstrably untrue. And although they allude to a constitutional challenge provoked by the federal government's efforts to restrict speech on social media platforms, they never mention the First Amendment. That is a pretty striking omission by people whose profession relies on that amendment's protections and who claim to be worried about the health of our democracy.
Nix and Ellison warn that "social media companies are receding from their role as watchdogs against political misinformation, abandoning their most aggressive efforts to police online falsehoods in a trend expected to profoundly affect the 2024 presidential election." Under Musk's baneful influence, they complain, Facebook and YouTube have "backed away from policing misleading claims" and "are receding from their role as watchdogs against conspiracy theories."
The main conspiracy theory that Nix and Ellison have in mind, of course, is the one claiming that systematic fraud, including deliberately corrupted voting machines and massive numbers of phony ballots, deprived Trump of his rightful victory in the 2020 election. As they note, neither Trump nor his lawyers ever produced any credible evidence to support that theory. Yet Trump, who currently is by far the leading contender for the 2024 Republican presidential nomination, still claims he actually won reelection, and the sincerity of that belief is a central issue in both the Georgia case and his federal prosecution for conspiring to make that fantasy a reality. As Nix and Ellison note, most Republican voters—63 percent, according to a CNN poll conducted in May—agree with Trump that Joe Biden "did not legitimately win enough votes to win the presidency."
As Nix and Ellison see it, none of those people should be allowed to express that view on social media. They also think it was clearly wrong for X to let Tucker Carlson post his recent interview with Trump, which was timed to coincide with the Republican presidential debate he skipped. "Trump capitalized on [Musk's] relaxed standards" in that interview, they complain, by reiterating his "false claims that the 2020 election was 'rigged' and that the Democrats had 'cheated' to elect Biden."
For me, that unilluminating, sycophantic interview, during which Carlson never asked a challenging question and let Trump ramble on about whatever random subjects flitted through his mind, was hard to watch. But as I write, it has racked up more than 256 million views, which suggests that more than a few people were interested in what Trump had to say. By comparison, Fox News says fewer than 13 million people watched its broadcast of the debate that Trump skipped.
X, in short, seems to be giving people what they want, which makes good business sense. One might also argue, as Carlson did, that "whatever you think of Trump…voters have an interest in hearing what he thinks," since he is the "indisputable, far-and-away front-runner in the Republican race."
Nix and Ellison do not see it that way. For the good of democracy, they think, social media platforms should be showing users political content only if it can be certified as accurate. That is, of course, an impossible task, one that is magnified by the difficulty of determining when speech, although not demonstrably false, nevertheless qualifies as "misinformation" because it is "misleading." Policing "hate speech," which Nix and Ellison also want the platforms to do, poses similar problems of interpretation and judgment.
The major platforms define their content moderation mission more narrowly than Nix and Ellison would like. "We remove content that misleads voters on how to vote or encourages interference in the democratic process," YouTube told the Post. "Additionally, we connect people to authoritative election news and information through recommendations and information panels." Meta, which owns Facebook, Instagram, and Threads, was vaguer. "Protecting the U.S. 2024 elections is one of our top priorities," it said, "and our integrity efforts continue to lead the industry."
No matter how they decide to flag or suppress content, the platforms will be pissing off a lot of people. There is "no winning," Katie Harbath, former director of public policy at Facebook, told the Post. "For Democrats, we weren't taking down enough, and for Republicans we were taking down too much." In light of those conflicting demands, Harbath said, Facebook decided "it's just not worth it anymore."
This situation becomes even more difficult and complicated when federal officials start demanding that social media companies do more to suppress speech those officials view as dangerous to democracy, public health, or national security. It also becomes constitutionally problematic—a point that Nix and Ellison do not even acknowledge. Instead they complain that "an aggressive legal battle over claims that the Biden administration pressured social media platforms to silence certain speech has blocked a key path to detecting election interference."
Those are not merely "claims." The Biden administration indisputably "pressured social media platforms," publicly and privately, "to silence certain speech." The legal question is whether that pressure amounted to government-directed censorship, in violation of the First Amendment. A federal judge concluded that it did.
Nix and Ellison probably disagree with that decision. But they do not even mention it, let alone explain why they think it was wrong. More generally, they seem completely untroubled by the free speech implications of not-so-subtly threatening social media companies with antitrust litigation, heavier regulation, and increased exposure to civil liability if they fail to follow the government's content moderation recommendations.
Nix and Ellison repeatedly raise the specter of foreign interference with U.S. elections. The "new approach" to content moderation, they say, "marks a sharp shift from the 2020 election, when social media companies expanded their efforts to police disinformation. The companies feared a repeat of 2016, when Russian trolls attempted to interfere in the U.S. presidential campaign, turning the platforms into tools of political manipulation and division."
Those sinister-sounding efforts were pretty pitiful, less than a drop in the bucket of the "misinformation" and "disinformation" that Americans themselves regularly produce. By invoking a foreign threat, Nix and Ellison distract readers from the central issue, which is whether democracy is better served by heavy-handed moderation that aims to shield social media users from false, misleading, and hateful speech or by the more freewheeling approach that Musk prefers. They think the answer is obvious, which is why they present their advocacy as straight news reporting.