The scientific fact that facts don't matter turns out to be factually wrong.
Sorry, let me try to put that more clearly. In a superb article at Slate, Daniel Engber revisits the research that concluded that blind partisanship and motivated reasoning are pervasive and that everyone seeks out "facts" that comport with what they already believe. Worse yet, those studies suggested that when highly ideological people are presented simultaneously with misinformation—that weapons of mass destruction were found in Iraq, that vaccines are unsafe, that Barack Obama is a Muslim—and with corrections to the falsehoods, the corrections paradoxically reinforce their prior belief in the false information. In other words, the correction "backfires."
But as Engber reports, scientists have had trouble replicating the research that purported to reveal a post-truth world. Facts turn out to matter after all.
Engber cites a new study, "The Elusive Backfire Effect," soon to appear in the journal Political Behavior. The researchers tested more than 10,000 subjects for backfire responses to corrections of 52 factually wrong statements made by various politicians. The factually wrong statements included Hillary Clinton's claim that gun violence is spiraling; Barack Obama's assertion that discrimination is the sole cause of the gender wage gap; Donald Trump's charge that Mexican immigrants engage disproportionately in criminal behavior; and Ted Cruz's declaration that violence against police officers is rising.
After running their experiments, the researchers report:
Across all experiments, we found no corrections capable of triggering backfire, despite testing precisely the kinds of polarized issues where backfire should be expected. Evidence of factual backfire is far more tenuous than prior research suggests. By and large, citizens heed factual information, even when such information challenges their ideological commitments.
Slate also points to some as-yet-unpublished research on backfire by two political scientists, Andrew Guess and Alexander Coppock. Guess and Coppock showed subjects evidence about the effects of capital punishment, minimum wage increases, and gun control. The results?
Across three studies, we find no evidence of the phenomenon [of backfire]. Pro-capital-punishment evidence tends to make subjects more supportive of the death penalty and to strengthen their beliefs in its deterrent efficacy. Evidence that the death penalty increases crime does the opposite, while inconclusive evidence does not cause significant shifts in attitudes or beliefs. Arguments about the minimum wage likewise move respondents in the direction of evidence—toward supporting a higher or a lower dollar amount according to the slant of the evidence presented in the video treatments. Finally, evidence that gun control decreases gun violence makes people more supportive while evidence that it increases violence does the opposite.
In other words, people do pay attention to evidence even when it cuts against their pre-existing beliefs.
Engber notes that there has been considerable academic resistance to publishing research that questions the pervasiveness of the backfire effect:
I asked Coppock: Might there be echo chambers in academia, where scholars keep themselves away from new ideas about the echo chamber? And what if presenting evidence against the backfire effect itself produced a sort of backfire? "I really do believe my finding," Coppock said. "I think other people believe me, too." But if his findings were correct, then wouldn't all those peer reviewers have updated their beliefs in support of his conclusion? He paused for a moment. "In a way," he said, "the best evidence against our paper is that it keeps getting rejected."
Engber concludes, "It's time we came together to reject this bogus story and the ersatz science of post-truth. If we can agree on anything it should be this: The end of facts is not a fact."
As a journalist who has always believed that facts do matter and does his best to report on science and policy as accurately as possible, I heartily endorse his conclusion.