When Choosing What To Believe, People Often Choose Morality Over Hard Evidence
In new studies, many people "reported that morally good beliefs require less evidence to be justified, and that, in some circumstances, a morally good belief can be justified even in the absence of sufficient evidence."
What happens when moral beliefs collide with documented evidence? For many people, it means doubling down on whichever one complements their worldview.
It's not hard to find evidence of this phenomenon in everyday life. Now, new research from Princeton University cognitive science researcher Corey Cusimano and Princeton psychologist Tania Lombrozo explores how and why this happens.
In a pre-publication paper titled "Morality justifies motivated reasoning in the folk ethics of belief," Cusimano and Lombrozo report that people "treat moral considerations as legitimate grounds for believing propositions that are unsupported by objective, evidence-based reasoning." The researchers also found that people who deemed a belief morally good considered that same belief logically sound, even when the "good" belief lacked supporting evidence.
"Across three studies, many people prescribed motivated reasoning to others, reported that morally good beliefs require less evidence to be justified, and that, in some circumstances, a morally good belief can be justified even in the absence of sufficient evidence," Cusimano and Lombrozo write.
What does it mean to engage in "motivated reasoning"?
Cusimano and Lombrozo conducted three studies. In the first, 441 women and 395 men—average age 38, recruited through Amazon's gig-work platform Mechanical Turk—read one of six stories about a character who "acquires strong but inconclusive evidence for a proposition that they have a moral reason to reject," as the paper puts it. Participants were then asked what the main character was likely to believe, what the character ought to believe, and how closely those beliefs matched what was likely to be true.
The result?
"Participants on average reported that what someone ought to believe should be more optimistic (in favor of what is morally good to believe) than what is objectively most accurate for that person to believe based on their evidence," states the paper. In addition:
Participants reported that, relative to an impartial observer with the same information, someone with a moral reason to be optimistic had a wider range of beliefs that could be considered 'consistent with' and 'based on' the evidence.
In two subsequent studies—encompassing a total of 1,254 participants, with average ages of 40 and 39, respectively—the Princeton researchers had participants read hypothetical conundrums and then asked them to predict or evaluate the characters' beliefs. This included "the extent to which formed beliefs were justified, were permissible, were supported by sufficient evidence, constituted knowledge, and were moral."
On average, participants "agreed more strongly that someone who had a moral reason to adopt a desirable belief had sufficient evidence to do so compared to someone who lacked a moral reason, even though they formed the same belief on the basis of the same evidence."
Likewise, "when someone adopted the morally undesirable belief, they were more often judged as having insufficient evidence for doing so relative to someone who lacked a moral reason (again, even though they formed the same belief on the basis of the same evidence)."
…But Why?
The authors offer two models for this system of rationalization. In the first, moral concerns shift what people take to be the correct criteria for evaluating a belief—for instance, by lowering the amount of hard evidence deemed sufficient to justify it. "Morality changes how much evidence [people] consider to be required to hold [a particular] belief in an evidentially-sound way," the authors write.
The second model holds that people treat "a belief's moral quality"—whether it's something considered good or bad to believe in a given situation—as carrying its own independent weight, thereby providing "an alternative justification" for holding it.
The Princeton experiments found evidence for both models.
"Across all studies, participants routinely indicated that what a believer ought to believe, or was justified in believing, should be affected by what would be morally good to believe," Cusimano and Lombrozo write.
"Additionally, participants reported that moral concerns affected the standards of evidence that apply to belief, such that morally-desirable beliefs require less evidence than morally-undesirable beliefs."
Perhaps there's some small comfort near the end of the paper, where the authors point out that plenty of people still support evidence-based reasoning and consider facts an important counterweight to feelings.
"It is important to note that we also found substantial evidence that people think beliefs ought to be constrained by the evidence," the researchers write. "While our findings show that people will integrate moral considerations into their belief evaluations, they also show that people think others should balance these considerations with the evidence that they have."