In their book The Misinformation Age: How False Beliefs Spread, Cailin O'Connor and James Owen Weatherall, professors of logic and philosophy of science at the University of California, Irvine, use computer models to explain how judgments, such as those about the causes of or best cures for certain illnesses, spread through scientific networks, and how that spread can cause false beliefs to dominate a community even when each participant is individually rational.
The authors offer juicy takeaways: We need to legally punish people for spreading unlabeled "fake news," and we should abandon democratic control of "issues that require expert knowledge." On the latter point, they touch on what some economists call "rational ignorance" or even "rational irrationality": when you have no direct influence over an outcome, as with voting, you have little incentive to form your judgments intelligently.
Their models indicate that "mistrusting those with different beliefs is toxic" when it comes to truth discovery, and the stories they tell of science gone wrong show that even smart, well-meaning people can reach different judgments. Yet they also insist that "if scientists claim they are gathering evidence and that evidence is convincing, we have little choice but to take their word for it."
The policy conclusions sidestep the fact that democratic governance often turns on values, not just facts. Centralizing decision making in a powerful elite breeds exactly the mistrust among those with different beliefs that the authors warn against, and it can halt the exploration of new ideas.
The book unintentionally provides a stark warning: respect for the free expression of ideas judged "dangerous" is dissipating as quickly, and as hazardously, as the ozone layer was before CFCs were banned, a story the authors tell at length to drive home the problem of allowing non-government-sponsored scientific ideas to spread.
This article originally appeared in print under the headline "The Misinformation Age".