The Volokh Conspiracy
Can we trust government to correct our cognitive biases?
Economist Richard Thaler recently won the 2017 Nobel Prize in economics for his important work documenting widespread cognitive errors in human decision-making. All too often, people fail to act as rationally as conventional economic models assume, and at least some of those errors are systematic in nature. Such errors can lead to mistakes that greatly diminish our health, happiness, and welfare.
Thaler and many other behavioral economics scholars argue that government should intervene to protect people against their cognitive biases through various paternalistic policies. In the best-case scenario, government regulators can "nudge" us into correcting our cognitive errors, thereby enhancing our welfare without significantly curtailing freedom.
But can we trust government to be less prone to cognitive error than the private-sector consumers whose mistakes we want to correct? If not, paternalistic policies might just replace one form of cognitive bias with another, perhaps even worse one. Unfortunately, a recent study suggests that politicians are prone to severe cognitive biases too—especially when they consider ideologically charged issues. Danish scholars Caspar Dahlmann and Niels Bjorn Petersen summarize their findings from a study of Danish politicians:
We conducted a survey of 954 Danish local politicians. In Denmark, local politicians make decisions over crucial services such as schools, day care, elder care and various social and health services. Depending on their ideological beliefs, some politicians think that public provision of these services is better than private provision. Others think just the opposite. We wanted to see how these beliefs affected the ways in which politicians interpreted evidence….
In our first test, we asked the politicians to evaluate parents' satisfaction ratings for a public and a private school. We had deliberately set up the comparison so one school performed better than the other. We then divided the politicians into two groups. One group got the data—but without any information as to whether the school was public or private. The schools were just labeled "School A" and "School B." The other group got the exact same data, but instead of "School A" and "School B," the schools' titles were "Public School" and "Private School."
If politicians are influenced by their ideologies, we would expect that those who saw the data for "School A" and "School B" would be able to interpret the information correctly. However, the other group would be influenced by their ideological beliefs about private versus public provision of welfare services in ways that might lead them to make mistakes….
This is exactly what we found. Most politicians interpreting data from "School A" and "School B" were perfectly capable of interpreting the information correctly. However, when they were asked to interpret data about a "Public School" and a "Private School" they often misinterpreted it, to make the evidence fit their desired conclusion.
Dahlmann and Petersen also found that, even when presented with additional evidence to help them correct their mistakes, the politicians tended to double down on their errors rather than admit they might have been wrong. And it's worth noting that Denmark is often held up as a model of good government that other countries should imitate. If Danish politicians are prone to severe ideological bias in their interpretation of evidence, the same—or worse—is likely to be true of their counterparts in the United States and elsewhere.
Politicians aren't just biased in their evaluation of political issues. Many of them are ignorant, as well. For example, famed political journalist Robert Kaiser found that most members of Congress know little about policy and "both know and care more about politics than about substance." When Republican senators tried to push the Graham-Cassidy health care reform bill through Congress last month, few had much understanding of what was in the bill. One GOP lobbyist noted that "no one cares what the bill actually does."
Given such widespread ignorance and bias, it is unlikely that we can count on politicians to correct our cognitive errors. To the contrary, giving them the power to try to do so will instead give free rein to the politicians' own ignorance and bias.
But perhaps voters can incentivize politicians to evaluate evidence more carefully. In principle, they could screen out candidates who are biased and ill-informed, and elect knowledgeable and objective decision-makers. Sadly, that is unlikely to happen, because the voters themselves also suffer from massive political ignorance, often being unaware of even very basic facts about public policy. And like the Danish politicians in Dahlmann and Petersen's study, voters also tend to be highly biased in their evaluation of evidence.
Significantly, both voters and politicians tend to be much more biased in evaluating evidence on political issues than similar evidence about other matters. The politicians surveyed by Dahlmann and Petersen had little difficulty in evaluating data on the performance of "School A" versus "School B," but were highly biased in considering the performance of private schools as compared to public ones. The latter is a controversial political issue, while the former is not.
This is not a surprising result. Making rational decisions and keeping our biases under control often requires considerable effort. Voters have very little incentive to make such an effort on political issues because the chance that any one vote will make a difference to the outcome of an election is extraordinarily small. As a result, it is actually rational for them to be ignorant about most political issues and to make little or no attempt to evaluate political information in an unbiased way. By contrast, private sector decisions are more likely to make a difference. This creates stronger incentives to both acquire information and evaluate it objectively, though obviously few of us avoid bias completely. It is no accident that most people spend more time and effort seeking out and evaluating information when they decide what TV or smartphone to buy than when they decide who to vote for in a presidential election—or any other election.
Politicians arguably have stronger incentives to learn about politics than voters do. Their decisions on policy issues often do make a difference. But because the voters themselves are often ignorant and biased, they tend to tolerate—and even reward—policy ignorance among those they elect. Politicians have strong incentives to work on campaign skills, but relatively little incentive to become knowledgeable about policy. It is not surprising that most do far better on the former than the latter.
Far from working to correct voters' ignorance and bias, politicians often try to manipulate it to their advantage. Donald Trump is just an extreme case of a common phenomenon: a political leader who effectively capitalizes on voter ignorance, while being poorly informed about policy himself. More conventional politicians—including Trump's predecessor Barack Obama—often use similar tactics, even if not quite to the same extent.
Some behavioral economics scholars—including Thaler's frequent coauthor Cass Sunstein—argue that instead of relying on the normal democratic process to address cognitive bias, we should delegate more power to expert bureaucracies. Bureaucratic experts may be more knowledgeable than voters and elected officials.
But expert regulators have significant knowledge limitations of their own, and the ignorance and bias of voters and politicians often create perverse incentives for bureaucrats and exacerbate their shortcomings. Moreover, bureaucrats are themselves far from free of cognitive biases.
Critics have found evidence that some of the cognitive biases identified by behavioral economists are not as severe as is often claimed, or may even be artifacts of flawed experimental methods. Nonetheless, as Thaler and others have effectively shown, cognitive bias is often a genuine problem. Few if any people are as systematically rational as homo economicus or the Vulcans of Star Trek. But if we rely on government to try to fix our biases, we can easily end up exacerbating the very problem we set out to correct.