You probably suffer from the "illusion of explanatory depth." Moreover, you often succumb to the "illusion of understanding." So say two cognitive scientists, Philip Fernbach of the University of Colorado and Steven Sloman of Brown, in The Knowledge Illusion: Why We Never Think Alone.
Disagree? OK, then write down how a zipper works. Or draw all the parts of a simple bicycle in their proper places. If that's too complicated, tell me: How does a flush toilet operate?
The illusion of explanatory depth was exposed in experiments by Frank Keil, a cognitive scientist at Yale. Keil asked subjects to rate on a scale of 1 to 7 how confident they were about their understanding of how such mechanisms as zippers, flush toilets, helicopters, quartz watches, and piano keys worked. Then Keil asked them to write down a detailed explanation. Most could not. Afterwards, Keil reported, "many participants reported genuine surprise and new humility at how much less they knew than they originally thought."
Fernbach and Sloman then report cognitive scientist Thomas Landauer's estimate that the average adult's brain has the capacity to store about a gigabyte of information. The computer on which I am typing this review has about 1,000 times more memory than that. "Human beings are not warehouses of knowledge," the authors observe. Instead, we maneuver through the complexities that surround us by abstracting the relevant information that enables us to achieve our goals. The purpose of thinking, Fernbach and Sloman argue, is to choose the most effective action given the current situation.
Our minds think causally, not logically. To illustrate that, the authors offer a logical puzzle: If my underwear is blue, then my socks must be green. My socks are green. Therefore my underwear is blue. When asked, many people agree with the conclusion. But what about: If I fall into a sewer, then I need a shower. I took a shower. Therefore, I fell into a sewer. It's the same logical mistake (affirming the consequent), but this time our knack for causal thinking prevents most people from making it.
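For readers who want to see the fallacy laid bare, the argument form can be checked mechanically. The following sketch (mine, not the authors') enumerates the four possible truth assignments and finds the one that makes both premises true while the conclusion is false, which is exactly what renders the inference invalid:

```python
from itertools import product

def implies(p, q):
    # Material implication: "if p then q" is false only when p is true and q is false.
    return (not p) or q

# The argument form: from (blue -> green) and green, conclude blue.
# It would be valid only if no assignment makes both premises true and the conclusion false.
counterexamples = [
    (blue, green)
    for blue, green in product([True, False], repeat=2)
    if implies(blue, green) and green and not blue
]

print(counterexamples)  # [(False, True)]: green socks, non-blue underwear
```

The single counterexample is the whole story: green socks are consistent with non-blue underwear, just as a shower is consistent with never having fallen into a sewer.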
The authors also note that we are much better at thinking about how a cause produces an effect than we are at reasoning backward from an effect to find its cause. It is easier for a doctor to predict that an ulcer will cause stomach pain than to infer that stomach pain is the result of an ulcer. We are better at prediction than diagnosis.
The authors also cite Daniel Kahneman, the economics Nobelist who elucidated the difference between intuitive and deliberative thinking. Think of an animal whose name starts with E. For most Americans, elephant comes to mind quickly and intuitively. (For the record, I thought of echidnas. I don't know why.) Now unravel the anagram: vaeertidebli. The answer is "deliberative" and, for most of us, it takes deliberative thinking to figure it out.
The authors argue that we depend upon intuitive thinking to navigate most of our daily lives. We tend to turn to deliberative thinking when we encounter novel situations or engage in cooperative activities with others. Sloman and Fernbach note that more deliberative folks are somewhat less subject to the illusion of explanatory depth, and that they score better on the standard three-item test measuring cognitive reflection. (Fewer than 20 percent of the U.S. population gets all three answers right.)
If we are all so deeply ignorant, how is the modern world possible? The book's answer is that we live in a hive mind where knowledge is distributed throughout the human community. We are, in the authors' words, "built to collaborate." When we don't know something, we tap into the knowledge and expertise of our fellow human beings. "Ignorance has to do with how much you know, whereas being dumb is relative to other people," the authors point out. Like everyone else, you are ignorant, but you are not therefore necessarily dumb.
We don't need to know how a flush toilet or the internet works. All we need to know is how to use these tools effectively to achieve our goals. Most of our "knowledge" is really a set of placeholders and pointers that tell us how to access the information we need in the communal knowledge base.
So far, so good. But now we come to my main peeve about The Knowledge Illusion. Sloman and Fernbach are all about communal knowledge and cooperation, but they largely ignore the most effective institution that ever evolved for assembling and making effective use of dispersed knowledge in human societies: markets.
The duo does spend a page or two on the "hive economy," noting that "economies chug along merrily because they don't depend on individuals' understanding. An economy works because we each do our own little part." They are unknowingly recapitulating economics Nobelist Friedrich Hayek's insight, in his 1945 essay "The Use of Knowledge in Society," that information is radically decentralized, with each individual having "knowledge of the particular circumstances of time and place." This dispersed knowledge is coordinated in markets via the price system.
And the prices in markets are superb knowledge placeholders. As Hayek explains, "It is more than a metaphor to describe the price system as a kind of machinery for registering change, or a system of telecommunications which enables individual producers to watch merely the movement of a few pointers, as an engineer might watch the hands of a few dials, in order to adjust their activities to changes of which they may never know more than is reflected in the price movement."
How human cooperation mobilized by world-spanning markets makes our complicated and increasingly wealthy world possible is illustrated by Leonard Read's magnificent essay "I, Pencil." If you want a real test of the illusion of explanatory depth, ask someone: How many people does it take to make a pencil? "Not a single person on the face of this earth knows how to make me," claims the essay's eponymous writing instrument. And that's entirely true.
How do untold numbers of people cooperate across the globe to bring together milled wood from cedar trees felled in California with graphite mined and refined in Sri Lanka and rubber for the eraser tapped and congealed in Malaysia? Not to mention the metal band that holds the eraser, the ink that prints the brand name, and the rest. Or the coffee the loggers drank and the bread they ate, the makers and drivers of the trucks and ships that transported the raw materials, and so forth.
Little of this comes through in the book. This avoidance is particularly puzzling because the authors do worry about how our pervasive ignorance plays out in the larger realms of policy and politics. They aptly note that when confronted with the complexities of public policy—what to do about health care, climate change, genetically modified crops, the possibility of nuclear war—most of us are not engaged in hard causal thinking about the consequences of different choices. Instead we tend to make our decisions based on the fast intuitive thinking embodied in our tribal loyalties and confirmation bias.
To overcome the problem of policy ignorance, they urge us to become more scientifically literate, learn critical thinking skills, and rely more on qualified experts. Such advice may help a bit, but it doesn't really overcome the problem of pervasive radical ignorance so well identified by the authors.
It's probably my confirmation bias talking (although I'm relying on folks I believe to be experts), but the deep problem here is the centralization and bureaucratization of decision-making. As the Utah State University anthropologist Joseph Tainter noted in his 1988 classic, The Collapse of Complex Societies, "In a hierarchical institution, the flow of information from the bottom to the top is frequently inaccurate and ineffective." Think of Soviet Five-Year Plans.
So let's reprise Fernbach and Sloman's main insight: "Intelligence resides in the community and not in any individual. So decision-making procedures that elicit the wisdom of the community are likely to produce better outcomes than procedures that depend upon the relative ignorance of lone individuals." Humanity has so far found no better general procedure for eliciting wisdom from the community than free markets. The more we use markets for mobilizing information to make decisions, the better those decisions are likely to be. It's a shame that the authors of this otherwise fascinating book are largely ignorant of that knowledge.