People Are Less Gullible Than You Think
By default we err on the side of resisting new ideas.
Look at all the nonsense people believe. That the earth is a flat disk surrounded by a 200-foot wall of ice. That high-up Democratic operatives run a pedophile ring out of a pizza joint. That former North Korean leader Kim Jong Il could teleport and control the weather. Who could doubt that human beings are gullible, that we accept whatever we read or hear?
Yet these beliefs are the exception rather than the rule. By and large, we don't credulously accept whatever we're told. We have evolved specialized cognitive mechanisms to deal with both the benefits and the dangers of communication. If anything, we're too hard rather than too easy to influence.
One popular, but wrong, way of thinking about those cognitive mechanisms is to imagine them as the result of an arms race: Manipulators evolve increasingly sophisticated means of misleading receivers, and receivers evolve increasingly sophisticated means of rejecting manipulators' unreliable messages. This is what we get, for instance, with computer viruses and security software.
The arms race model leads to an association between gullibility and lack of mental acuity. When receivers, because they are exhausted or distracted, cannot properly use their most refined cognitive mechanisms, they're allegedly defenseless against the manipulators' more advanced cognitive devices—much as security software that hasn't been updated leaves a computer vulnerable to attack.
That's the perception. But it isn't how our minds work at all.
Brainwashers and Hidden Persuaders
Thousands of U.S. soldiers were captured during the Korean War. Those who managed to escape brought back tales of horrible mistreatment and torture, from sleep deprivation to waterboarding. When the war ended and the prisoners of war were repatriated, these mistreatments acquired an even darker meaning. Not simply examples of the enemy's wanton cruelty, they were seen as an attempt to brainwash U.S. soldiers into accepting communist doctrine. Twenty-three American POWs chose to follow their captors to China instead of going back to their homeland. This, The New York Times stated at the time, was surely "living proof that Communist brainwashing does work on some persons."
Brainwashing supposedly functioned by shattering people's capacity for higher reflection, through "conditioning," "debilitation," and "dissociation-hypnosis-suggestibility." In the 1950s, the idea that people are more easily influenced when they cannot think also showed up in a very different context. It was feared that messages such as "drink Coke," flashed in the midst of a movie too quickly to be consciously perceived, would make viewers want to buy a can of Coke. Such messages would soon be called subliminal, meaning "below the threshold"—in this case, the threshold of awareness.
The scares surrounding brainwashing and subliminal influence rely on a pervasive association between inferior cognitive ability and gullibility: The less we think, it's assumed, the worse we think, and the easier to influence we are.
This idea is historically pervasive, and it is often linked to the idea that some populations are better cognitively equipped than others. In the 19th century, the psychologist Gustave Le Bon suggested that crowds shared the "absence…of critical thought…observed in beings belonging to inferior forms of evolution, such as women, savages, and children." (In a striking illustration of motivated reasoning, Le Bon's colleague Gabriel Tarde claimed that because of its "docility, its credulity…the crowd is feminine"—even when "it is composed, as is usually the case, of males.") In the 21st century, we still find echoes of these unsavory associations, as when Brexiters are dismissed as uneducated plebs.
In the contemporary academic literature, the link between unsophistication and credulity takes the form of dividing thought processes into two types, called System 1 and System 2. According to this view—long established in psychology and recently popularized by Daniel Kahneman's 2011 book, Thinking, Fast and Slow—some cognitive processes are fast, effortless, and largely unconscious. These, which belong to System 1, include reading a simple text, forming a first impression of someone, and navigating well-known streets. These intuitions are on the whole effective, the theory goes, yet they are also susceptible to systematic biases. For instance, we seem to judge a person's competence or trustworthiness on the basis of facial traits. Such judgments may have some limited reliability, but they should be superseded by stronger cues, such as how the person actually behaves.
System 2 relies on slow, effortful, reflective processes. It takes over when System 1 fails, correcting our mistaken intuitions with its more objective processes and more rational rules. If System 1 consists of rough-and-ready mechanisms and System 2 consists of deliberate reflection, we might expect System 1 to be associated with credulity and System 2 with critical thinking.
Psychologist Daniel Gilbert and his colleagues performed an ingenious series of experiments to tease out the roles the two mental systems play in evaluating communicated information. Participants were presented with a series of statements; right after each one, they were told whether it was true or false. In one experiment, the statements were about words in Hopi, so participants might be told "a ghoren is a jug" and, a second later, told "true." After all the statements had been presented, participants were asked which had been true and which had been false. To test for the role played by the two systems, Gilbert and his colleagues intermittently interrupted System 2 processing. System 2, being slow and effortful, is seen as easily interfered with. In this case, participants were asked to press a button when they heard a tone, which tended to ring when the crucial information—whether a given statement was true or false—was being delivered.
When it came time to recall which statements were true and which were false, people whose System 2 had been disrupted were more likely to believe the statements to be true, irrespective of whether they had been signaled as true or false. These experiments led Gilbert and his colleagues to conclude that our initial inclination is to accept what we are told, and that the slightest disruption to System 2 stops us from reconsidering this initial acceptance. In Kahneman's words, "When System 2 is otherwise engaged, we will believe almost anything. System 1 is gullible and biased to believe, System 2 is in charge of doubting and unbelieving, but System 2 is sometimes busy, and often lazy."
As Gilbert and his colleagues put it in the title of their second article on the topic: "You Can't Not Believe Everything You Read."
I Hope You Did Not Believe Everything You Just Read
The arms race analogy may be intuitively appealing, but it doesn't fit the broad pattern of the evolution of human communication. In arms races, a parallel escalation tends to preserve the status quo. The United States and the Soviet Union acquired increasingly large nuclear arsenals, but neither nation gained the upper hand. Security software hasn't wiped out computer viruses, but the viruses haven't taken over all computers either.
Human communication, fortunately, is very different. Think of how much information was exchanged by our pre-human ancestors—or, as an approximation, by our closest nonhuman relatives alive today. Clearly, we have ventured very far from that status quo. We send and consume orders of magnitude more information than any other primate, and we are vastly more influenced by the information we receive. The bandwidth of our communication has dramatically expanded. We discuss events that are distant in time and space; we express our deepest feelings; we even debate abstract entities and tell stories about imaginary beings.
A better analogy than the arms race is the evolution of omnivorous diets.
Some animals have evolved extraordinarily specific diets. Koalas eat only eucalyptus leaves, and they will not eat a eucalyptus leaf if it isn't properly presented—if it is on a flat surface, for example, rather than attached to the branch of a eucalyptus tree. This strategy can backfire if such animals find themselves in a new environment.
Omnivorous animals are both more open and more vigilant. Rats and humans need more than 30 different nutrients, and no single food can provide them all. They therefore have to be much more open in the range of foods they're willing to sample. Indeed, rats and humans will try just about anything that looks edible.
This openness makes omnivores fantastically adaptable. Human beings have been able to survive on diets made up almost exclusively of milk and potatoes (early 18th-century Irish peasants) or meat and fish (the Inuit until recently). But their openness also makes omnivores vulnerable. Meat can go bad, and most plants are either toxic or hard to digest. As a result, omnivores are also much more vigilant toward their food than specialists are. Using a variety of strategies, they learn how to avoid foods that are likely to have undesirable side effects.
In terms of communication, the difference between human beings and other primates is similar to the difference between omnivores and specialists. Nonhuman primates mostly rely on specific signals. Vervet monkeys have a dedicated alarm call for aerial predators; chimpanzees smile in a way that signals submission; dominant baboons grunt to show their pacific intentions before approaching lower-ranking individuals. But humans can communicate about nearly anything they can conceive of.
People are thus vastly more open than other primates. Take something as basic as pointing. Human babies understand pointing shortly after they reach their first year. Adult chimpanzees, even in situations in which pointing seems obvious to us, do not get it.
If we are vastly more open to different forms and contents of communication than other primates, we should also be more vigilant.
According to the arms race theory, we have evolved from a situation of extreme openness—of general gullibility—toward a state of increasingly sophisticated vigilance made possible by our more recently developed cognitive machinery. If this machinery were removed, the theory goes, we would revert to our previous state and be more likely to accept any message, however stupid or harmful.
The analogy with the evolution of omnivorous diets suggests the reverse is true. We have evolved from a situation of extreme conservatism, a situation in which we let only a restricted set of signals affect us, toward a situation in which we are more vigilant but also more open to different forms and contents of communication. This organization, in which increased sophistication goes with increased openness, makes for much more robust overall functioning.
In the arms race view, disruption of the more sophisticated mechanisms makes us credulous and vulnerable. Things look different if openness and vigilance evolved hand in hand. If our more recent and sophisticated cognitive machinery is disrupted, we revert to our conservative core, becoming not more gullible but more stubborn.
Brainwashing Does Not Wash
If disrupting our higher cognitive abilities, or bypassing them altogether, were an effective means of influence, then both brainwashing and subliminal stimuli should leave us helpless, gullibly accepting the virtues of communism and thirsting for Coca-Cola. But in fact, both persuasion techniques are staggeringly ineffective.
Consider those 23 American POWs who defected to China. This is already a rather pitiful success rate: 23 converts out of about 4,400 captive soldiers, or half a percent. But in fact, the number of genuine converts was likely zero. The soldiers who defected were afraid of what awaited them in the United States. To gain some favors in the camps, they had collaborated with their Chinese captors—or at least had not shown as much defiance as their fellow prisoners. As a result, they could expect to be court-martialed upon their return. Indeed, among the POWs who returned to the United States, one was sentenced to 10 years in jail, while prosecutors sought the death penalty for another.
Compared with that, being fêted as a convert to the Chinese system did not seem so bad, even if it meant paying lip service to Communist doctrine—a doctrine the prisoners likely barely grasped.
More recently, methods derived from brainwashing, such as "enhanced interrogation techniques" that rely on physical constraints, sleep deprivation, and other attempts at numbing the suspects' minds, have been used by U.S. forces in the war on terror. Like brainwashing, these techniques are much less effective than softer methods that make full use of the suspects' higher cognition—methods in which the interrogator builds trust and engages the subject in discussion.
Similarly, the fear of subliminal influence and unconscious mind control turned out to be unfounded. The early experiments allegedly demonstrating the power of subliminal stimuli were simply made up: No one had displayed a subliminal "drink Coke" ad in a movie theater. A wealth of subsequent (real) experiments have failed to show that subliminal stimuli exert any meaningful influence on our behavior. Even if it were flashed on a screen, the message "drink Coke" would not make us more likely to drink Coca-Cola.
What about the experiments conducted by Gilbert and his colleagues? They did show that some statements (such as "a ghoren is a jug") are spontaneously accepted and take some effort to reject. But that doesn't mean that System 1 accepts "everything we read," as Gilbert put it. If participants have some background knowledge related to a statement, that knowledge directs their initial reaction. For instance, people's initial reaction to statements such as "soft soap is edible" is rejection. The statements don't even have to be obviously false to be intuitively disbelieved. They simply have to be relevant if they are false. It is not very helpful to know that, in Hopi, it's false that "a ghoren is a jug." By contrast, if you learn that the statement "John is a liberal" is false, it tells you something useful about John.
When exposed to statements such as "John is a liberal," people's intuitive reaction is to adopt a stance of doubt rather than acceptance. Far from being "gullible and biased to believe," System 1 is, if anything, biased to reject messages incompatible with our background beliefs, or even merely ambiguous messages. This includes many messages that happen to be true.
How To Be Wrong Without Being Gullible
If the success of mass persuasion is, more often than not, a figment of the popular imagination, the dissemination of empirically dubious beliefs is not. We all have, at some point in our lives, endorsed one type of misconception or another. People believe everything from wild rumors about politicians to claims about the dangers of vaccination. Yet this is not necessarily a symptom of gullibility.
The spread of most misconceptions is explained more by their intuitively appealing content than by the skills of those who propound them. Vaccine hesitancy surfs on the counter-intuitiveness of vaccination. Conspiracy theories depend on our justified fear of powerful enemy coalitions. Even flat-Earthers argue that you just have to follow your intuition when you look at the horizon and fail to see any curvature.
Yet even though many misconceptions have an intuitive dimension, most remain cut off from the rest of our cognition: They are reflective beliefs with few consequences for our other thoughts and limited effects on our actions. 9/11 truthers might believe the CIA is powerful enough to take down the World Trade Center, but they're not afraid it could easily silence a blabbing blogger. Most of those who accused Hillary Clinton's aides of pedophilia were content with leaving one-star reviews of the restaurant in which the children were supposedly abused. Even forcefully held religious or scientific beliefs, from God's omniscience to relativity theory, do not deeply affect how we think: Christians still act as if God were an agent who could only pay attention to one thing at a time, and physicists can barely intuit the relationship between time and speed dictated by Einstein's theories.
Sometimes, belief and action, even costly action, do go hand in hand: Rumors of atrocities committed by the local minority are followed by attacks on that minority; bogus medical theories lead to harmful medical practices; excessive flattery of a ruler results in obedience to the ruler's regime. But by and large, the beliefs follow the behavior rather than the other way around. People who want to attack the minority group look for the moral high ground. Quack doctors like their therapies to be backed up by theories. The political conditions that make it a smart move to obey an autocrat also encourage sycophancy.
This even applies to the misconception I'm criticizing in this article: The idea that people are gullible provides post hoc rationalizations for actions or ideas that have other motivations. Until the Enlightenment, scholarly claims of uneven credulity were routinely used to justify an iniquitous status quo—mostly by people who benefited from that status quo. The masses, these scholars insisted, couldn't be trusted with political power, as they would be promptly manipulated by cunning demagogues bent on wrecking the social order. Scholars on the other side of the political spectrum, who defended the people's right to a political voice, also asserted widespread gullibility—which helped them explain why the population hadn't already revolted (or, more generally, why people so often make the "wrong" political choices).
We aren't gullible: By default we err on the side of resisting new ideas. In the absence of the right cues, we reject messages that don't fit with our preconceived views or pre-existing plans. To persuade us otherwise takes long-established, carefully maintained trust, clearly demonstrated expertise, and sound arguments. Science, the media, and other institutions that spread accurate but often counterintuitive messages face an uphill battle, as they must transmit these messages and keep them credible along great chains of trust and argumentation. Quasi-miraculously, these chains connect us to the latest scientific discoveries and to events on the other side of the planet. We can only hope for new means of extending and strengthening these ever-fragile links.