So, would you wear a serial killer's sweater? No blood spatters or anything. Heck, let's even say it has been dry cleaned.
Psychologist and cognitive neuroscientist Bruce Hood has been known to brandish a cardigan belonging to the serial killer Fred West in the lecture hall. West tortured, raped, and murdered at least 12 women. Of course, a moment's reflection will reveal that his sartorial choices probably had nothing to do with his grisly hobby. And there's no possibility of catching serial killer disease from his sweater, right? Nonetheless, most people will refuse to wear the sweater once they know its provenance (false provenance, actually: the sweater Hood uses is not really West's). Odder still, in large lecture halls, members of the audience will physically recoil from the few people who say they are willing to wear it. The crowds, which often consist of highly educated, secular people, laugh nervously as this little drama plays out, says Hood, because they realize there is something odd and illogical about their reaction.
Hood has made a study of these intuitive ways of seeing the world. In his new book Supersense: Why We Believe in the Unbelievable (HarperOne), he looks at the moments when our intuitions come into conflict with our rational faculties. We're born with a proclivity to see patterns that aren't there, to sense agency where there is only randomness, and to tell stories about cause and effect that may or may not be true. Hood examines religion through this lens, but most of the book focuses on the ways that even people who don't consider themselves religious—or even superstitious—are governed by intuition.
Associate Editor Katherine Mangu-Ward spoke with Hood earlier this week, via Skype.
Reason: So, why do most people get creeped out by the serial killer's cardigan?
Bruce M. Hood: If you talk to people about this in a very analytical way, or you describe it to them, people say "Oh, of course I wouldn't find a problem with that," because they can sit and reason about it. But if you suddenly present them with the real cardigan and don't tell them the context and get them to respond immediately, that's the beauty of the situation, because you get the intuitions automatically being triggered. This is my point: We can reflect on our decisions and we can reflect upon our so-called beliefs. And we can come up with justifications or reasons for the decisions we take. But if you're actually put in a dramatic situation, you will flinch. You will respond in a different way. And that's because there are different speeds at which you make decisions. Some can be very fast and intuitive, and other ones you can reflect upon.
Reason: What is supersense?
Hood: Many people don't recognize that they have supernatural beliefs woven into their decision processes and behaviors. For believers, there's a hidden dimension to reality. There are passions out there, there are energies operating in the world controlled by invisible forces. Believers recognize this as the basis for a lot of their supernatural beliefs about the world. But even atheists and non-believers still operate with behaviors which reflect this assumption that there is something invisible operating. So what I've done in the book is address what this is and where it comes from.
Reason: You frequently describe these intuitions as being something we "revert" to. Is our supersense inborn?
Hood: The intuitive system is one which is not taught. It's one that's wired into the brain as part of our natural reasoning. We're designed to seek out patterns in the world. We will detect co-occurrences. In doing so, we try to understand what caused them to happen. We can't stop ourselves [from] doing it. You've got two ways of making judgments. You've got one that is very rapid, unconscious, and untutored—the intuitive system. And the second, the rational system, the one which is analytical, is much slower and more laborious. What I think is going on is that when you have a situation where the intuitive system is running riot, you have to suppress it, as it were.
To give you an example outside the context of supernatural belief: Ask most people about two cannonballs, one weighing 100 times as much as the other, both dropped from the same height. What happens? Most people say, "Oh, the heavy one falls faster and hits the ground first." Now, that's not correct; it's a naive misconception about weight and speed, as Galileo taught us centuries ago. But then you can educate people with Newtonian physics and tell them about Newton's laws of motion and gravitation. And on that level they'll be able to tell you that the two balls should land at the same time.
The point is that when you teach people the rational, propositional way of understanding that particular problem, they may understand it initially, but when they go away they very quickly forget it again. They revert back to the intuitions. And even when you're teaching people the new law, to understand the scientific law, we know from brain studies that they're actively suppressing parts of their fast, rapid reasoning system in order to acquire what is an unlikely or counter-intuitive assumption. When you get counter-intuitive propositions, you have to inhibit this intuitive way of thinking. If we turn to supernatural beliefs, because I'm suggesting that these are premised on a lot of intuitions, then we may learn more rational models, or more scientific models of the world. But the intuitive ways of thinking never entirely go away, and that's why, when you compromise people's ability to use their analytical system, they often revert or go back to their intuitions.
Reason: How does our desire to see agency in the world play out in politics?
Hood: The one thing the human brain is not very good at is dealing with random events. We're always seeing structure where there may not be any structure. And we don't like the idea of there being no predictable outcome. That creates a sense of stress or a sense of uncertainty. One way of dealing with uncertainty is to engage in acts or beliefs which you think give you some perception of control. That's why we have superstitious rituals, doing something we believe might have some influence on the outcome, and that then becomes self-reinforcing.
If we're not doing the controlling ritual, then we're very happy to concede or hand over to those individuals who we perceive, or other people perceive, or the group perceives, as actually being able to control the outcome. So that's where you get the emergence of priests and captains of industry, and the idea that there is a consensus that these people have control. Very often that kind of group consensus is sufficient in itself. There doesn't have to be a lot of objective evidence of this person's control. He doesn't have to prove himself each time. So it's very hard to be critical of these individuals who have established themselves, by consensus, as the people who can do things.
Reason: One example that you give in the book is the case of a child who needs medical attention that will cost $1 million, which the hospital will have to cover. You describe this phenomenon where most people not only have the instant intuition that the child should be saved no matter what, but they are actually disgusted by anyone who tries to do a cost-benefit analysis on the question, even if that person comes to the "right" conclusion in the end.
Hood: It was Philip Tetlock, who is an economic psychologist, who first pointed that out. The fact that you might deliberate over it, the fact that you might even have to apply some kind of cost-benefit analysis, is in itself abhorrent. Because it's a violation of what should be an instantaneous assumption, something that the group should automatically feel. Leon Kass called that the gut reaction, the politics of decision making: that you should just feel the answer to be correct. These are all driven by intuitions. The moral disgust that we feel is, again, something that you shouldn't have to think about. But that's really quite arbitrary, because in many ways—I think the hospital administrator example is perfect for that—in terms of what's best for the group, it's clear that these are tough decisions. In fact, it's clearly a decision that people do agonize about. But if you put them in the open domain, they'd be very reluctant to say that or to admit it publicly. But that's in fact exactly what does have to go on when you're in a position of holding the purse strings.
Reason: How does this play out in health care policy?
Hood: I would suspect that there are decisions made behind closed doors that are generally not discussed in the open domain because they would evoke so many problems. Genetic modification being one of those. We can quite easily talk about inserting various genes, but when people learn they come from jellyfish, for example, then an intuition kicks in that there's some violation of God's law or there's some sort of Frankensteinian type of science going on. You almost have to cover it up with an anonymity of jargon. I think the people who are making the decisions are always quite aware that there's always a sensitive set of issues, which are sensitive because they violate the sacred values.
Reason: So people have strong views about which kind of thinking they prefer—intuitive over rational, in this case—even if they can't articulate it. When we choose our leaders, we want the guy with the right guts and then we'll accept whatever other decisions he makes.
Hood: That's right. The fact that people can't articulate it, that they feel these responses before they've had a chance to actually reason them out, is part of the problem. And of course, not everyone's the same. We should point out there are many people who actually will sit down and ponder things out. But you know, they tend not to be the people who get the votes. We tend to go for the more emotionally charged, decisive, emphatic members of our society who can hit upon these—I think that was why Bush worked so well. He said a lot of stupid things, but he was a very powerful communicator and he was very impassioned. I think he was very good at that. Obama, he's very good as well, but he seems more cautious, more reasoned, a different style, which is very interesting. So we shall see how all that pans out.
Reason: So are you defending a kind of noble lie proposition, to circumvent intuitive responses?
Hood: I'm pretty certain that we probably wouldn't have made such strides and advances in a lot of our medical technologies if everything had been in an open forum and an open platform for discussion. I think there would have been a lot of concerns which come from—I'm not saying ignorance—but more driven by intuitions about what's right and wrong. It's only relatively recently that powerful lobby groups have gone out to try and investigate these things. But, of course, it can create some real problems.
The intuition about the triple vaccine, for example. People try to make sense of autism, which seems to come from nowhere. All it took was one scientist to make a claim in favor of single vaccines, and then people started to see patterns everywhere. They started to recognize it: oh, the person they knew who had a vaccination, and they see that as causally determined. And you can show them as many statistics as you like, and all the studies that have been done in Scandinavia to prove that there is no relationship between the triple vaccine and autism. And yet people still draw that causal link because these things just happen to be close in time. That's an example of the co-occurrence of two events—vaccination and an increase in autism—which people say are directly related, and it's very difficult to dispel these sorts of beliefs. So yeah, there are situations where sometimes it's best to keep the information reined in rather than letting it spill out into the public and then eventually trying to get it back under control. It's just too difficult.
Katherine Mangu-Ward is an associate editor at Reason.