In Search of Free Will and Moral Responsibility

A review of Who's in Charge? Free Will and the Science of the Brain


In Who's in Charge? Free Will and the Science of the Brain, University of California, Santa Barbara cognitive neuroscientist Michael Gazzaniga sets off in search of free will. After four chapters of digging for a useful theory of mind among the neurons, his results are disappointing. Far more impressive, however, is his intriguing and persuasive treatment of the moral implications of modern neuroscience. Metaphysics aside, what really matters is that people believe we have free will.

Gazzaniga convincingly argues that morality is an emergent property of minds (brains) interacting with one another. His discussion of the evolution of human sociality is fascinating. Over the eons humans have changed their physical and social environments, which in turn has shifted the sorts of genes, behaviors, and brains that successfully reproduce in a generally more cooperative direction. Gazzaniga cites the hypothesis of primatologists Brian Hare and Michael Tomasello who suggest that humans may have undergone a process of self-domestication in which overly aggressive or despotic individuals were reproductively weeded out—by being ostracized or killed by the group.

What Gazzaniga really fears are the potentially baleful effects of neuroscientific findings on our notions of personal responsibility. In general, the concept of free will is closely connected to the concept of moral responsibility. If neuroscience shows that we are in thrall to our neurons, then how can we be held responsible for our actions, both condemnable and praiseworthy?

Gazzaniga is right to worry. He persuasively cites a 2011 study in which researchers found that inducing disbelief in free will decreased helpfulness and increased aggression among experiment participants. He also notes other recent studies reporting that people were more likely to cheat in psychological experiments after reading passages that encouraged a belief in determinism. The researchers note with irony, "Perhaps, denying free will simply provides the ultimate excuse to behave as one likes." The results also led the researchers to worry that "if exposure to deterministic messages increases the likelihood of unethical actions, then identifying approaches for insulating the public against this danger becomes imperative." Surely they can't mean that wise sages must tell a "noble lie" about free will in order to keep the plebes in line?

"Is accountability what keeps us civilized?" asks Gazzaniga. He pretty clearly believes the answer is yes, and he fears that if people don't believe in free will, they will no longer hold themselves or others accountable for their actions. This is dangerous because numerous psychological experiments have shown that punishment is the key to cooperation. If free riders cannot be punished, the networks of cooperation that underpin civilization break down. Of course, the worst type of free riding is criminal behavior, e.g., robbing, assaulting, and murdering others for personal gain.

"Our current legal system has emerged from innate intuitions, honed by evolution, just as our moral systems have been," argues Gazzaniga. He notes that people often cite different rationales for punishing others, e.g., retribution, utility, and restoration. Retribution punishes offenders in proportion to the moral magnitude of the harms they committed. Utility justifies punishment as incapacitation or deterrence. And restorative justice extracts reparations to victims from offenders. Research shows that while many people talk deterrence, when asked to judge how criminals should be punished the vast majority of people opt for retribution. People intuitively want to hold criminals personally responsible for their actions.

In fact, Gazzaniga's fears about how neuroscience might be (mis)used in the courts have begun to be realized. Defense lawyers are beginning to claim the equivalent of: "Members of the jury, my client is innocent because his amygdala made him kill his wife." For example, brain scans were used to get convicted murderer Simon Pirela off death row in Pennsylvania in 2004. His attorneys argued that the scans showed that Pirela had aberrant frontal lobes in his brain.

Gazzaniga illustrates how he thinks courts misunderstand what neuroscience says about personal responsibility with the case of Atkins v. Virginia (2002). In that case the U.S. Supreme Court decided that a murderer should not be executed because he was deemed mentally retarded. Daryl Atkins and an accomplice drove to a convenience store planning to rob a customer. They abducted a victim at gunpoint, drove him to an automated teller machine where they forced him to withdraw $200, and then drove to a deserted area where Atkins shot him to death.

Neuroscience gives us no reason not to hold criminals like Atkins responsible, asserts Gazzaniga. "Responsibility reflects a rule that emerges out of one or more agents interacting in a social context, and the hope that we share is that each person will follow certain rules," he argues. "An abnormal brain does not mean that the person cannot follow rules." As Gazzaniga points out, Atkins' brain was functional enough that he could follow rules. Atkins clearly knew that he shouldn't rob and kill people. That explains why he inhibited his murderous actions until he was in a deserted area.

In the context of our legal system, Gazzaniga shows that neuroscience has lots to tell us about the unconscious biases found in judges, juries, prosecutors, and defense attorneys, about the reliability of eyewitness testimony and lie detection, and even about our motivations for punishment. But neuroscience cannot absolve us from responsibility for our actions. Who's in charge? Each one of us is.

Ronald Bailey is Reason's science correspondent. His book Liberation Biology: The Scientific and Moral Case for the Biotech Revolution is available from Prometheus Books.