Your Self-Driving Car: Minimize Casualties or Save You?
What algorithms will actually save the most lives in the long run?
Should your self-driving car be programmed to make a utilitarian calculation that sacrifices you to save the lives of more numerous others when an accident is inevitable? University of California, Irvine psychologist Azim Shariff and his colleagues reported Americans' responses to this question in their study, "The Social Dilemma of Autonomous Vehicles," published in Science. First, they found that most survey respondents do want self-driving vehicles to be programmed to minimize casualties, even if that means sacrificing the vehicle's own passengers. How very utilitarian!
But here's the kicker: Most respondents also say that they would neither buy nor ride in automobiles run by algorithms that might sacrifice them to save the lives of others. If cars are programmed to make such utilitarian moral choices, that could well slow their public acceptance, which, paradoxically, would mean even more lives lost over time as error-prone humans continue to haplessly navigate highways and streets.
More than two years ago, in my article "The Moral Case for Self-Driving Cars," I cut this supposed Gordian knot of ethical perplexity. I pointed out that such fraught moral dilemmas are rare and will become even rarer with the advent of autonomous vehicles. Recognizing the natural instinct for self-preservation, I argued, "Although it may not be the moral thing to do, most drivers will react in ways that they hope will protect themselves and their passengers. So as a first approximation, autonomous vehicles should be programmed to choose actions that aim to protect their occupants."
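To make the contrast concrete, here is a minimal, purely illustrative sketch of the two policies at issue. The maneuver names and casualty estimates are hypothetical and are not drawn from any real autonomous-driving system; the point is only to show how the two decision rules diverge in the rare dilemma case.

```python
from dataclasses import dataclass

@dataclass
class Maneuver:
    """A hypothetical candidate action with estimated expected casualties."""
    name: str
    occupant_casualties: float  # expected harm to the car's own passengers
    other_casualties: float     # expected harm to pedestrians and other road users

def utilitarian_choice(options):
    """Minimize total expected casualties, even at the occupants' expense."""
    return min(options, key=lambda m: m.occupant_casualties + m.other_casualties)

def occupant_protective_choice(options):
    """Protect the occupants first; break ties by harm to others."""
    return min(options, key=lambda m: (m.occupant_casualties, m.other_casualties))

# Hypothetical dilemma: swerve into a barrier (sacrificing the passenger)
# or stay the course (striking several pedestrians).
options = [
    Maneuver("swerve_into_barrier", occupant_casualties=1.0, other_casualties=0.0),
    Maneuver("stay_in_lane", occupant_casualties=0.0, other_casualties=3.0),
]

print(utilitarian_choice(options).name)          # swerve_into_barrier
print(occupant_protective_choice(options).name)  # stay_in_lane
```

The survey respondents, in effect, endorsed the first rule for everyone else's car and the second rule for their own.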
Since self-driving cars will be safer than human-driven vehicles, the right thing to do is to encourage people to trust them so that their adoption and deployment will proceed as quickly as reasonably possible. Consequently, programming cars to preserve the lives of their passengers is the ethical thing to do.