Self-driving vehicles

Don't Blame Self-Driving Cars for Accidents Caused by Humans

The Arizona crash was caused by two human drivers, at least one of whom ran a red light. The car was just in the wrong place at the wrong time.


A car accident in the Phoenix suburbs that otherwise wouldn't have even made the local news has become a national story because of the involvement of an autonomous, or self-driving, car.

Involvement is the key word, because the self-driving car was an innocent bystander in a crash that was entirely the fault of two human-operated cars. But you wouldn't know that from some of the headlines splashed across social media over the weekend, often accompanied by pictures of the dented Chrysler Pacifica sporting the blue-and-green Waymo logo.

As for what actually happened, here's how the Chandler police described the Friday afternoon accident in their official report:

This afternoon around noon a vehicle (Honda sedan) traveling eastbound on Chandler Blvd. had to swerve to avoid striking a vehicle traveling northbound on Los Feliz Dr. As the Honda swerved, the vehicle continued eastbound into the westbound lanes of Chandler Blvd. & struck the Waymo vehicle, which was traveling at a slow speed and in autonomous mode.

And if you're skeptical of police officers' ability to tell the truth, dashcam video from the minivan confirms their account.

As both the police report and the dashcam video show, the accident was caused by the darker car attempting to enter Chandler Blvd., forcing the lighter-colored car to swerve out of the way at the last second. It's not clear from the video which vehicle had the green light.

What is clear is that the self-driving car simply happened to be in the wrong place at the wrong time. If the Waymo minivan had been parked on the side of the road when the crash occurred, it would have played the exact same role in what happened.

Local news playing a car accident for clicks and "likes" is to be expected, I suppose, but that sort of coverage has real consequences. First, there is the strong implication—if not an explicit message—that these self-driving cars are some sort of danger to public safety. Readers are quick to draw that conclusion, at least judging by the responses to tweets sharing those headlines.

Second, that fear metastasizes into policy. As two writers at Wired put it, Friday's crash is "threatening to resurrect tough questions about the safety of autonomous technology and rip the barely-crusted scab off the technology's reputation."

That would be like blaming someone sitting in your back seat for an accident that happened three cars in front of you. No reasonable person can look at Friday's crash and conclude that it should raise any questions—tough or otherwise—about the self-driving minivan. What was the Waymo car supposed to do? Apparate to avoid the oncoming, swerving, human-operated car?

What this minor accident in Arizona really shows is that human beings are pretty shitty drivers. We make mistakes like pulling into oncoming traffic. Around 100 people lose their lives every day in car crashes in America, and about 90 percent of all car accidents (including the one in Chandler) are the result of human error. Americans spend $230 billion annually to cover the costs of accidents, accounting for approximately 2 to 3 percent of the country's GDP.

Autonomous cars won't be perfect, and they should face criticism when it's appropriate, but there is a humongous margin for self-driving vehicles to be imperfect but still better than human drivers. Within that margin, cars like the ones Waymo is currently testing will literally save lives.

Even when self-driving cars are at fault for accidents—and sometimes they will be—we should keep testing them. That is the only way to find out whether they can indeed be safer than human-driven cars, and it is the only way the technology will improve.

Unless politicians get in the way, that is. And nothing makes politicians more likely to overreact to a perceived threat than a bunch of garbage journalism that inflates a potential threat—see: terrorism, human trafficking, letting your child play outside—like the coverage of Friday's accident.