Don't Blame Self-Driving Cars for Accidents Caused by Humans
The Arizona crash was caused by two human drivers, at least one of whom ran a red light. The car was just in the wrong place at the wrong time.

A car accident in the Phoenix suburbs that otherwise wouldn't have made even the local news has become a national story because of the involvement of an autonomous, or self-driving, car.
Involvement is the key word, because the self-driving car was an innocent bystander in a crash that was entirely the fault of two human-operated cars. But you wouldn't know that from some of the headlines spattered across social media over the weekend, often accompanied by pictures of the dented Chrysler Pacifica sporting the blue-and-green Waymo logo.
#BREAKING: @Waymo self-driving vehicle involved in crash in Chandler. Latest info: https://t.co/fLkpcwB8rj #abc15 pic.twitter.com/jPrJUzMDJk
— ABC15 Arizona (@abc15) May 4, 2018
There have now been two vehicle accidents involving autonomous vehicles in the Phoenix area.
https://t.co/4RPrghYUyj
— Digital Trends (@DigitalTrends) May 6, 2018
As for what actually happened, here's how the Chandler police described the Friday afternoon accident in their official report:
This afternoon around noon a vehicle (Honda sedan) traveling eastbound on Chandler Blvd. had to swerve to avoid striking a vehicle traveling northbound on Los Feliz Dr. As the Honda swerved, the vehicle continued eastbound into the westbound lanes of Chandler Blvd. & struck the Waymo vehicle, which was traveling at a slow speed and in autonomous mode.
And if you're skeptical of police officers' ability to tell the honest truth, dashcam video from the minivan confirms their account.
As both the police report and the dashcam video show, the accident was caused by the darker car attempting to enter Chandler Blvd., forcing the lighter-colored car to swerve out of the way at the last second. It's not clear from the video which vehicle had the green light.
One thing that is clear about this is that the self-driving car simply happened to be in the wrong place at the wrong time. If the Waymo minivan had been parked on the side of the road when the crash occurred, it would have played the exact same role in what happened.
Local news playing a car accident for clicks and "likes" is to be expected, I suppose, but that sort of coverage has real consequences. First, there is the strong implication—if not an explicit message—that these self-driving cars are some sort of danger to public safety. Readers are quick to draw that conclusion, at least based on the responses to tweets like the ones above.
Second, that fear metastasizes into policy. As two writers at Wired put it, Friday's crash is "threatening to resurrect tough questions about the safety of autonomous technology and rip the barely-crusted scab off the technology's reputation."
That would be like blaming someone sitting in your back seat for an accident that happens three cars in front of you. No reasonable person can look at Friday's crash and conclude that it should raise any questions—tough or otherwise—about the self-driving minivan. What was the Waymo car supposed to do? Apparate to avoid the oncoming, swerving, human-operated car?
What this minor accident in Arizona really shows is that human beings are pretty shitty drivers. We make mistakes like pulling into oncoming traffic. Around 100 people lose their lives every day in car crashes in America, and about 90 percent of all car accidents (including the one in Chandler) are the result of human error. Americans spend $230 billion annually to cover the costs of accidents, accounting for approximately 2 to 3 percent of the country's GDP.
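As a rough sanity check on those figures, here's a back-of-the-envelope sketch. One assumption is mine, not the article's: the $230 billion estimate is generally traced to a NHTSA study from around 2000, when U.S. GDP was roughly $10 trillion, which is the denominator used here.

```python
# Back-of-envelope check of the crash statistics above. The GDP value
# is an assumption: the $230 billion estimate is usually attributed to
# a NHTSA study from around 2000, when U.S. GDP was ~$10 trillion.

deaths_per_day = 100
annual_deaths = deaths_per_day * 365          # ~36,500 deaths per year

crash_costs = 230e9                           # $230 billion annually
assumed_gdp = 10e12                           # ~$10 trillion (assumed)
gdp_share = crash_costs / assumed_gdp * 100   # ~2.3 percent

print(f"Annual road deaths: ~{annual_deaths:,}")
print(f"Crash costs as a share of GDP: ~{gdp_share:.1f}%")
```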
Autonomous cars won't be perfect, and they should face criticism when it's appropriate, but there is a humongous margin for self-driving vehicles to be imperfect but still better than human drivers. Within that margin, cars like the ones Waymo is currently testing will literally save lives.
Even when they are at fault for accidents—and sometimes they will be—we should keep testing self-driving cars. That is the only way to find out if they can indeed be safer than human-driven cars, and it is the only way the technology will improve.
Unless politicians get in the way, that is. And nothing makes politicians more likely to overreact to a perceived threat than a bunch of garbage journalism that inflates a potential threat—see: terrorism, human trafficking, letting your child play outside—like the coverage of Friday's accident.
One thing that is clear about this is that the self-driving car simply happened to be in the wrong place at the wrong time.
So by your own admission the accident wouldn't have happened if the self-driving car wasn't on the road.
Checkmate.
If the Waymo minivan had been parked on the side of the road when the crash occurred, it would have played the exact same role in what happened.
So no matter what it would have been an accident involving an autonomous vehicle. You just confirmed how dangerous these things are!
That's it, back to horses, it's the only way we can be safe.
Self-driving robot horses? I am down.
Fool. Tonight, in the Chandler Police impound lot, the Waymo wreck's liquid metal will melt together and re-form into a darling brown Spotted Saddle Horse named "Gus." It will blend in with the rest of the Phoenix Mounted Patrol detail, and trot out on the next shift, in search of human blood. And you will RUE the day you were down with self-driving robot horses. You. Will. RUE it.
/insert Westworld joke
Have you ever fallen off a horse at full gallop?
No, but Christopher Reeve did.
Thankfully, the killer robot car did not claim a human life in this incident. I hope it was damaged beyond repair so that its career of un-manned terror-driving is permanently over. Waymo? How 'bout WAY NO!
Yeah, we can't let these robot cars become a common menace to life like guns have become.
I'm more concerned about them becoming ubiquitous, and then mandated by government. As another means to control our movement.
Yet SIV continues to support the development of robot chickens. As long as they are cybernetically enhanced to incorporate real, flesh-and-blood cloacas. Silicone cloacas just don't feel natural to him.
Yep the old "I don't understand it, so it should be banned" line.
Later reports say the Waymo van was being driven by its human driver. Now let me explain why this is a bad-bad (lose-lose) situation for Waymo.
1. If the human was driving, it shows they still do not trust their software to operate by itself, with no human intervention. Waymo says they are testing an entirely driverless system (and brags about launching a fully commercial service sometime this year), but still relies on a human sitting in the driver's seat to take control if their imperfect and unreliable software fails at driving.
2. Despite the fact that Waymo and the cops just changed the story by saying there was a human driving the Waymo car, the company still has to release a VIDEO showing that driver controlling the car before and at the time of the collision. Why? Because Waymo makes official statements blaming their drivers (as in the situation when their car ran a red light but they later "officially" blamed the driver for it) but never posts dash cam interior video (like Uber showed for the Elaine Herzberg fatal accident) clearly showing their drivers doing what Waymo says they were doing (driving the cars). My personal suspicion comes from the fact that initially, Chandler police said the car was in self-driving mode, probably because when they asked the human monitor inside the car what happened, the first thing she said was that the car was in "autonomous mode."
It is impossible for that human to make such a fundamental mistake and say the car was in autonomous mode while she was actually driving it. Without that essential information, the police probably would have said nothing, choosing to let people know they were still investigating the accident.
3. If the human was driving the car, then out of self-preservation, and because the other car was clearly in the monitor's field of vision, that human would have reacted even if the collision was not avoidable, trying to get away from the oncoming car by moving the steering wheel to the right. I agree the reaction time would have been insufficient to SAVE it, but the driver would have had plenty of time to TRY to move out of the other car's collision path.
4. If the car was in autonomous mode, the robot simply FAILED to properly react and try to save it.
I continue to be amazed by the opposition of many on here to self-driving cars. I am trying to figure out the psychological reason for this. I suppose it has to do with a fear of losing control. I wonder if the same people also don't trust other types of automation. Or technology in general, since so much of it is about automating manual tasks.
I'm all for it, so long as the government isn't running it.
I am amazed you pretend to be an enthusiast but fail to understand that "automation" and "autonomy" are two very different things. For more info check this out - http://www.thedrive.com/opinio.....el-x-crash
Lol. Human Driving Association. Yeah, that dude does not have an agenda at all. I am honestly starting to think the opposition to self-driving cars is religious at the root. It's as if a car being able to drive autonomously means there is no soul.
If you look under my car you'll see dozens of little orphan feet sticking out through the floorboards like a Flintstone car. I don't trust those infernal combustion engine things to actually work as well as human feet for reliable transportation.
Very few people oppose the invention and existence of self-driving cars, or their market availability as a choice. Of course, this is a pure strawman you totalitarians will continue beating up on.
What many of us do oppose is the true endgame goal of totalitarians such as yourself, Ron Bailey, and Eric Boehm, which is to eventually mandate that all cars be self-driving and remove the choice of operating our own vehicles.
Truly, mandates shouldn't be needed if the automaticomobiles really work. They could easily evade human drivers.
When did I ever say that, you numbskull?
Except that all you people choosing to drive your own cars are increasing the danger to my life. I couldn't care less what you do to yourself, but your rights end when you start infringing on mine.
Worse, as autonomous vehicles become more common, human drivers will become even more aggressive than they are now, because they will quickly figure out that AVs will react defensively.
So in the future you want other people to be banned from driving because your robot car drives like a bitch? Seems like that is a problem with your robot car.
I suppose it has to do with a fear of losing control.
Speaking only for myself, I enjoy driving. Sometimes I just get in my car and drive around for a while. Sometimes my wife comes with me. I don't want to be stuck having to schedule a spur-of-the-moment outing, and then not even be able to take part in the driving.
I understand that for some people driving is such a god awful chore that not having to do it anymore can't come fast enough. I'm not one of those people.
"I suppose it has to do with a fear of losing control."
Yes, it does, but not in the manner you're assuming.
Autos were and are one of the most liberating machines ever invented, and when the government does (as it *WILL*) take control of the autonomous cars, I'll say you've 'lost control'.
Of where you want to go and when.
Let's put it the other way: I'm amazed at those here who are naive enough to think this will not end up a government-run transport system.
For your own good, of course...
Well, let's get rid of the Internet, then, because the government continues to control various aspects of our use of the Internet.
Such as who can use what websites when?
And co-opt your internet so you can't use it during emergencies...
So because you're afraid, I can't have a time-saving tool.
Sounds legit.
Why is your time inherently more valuable than my preference?
Valuable to who?
Exactly.
A friend of mine was in Vegas for a tech conference. He got to talking to someone involved in the self-driving car industry. My friend asked, "How does your car handle in snow?"
The industry guy laughed and said, "Oh, we don't have snow in Vegas."
The number one problem is, of course, government and privacy.
Here are two ideas that are not so far-fetched, and almost certain to be requirements for getting an autonomous vehicle approved for sale.
Government mandates that vehicles be subject to 3rd-party control.
So, let's say state/local/federal government for whatever reason wants no one to be out driving, be it weather-related, a visiting dictator (foreign or domestic), arbitrarily deciding there are too many vehicles downtown, you've exceeded your mileage ration, whatever.
Because of the mandated 3rd-party control the vehicle was required to include, the manufacturers/government will be able to target all vehicles in a given area and prevent them from operating.
Government will also, as a way of getting around the 4th Amendment, mandate maintaining a location history.
Thanks to the third-party doctrine, they could subsequently search every vehicle's history without a warrant. Much more effective than ALPR.
In all probability, both of these will happen by rulemaking, not the passage of a law. Elected politicians know that both would be quite unpopular, so unelected bureaucrats who don't have to face voters will do it for them, giving them the easy out of "the experts say we should do this."
^ This.
I listened to a podcast (Slate Money) the other day that featured an interview with a woman who had just written a book making arguments against self-driving cars. Maybe, hopefully, the book has better arguments than she gave in the interview. Part of her argument against advancing technology was "this one time I was in the airport, and all three of my devices ran out of power at the same time."
It's Luddism. Fear of the unknown. Pure and simple.
They hate disabled people who can't drive cars.
"It's Luddism. Fear of the unknown. Pure and simple."
In some cases, I'm sure, but see just above.
No, thanks.
It's Luddism. Fear of the unknown. Pure and simple.
Right. The objections to electric and autonomous cars couldn't have anything to do with Google or Elon Musk.
I've repeatedly called out Ron Bailey and others who *keep* saying that electric cars will reduce the need for road space and parking when their own data shows the gains to be between minimal and insignificant.
What's the opposite of Luddism? Progressivism.
What's the opposite of Luddism? Progressivism.
Smart gun technology can't be marketed effectively because backwards, luddite gun owners cling to the false belief that old-fashioned firearms are safer and more effective.
There are a lot of people against self-driving cars who don't have rational reasons.
I believe the technology will work, but it's going to be much further out than the industry predicts, and it won't look like it does in its current state.
And yes, there will be government involvement, if for no other reason than to build and standardize the road infrastructure that will "aid" self driving cars in navigating certain thoroughfares.
I believe the technology will work, but it's going to be much further out than the industry predicts, and it won't look like it does in its current state.
It's not even really a belief; no one seriously believes we'll have true autonomous vehicles "in the next five years" except people who look no deeper than the hype. Notably, those same people thought we'd have self-driving cars five or ten years ago.
The people I feel sorry for are the ones who still think flying cars are five years away. They actually already exist, but much like autonomous vehicles they're impractical and expensive, and the government won't let you fly them. Their lobbying efforts have been, let's say, far less effective than those of the people who want you to believe "machine learning" ends with a computer that's a better driver than you.
Now, I'm sure such an autonomous vehicle will be better than the lowest rung of human drivers. That much is probably true. But I doubt it's better than a decent driver, and of course central planners like to use the bottom rung as their metric for what to force on everyone else.
Of course they will be better than a decent driver because they will all be talking to each other.
The people I feel sorry for are the ones who still think flying cars are five years away. They actually already exist, but much like autonomous vehicles they're impractical and expensive, and the government won't let you fly them. Their lobbying efforts have been, let's say, far less effective than those of the people who want you to believe "machine learning" ends with a computer that's a better driver than you.
I'm not going to google it because I'm lazy, but there is a company that brought a flying car to market. Moller has been futzing around with its Skycar for decades, trying to build a car that flies. Another company produced a plane that drives. *Boom* on the market.
I don't know if they're successful or still in business, but they got approval and were selling units. Way less sexy than the Moller, but the solution was about like the Segway conundrum. *bam* third wheel!
Now, I'm sure such an autonomous vehicle will be better than the lowest rung of human drivers. That much is probably true.
It has the potential to be much better than a human driver, but the keyword here is "potential". Until they solve several of the most vexing problems that currently have "workarounds" in place, it's going to be a good long while.
What's interesting from the video is that you clearly see the car cross the median and head straight for the driverless vehicle. There wasn't a lot of time, and a LOT of human drivers probably wouldn't have reacted in time.
I think we're getting enough data from self-driving cars that a study could be done where human drivers are put through various simulations and compared to driverless vehicles in scenarios where the accident is caused by another vehicle. What percentage of randomly chosen human drivers could avoid the accident, versus the current self-driving technology?
You'd have to structure the tests fairly so the human test subjects weren't "primed for an accident."
There is a concept in tort law called last clear chance. Just because someone else is negligent does not relieve you of the duty to avoid an accident where possible. Here, there is no question that the human-driven car swerved into oncoming traffic and was negligent. The company and the fanboys of this technology act like that is the end of the issue. It is not. It is not clear to me at all that the robot car didn't have a chance to avoid the accident. I think most human drivers, if they were paying attention, would have. The fact that the robot car did not tells me that these things are not nearly as safe as claimed. It is not enough not to hit other cars. You have to avoid other cars when they are negligent as well, or you are not a safe driver.
Also, isn't the point of an autonomous vehicle the very fact it can react in nanoseconds instead of seconds in the first place?
I mean, developing a car that can drive itself about as well as your grandmother isn't what's being promised here, right?
It can react but it has to first perceive what is going on. And that is a lot harder than they are letting on.
That is a good point. An alert human driver might have had a faster perception of that incoming dangerous situation, though their overall reaction times are much slower.
From a purely academic standpoint, I wonder how much better a human would have performed. There seems to be more than 1 but less than 2 seconds from the moment the lighter-colored vehicle swerves and the end of the video (which is basically where the collision occurred). That's enough time for a human to recognize and react, but they probably wouldn't have had any effect on the outcome. I suspect it balances out, since a human driver could have done something stupid and made the situation worse.
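For a rough sense of that window, here is a sketch using assumed, commonly cited values rather than anything from the crash report: traffic engineers often use a perception-reaction time of about 1.5 seconds, and at arterial-road speeds a car covers a lot of ground in that time.

```python
# Rough sketch of how far a car travels during a human perception-
# reaction window. All values are illustrative assumptions, not
# figures from the Chandler crash report.

MPH_TO_MPS = 0.44704         # miles per hour -> meters per second

speed_mph = 40               # assumed speed on an arterial road
reaction_s = 1.5             # commonly cited perception-reaction time

speed_mps = speed_mph * MPH_TO_MPS     # ~17.9 m/s
distance_m = speed_mps * reaction_s    # ~27 m

print(f"At {speed_mph} mph, ~{distance_m:.0f} m pass before the driver "
      f"even begins an evasive maneuver.")
```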
Autonomous cars won't be perfect, and they should face criticism when it's appropriate, but there is a humongous margin for self-driving vehicles to be imperfect but still better than human drivers. Within that margin, cars like the ones Waymo is currently testing will literally save lives.
But it's also entirely possible for self-driving vehicles to be imperfect and worse than human drivers. And one way they may turn out to be worse is in not avoiding accidents (or even causing accidents) in which they are not legally at fault. For example: http://www.thedrive.com/sheetm.....e-too-well
I'm not sure where the confidence comes from that, of course, self-driving cars will be better and get in fewer accidents per million miles than human drivers do. Personally, I wouldn't bet on it.
I would not either. And the other thing is that the proponents of this technology pull some real sophistry when they talk about the risk of human drivers. They always point to the average and then pretend that is the risk for every human driver. No, that is the aggregate. Most accidents are caused by new drivers, drunk drivers or people driving recklessly. If you don't do any of those things, sure you can still be in an accident, those drivers sometimes hit safe drivers, but your chances are much lower.
Driving is a very safe activity. I have not been in an accident of any kind this century. I have driven 250,000 or so miles without a single accident. Just how much safer can these things make me? Not very. And chances are they will make me less safe. Unless you are unable to drive, are Beavis and Butt-Head, or a degenerate drunk, these things are not going to make you safer.
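To put a number on the aggregate-versus-individual point above, here's a toy sketch. The subgroup shares and per-group rates are made up, chosen only to show the shape of the argument, not to describe real accident data.

```python
# Toy model of how an "average" crash rate can overstate the risk
# faced by careful drivers. The shares and per-group rates below are
# hypothetical, not real data.

groups = {
    # name: (share of all miles driven, crashes per million miles)
    "new/impaired/reckless": (0.20, 8.0),
    "careful":               (0.80, 0.5),
}

aggregate = sum(share * rate for share, rate in groups.values())
print(f"Aggregate rate: {aggregate:.1f} crashes per million miles")
# -> 2.0 per million miles overall, even though 80% of miles are
#    driven at a rate of only 0.5.
```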
Exactly. I am 47 and I have never been in an accident, and I have driven to every state in the US and drive for a living. I drive at least 600 miles a week. I have not even had a ticket since 1996.
Self-driving cars have already proven themselves to be poor at accident avoidance; hence, a lot more research remains to be done.
And they have solved all of the easy problems. The rate of improvement has slowed to a standstill. They may not get any better than they are now for a very long time.
Well, a lot of the prospects early on relied on smart road technology and massive adoption. They were touting how networked vehicles and roads would invariably reduce fatalities worldwide while hackers at DEF CON were showing how trivially easy it was to hack the shit out of wireless networks, including cars.
Well, a lot of the prospects early on relied on smart road technology and massive adoption.
hint: For a massive rollout of driverless vehicles, they're going to have to get some smart road technology and widespread adoption with some sort of "national standards" stuff like ANSI or IEEE or ISO. Then all the manufacturers will have to adopt that standard. I almost guarantee it.
hint: For a massive rollout of driverless vehicles, they're going to have to get some smart road technology and widespread adoption with some sort of "national standards" stuff like ANSI or IEEE or ISO. Then all the manufacturers will have to adopt that standard. I almost guarantee it.
Agreed. The only way I could see an alternative would be with some manner of Google Glass or automated-assistant technology that is carried or worn as you enter and exit the vehicle. We'd likely still end up in the same place, but the driver would be much more in control of what standards went into the device and trickled down into his/her driving.
Regardless of how perfectly they drive, self-driving cars give control of where you want to go over to the government and tech companies, taking away your freedom.
Yes. You are giving up control of the manner in which you drive as well as all of your privacy and ultimate control over the situation. How Libertarians can think that is a great idea is beyond me.
How Libertarians can think that is a great idea is beyond me.
I could get it if they were all in on open-source self-driving software, standards, and/or protocols, but that's a horrible horrible idea to them (How would we hold cars liable?).
They're generally all in on self-driving cars as an end product so that the experiments in socialism can begin (continue).
Self-driving cars offer no pretext for pseudoscientific conservative prohibitionists to have their goons holler "probable cause," point guns in your face and confiscate your assets.
The car can't react to avoid accidents, and that will never change.
Red. Barchetta.
The car is a better driver than 99% of the human population. The only human who might be able to beat it is a stunt driver or an expert defensive-driving instructor.
Apparate?