First Self-Driving Car Death
Tesla S car runs into a tractor trailer in Florida; still safer than human-driven cars

Almost two months after it happened, the Associated Press is reporting that the driver of a Tesla Model S who was apparently riding in "Autopilot" mode was killed when his car crashed into a tractor-trailer on a highway in Florida. The truck driver claims that the driver was "playing Harry Potter on the TV screen" at the time of the crash and driving so quickly that "he went so fast through my trailer I didn't see him." There are reasons to doubt that account, as Tesla Motors notes that it is not possible to watch videos on the Model S touch screen. According to the AP, investigators believe that the car's cameras "failed to distinguish the white side of a turning tractor-trailer from a brightly lit sky and didn't automatically activate its brakes."
The current version of Tesla's Autopilot reminds drivers to keep their hands on the wheel and remain alert at all times. This is the first known death in over 130 million miles of Tesla Autopilot operation. For reference, the Insurance Institute for Highway Safety calculates that in the U.S. there are 1.08 deaths per 100 million vehicle miles traveled. As I reported in my June Reason feature article, "Will Politicians Block Our Driverless Future?," a Virginia Tech Transportation Institute study commissioned by Google estimated in January 2016 that human-driven vehicles crash 4.2 times per million miles traveled, whereas current self-driving cars crash 3.2 times per million miles, a safety record that's likely to keep improving as robocars gain more real-world experience on the roads.
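For readers who want to check the arithmetic, here is a rough back-of-the-envelope comparison of those figures, written as a small Python sketch. It is illustration only: treating a single fatality over roughly 130 million miles as a stable rate is statistically shaky, and the mileage and crash-rate numbers are simply the ones quoted above.

    # Rough comparison of the figures quoted above.
    # Caveat: one fatality in ~130 million miles is far too small a sample
    # to treat as a stable rate; this is illustration, not analysis.

    autopilot_deaths = 1
    autopilot_miles = 130e6       # miles logged on Tesla Autopilot (article figure)
    us_rate_per_100m = 1.08       # IIHS: U.S. deaths per 100 million vehicle miles

    autopilot_rate_per_100m = autopilot_deaths / (autopilot_miles / 100e6)
    print(f"Autopilot:        {autopilot_rate_per_100m:.2f} deaths per 100M miles")
    print(f"All U.S. driving: {us_rate_per_100m:.2f} deaths per 100M miles")

    # Crash rates from the Virginia Tech study cited above
    human_crashes_per_million = 4.2
    robocar_crashes_per_million = 3.2
    print(f"Crash-rate ratio (human / self-driving): "
          f"{human_crashes_per_million / robocar_crashes_per_million:.2f}x")

On those numbers Autopilot comes out at roughly 0.77 deaths per 100 million miles against the 1.08 national average, and the human-to-robocar crash ratio at about 1.3x, with all the small-sample caveats already noted.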
Demanding that self-driving vehicles be perfect in comparison to human-driven vehicles is clearly not the right standard. Better is more than good enough.
Was gonna post this later in one of the links threads. Tack 10 years onto any previous time estimate for wide adoption of self-driving vehicles.
P: I think you're way too pessimistic. The self-driving services (Uber w/o drivers) will be so convenient AND so much safer that Americans will embrace them pretty fast when they become available.
I'm dubious of the Uber application. Won't this necessitate the vehicles, after they've dropped off one fare and are on the way to pick up the next, operating with no human aboard? On streets where my precious little snowflake is playing hopscotch or jacks or craps or whatever?
"Welcome to the World Series o' Dice!"
Don't be silly, your kid isn't going to be allowed outside without five government nannies there to chaperon it. Jeez, what do you think this is, Somalia?
Until the self-driving car gets into a situation where it is programmed to murder you rather than the other vehicle because the other vehicle has more people in it.
Doesn't matter what people want, Ron. The nannies already wish they could get rid of Uber based on a ridiculous safety argument. It will be a long time before they let us have full self-driving cars. Safety (or a false sense of safety) plus putting truckers/cabbies out of work will slow this thing down for decades.
Legal liability will help. Like it or not, our system does require that someone be held accountable for things that can't just be called an act of god.
The first time an owner of a self-driving car tries to use the excuse 'don't blame me, it's my car's fault - I was just texting while passengering' when their car kills someone else, we will see the real test. If that defense works, then the legal system will become 'the poor are legally responsible for their actions - but the rich aren't legally responsible for anything.' If the defense doesn't work, then 'self-driving' cars will be seen in a very different light than just some technological geegaw.
The owner of any property which causes harm is responsible.
There is a HUGE difference between self-driving and driver-assisted. Google explicitly commented on this during their TED talk. Driver stupidity will always outpace driver assistance. That is why Google only focuses on totally self-driving cars. Unfortunately, Tesla's irresponsibility with their driver-assistance program will set back Google's much better autonomous vehicle program.
Car manufacturers' approach is developing cars that have computers.
Google and other tech companies' approach is developing computers with wheels.
It appears that Tesla, like Microsoft, is using its customers to ferret out the bugs in its products.
How is Tesla irresponsible? They tell the beta testers to keep their hands on the wheel and to pay attention.
If prior research clearly shows that most humans will not follow Tesla's advice in required/dangerous situations (and such research exists), and only by following that advice would humans be safe, then I think Tesla, at minimum, is guilty of negligence.
A look at the state of our self-driving future.
"failed to distinguish the white side of a turning tractor-trailer from a brightly lit sky and didn't automatically activate its brakes."
Something any attentive human would do.
Better is more than good enough.
The driver shouldn't have put his faith in an electric deathtrap.
S: No, he shouldn't have. As I reported in my June self-driving feature article:
[University of Texas researcher] Kockelman argues that semi-autonomous vehicles, or what NHTSA calls "limited self-driving automation," present a big safety problem. With these so-called Level 3 vehicles, drivers cede full control to the car for the most part but must be ready at all times to take over if something untoward occurs. The problem is that such semi-autonomous cars travel along safely 99 percent of the time, allowing the attention of their bored drivers to falter. In an August 2015 study NHTSA reported that depending on the on-board alert, it took some drivers as long as 17 seconds to regain manual control of the semi-autonomous car. "The radical change to full automation is important," argues Kockelman. "Level 3 is too dangerous. We have to jump over that to Level 4 full automation ....
Who at Tesla thought this was a good idea? The sensor system seems very limited, and the whole point of an autopilot car should be to enable the driver to do something else. Watching the road on a long drive is mind-numbing enough when a person has full control of the vehicle; when you are not even doing that, the temptation to focus on something else is too much.
MR: I think "Autopilot" is more of a metaphor, and you've basically restated the point made by Kockelman in my article.
I don't think my original post expressed how angry this makes me at Tesla. Putting a half-assed system like this on a high-profile vehicle sets back public acceptance of a properly done self-driving system. Self-driving cars may not be dangerous as a concept, but Tesla's version is dangerous.
Elon Musk. That's the problem, he needed to be firsties so he could go on TV and remind everyone how awesome he is. I hope he's held fully accountable.
As stated elsewhere semi-autonomy is super dangerous because it's good enough for the human to check out most of the time... which they do. This has nothing to do with the fully autonomous cars Google, Ford and others are testing.
drivers cede full control to the car for the most part but must be ready at all times to take over if something untoward occurs.
I would think this more stressful and tiring than actually driving the car. Any studies, Ron?
^This.
What is the purpose of a 'self-driving' car if you still have to do all the same things you had to do with a regular car (except physically moving the levers)? I fail to see any utility.
It's a beta test.
"failed to distinguish the white side of a turning tractor-trailer from a brightly lit sky and didn't automatically activate its brakes."
It uses a camera for visual only? Yikes.
I want something that can see more than the visual spectrum on mine.
Right. Why not thermal, UV, ultrasound, Doppler, fuck every sensor you can get when rolling out a new system. It's not like tesla is shooting for the budget market.
Current autonomous vehicles use sensors like lidar to build up a 3D representation of the environment around them. With additional progress in the development of deep-learning AIs, they may be able to do this using only cameras. But they would still need something like radar to deal with rain, snow, etc.
John bait
Ha! Spot on.
The current version of Tesla's Autopilot reminds drivers to keep their hands on the wheel and remain alert at all times.
Then what the eff reason would a person have to shell out the extra dough for an autopilot-equipped vehicle?
I dislike driving as much as the next person, but this is going to be unworkable for the near future. MAYBE if all vehicles on the road were automated, but in a hybrid situation where human-controlled vehicles are interacting with automated ones, disaster. And who's going to invest in the technology when there's a good chance the first orphan bus that gets taken out will bring about a ban?
A friend with the Autopilot Tesla says it is more relaxing not having to constantly adjust the distance to the car ahead, and he would not go back for anything. But then he's not trying to read or anything while it works; he does pay attention. Reading between the lines, after the fact, I think I understand it: he can concentrate on the big picture instead of focusing on that intra-car distance. Much like a ship captain telling the helm what to do, rather than directly steering and ringing engine room bells.
Yeah I think I would only use it on the interstate and for parking. This case seems curious. It seems to me like the truck pulled out in front of him.
That is still a recipe for disaster, because being lulled by that following distance control while travelling at 60+ means "SUPR-".
And you're dead.
The difference here is that the Tesla isn't self-driving. It lacks the sensors and the testing to be self-driving. They shoehorned Autopilot onto a vehicle that wasn't designed to be automated (thus lacking the types of sensors you would really want to have) and then assumed an unrealistic use case... humans remaining fully engaged while having nothing to do most of the time... People developing fully autonomous vehicles are specifically attacking the problems you mention before turning the product loose; Tesla did not... this is the result.
What part of "current version" do you not understand?
Brad Templeton - The Future of Self driving Cars (YouTube)
Are all human-controlled vehicle "crashes" catastrophic? Until there is sufficient data, we really can't say whether rate X vs. rate Y is a valid comparison.
There is also the issue that humans may (or may not) be in a better position when it comes to making a spot judgment when there is no happy outcome available to the driver.
"Tesla S car runs into a tractor trailer in Florida - still safer than human-driven cars"
Looking at averages as a predictor of individual behavior is problematic.
I suppose from the bureaucrat's perspective the question might be about averages, but the interesting question to me isn't whether driverless cars are safer than human drivers on average. My question is whether driverless cars are safer than me.
The correct answer is that driverless cars have a woefully inadequate safety record compared to me on my motorcycle--especially since I outperform them on being able to avoid other drivers rear ending me.
I may only have 100,000 miles on a bike, and someone may argue that's an insufficient sample size compared to driverless cars, but remember, I'm not really concerned about whether driverless cars outperform the average driver. I just want to know if they outperform me--and I've never been in a motorcycle accident, and I've never been hit by a careless driver either.
Shultz wins again!
KS: Still, I wonder what your spouse's opinion of your driving might be? 🙂
Opinion vs facts
My wife's opinion of my driving would suggest a cavalier attitude towards safety.
The facts of my driving record suggest otherwise.
I just want to know if they outperform me--and I've never been in a motorcycle accident, and I've never been hit by a careless driver either.
The best way to avoid accidents is to assume everyone else on the road is retarded. My concern about driverless cars is the evidence I've seen seems to suggest that by design, the driverless car assumes everyone else is doing what they're supposed to be doing.
That's an oversimplification, I'll admit, but as I've said before, the capacity for human stupidity is limitless. Programming a computer to reasonably react to limitless stupidity is a non-trivial problem.
This got covered yesterday by yours truly. It's pretty clear that the Tesla 'autopilot' isn't "self-driving" in any way most people think of self-driving. It's really more a driving 'assist' feature, and according to Tesla, when you engage it, both hands are supposed to remain on the wheel at 10 & 2, feet on the pedals, and you're supposed to be vibrating in preparation for anything that might go wrong on the road.
According to NPR, the Youtoobz is full of people engaging the auto-pilot and doing 'dangerous' stuff.
> in the U.S. there are 1.08 deaths per 100 million vehicle miles traveled
This is a bit of a misuse of statistics. The accident took place on a highway. The accident rate (per mile) for highway driving is 10x lower than for non-highway driving.
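Here is a rough sketch of what that implies, in Python. It is illustrative only: the 10x figure is the one in the comment above, and the 50/50 split of national mileage between highway and non-highway driving is an assumption, not a measured number.

    # Illustrative only: why judging Autopilot (mostly highway miles) against the
    # all-roads average can mislead. The 10x non-highway/highway ratio comes from
    # the comment above; the 50/50 mileage split is an assumed placeholder.

    overall_rate = 1.08      # deaths per 100M miles, all U.S. driving (IIHS)
    highway_share = 0.5      # assumed share of national miles driven on highways
    ratio = 10               # assumed: non-highway fatality rate is 10x the highway rate

    # overall_rate = highway_share * h + (1 - highway_share) * (ratio * h); solve for h
    highway_rate = overall_rate / (highway_share + (1 - highway_share) * ratio)
    print(f"Implied highway-only rate: {highway_rate:.2f} deaths per 100M miles")
    # About 0.20 per 100M miles under these assumptions -- a much tougher benchmark
    # for mostly-highway Autopilot mileage than the 1.08 all-roads average.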
That was tragic
Well, that feature of Autopilot mode is not self-driving. It is just added assistance for normal driving, so don't be under the wrong impression that the car was driving itself. Tesla is using the best available technologies in building its Autopilot mode so as to further reduce road accidents. The mistake, I think, was on the part of the trailer driver. But we don't have to worry either, because Tesla is looking forward to making autopilot trucks and trailers, which could benefit auto shipping contractors whose routine work is transporting vehicles from one place to another. Then such careless trailer drivers would probably get warning signals from the autopilot trailers!