The Slow Approval of Self-Driving Cars Is Costing Lives
Laws requiring a "driver" in driverless cars make as much sense as requiring a horse to be yoked to the front of an automobile, just in case.

A recent news story gave the anti-self-driving car pearl-clutchers some ammunition, although it was more comical than tragic. In West Hollywood last month, a goofy sidewalk-delivery robot collided with a robotaxi at around 4 miles per hour. There obviously were no injuries and neither vehicle was damaged, although writers had a field day depicting this as a preview of the coming dystopian war between robots.
Critics of autonomous vehicles, or AVs, can fixate on nonsense if they choose, but widespread legislative efforts to limit the development of these vehicles actually endanger us given that robots are much better drivers than human beings. They're not chatting on cellphones, fussing with the radio, or distracted by crying children. They are more attuned to surrounding traffic—and can "see" much farther ahead than the average driver. But fear of the unknown often drives policy.
Granted, I used to share the "isn't that cool, but I'm not getting into one of those" view as I watched Waymo robotaxis covered with futuristic-looking cameras and lidar sensors zip around San Francisco. Then I grabbed a ride in one in Phoenix and was shocked by how unshocking the experience was. It was like taking any taxi—except the car was a nice Jaguar, I could play my own music, and I didn't have to listen to the driver's political theories.
The taxi ride from my hotel to tour Waymo's operations center south of downtown was actually the bigger ordeal, as the human driver got lost and took me miles out of the way. The robotaxi drove carefully. I'm not saying things can't go wrong, but they can go wrong in any vehicle. Accidents in traditional taxis and other human-piloted vehicles aren't exactly unusual.
News reports also made a big deal out of a minor robotaxi accident in San Francisco, even though there were scores of human-caused car accidents the same day. The latest peer-reviewed research from Swiss Reinsurance Co. found that robotaxis "significantly outperformed…the overall driving population" with an 88-percent drop in property-damage claims and a 92-percent drop in bodily-injury claims. Insurance research tends to be extremely reliable because insurers need to know the risks when they set rates.
Nevertheless, lawmakers and regulators continually try to slow the adoption of self-driving vehicles. One key reason is unions, which fear the loss of jobs—especially in the trucking industry, taxi businesses, and transit systems. Beyond raw protectionism, the key stated concern is safety. But the data doesn't support the call for onerous regulation. Foes of these vehicles are trying to gin up unrealistic safety concerns to limit a life-saving advancement.
For instance, many states are pushing "driver-in" laws that require a human driver in the driver's seat, which defeats the purpose of the technology. It's as if early 20th-century legislators required a horse to be yoked to the front of emerging automobiles just in case. Others ban AVs for interstate commerce. This restricts the industry's expansion, forcing companies to spend time lobbying rather than creating a new industry.
In the new legislative session, California lawmakers are expected to revive a bill (Senate Bill 915) that would let larger municipalities impose their own limits on AVs. These include permitting rules, caps on the numbers of such vehicles, and mandatory inspections. The state Department of Motor Vehicles and California Public Utilities Commission already regulate AVs, so don't buy the nonsense that they are somehow unregulated. This measure is pernicious given that vehicles would have to comply with vastly different laws as they crossed city lines.
This regulatory situation reminds me of debates about the federal Food and Drug Administration, which is responsible for approving medicine and medical devices. Obviously, the nation needs some regulation to protect against dangerous products, but the FDA is known for its excruciatingly slow approval process. If the agency approves a drug too quickly and it leads to ill results, that means bad headlines and congressional inquiries.
Therefore, in bureaucracies, the incentive structure is to go slowly and impose myriad impediments to new developments. With the FDA, that has meant years of delay of life-enhancing drugs. As others have noted, the victims of this delay do not know why they can't access these treatments. These downsides are unseen, but they cost thousands of lives.
Back to vehicles, more than 42,000 Americans die in collisions every year. Based on the above-mentioned research, AVs have dramatically lower bodily-injury rates. If governments slow approval of self-driving cars—or give local governments the ability to stop their use based on anecdotes and irrational fears—then we'll likely have more deaths and injuries. And no one will know why.
So, sure, it might be funny to think about dueling robots on L.A. streets, but let's not let minor foibles cause us to lose sight of the bigger picture.
This was first published in The Orange County Register.
"Obviously, the nation needs some regulation to protect against dangerous products" Please, don't give away whatever principles you might otherwise claim to have.
Again, this is how it starts.
In 30 yrs. we'll be getting lectures about Section 230 of The Roadway Decency Act, "The 1A of the Driverless Superhighway" and how States banning driverless cars from serving as pimps for teen sex workers is a violation of everyone's freedom and portent to the collapse of Western Society.
Frankly, in 30 years I would rather live in a Mad Max dystopia.
We've already lived in a Mad Max dystopia these past four years.
By then those teen sex workers will be robots too
"Libertarian rag for ... the regulatory state?"
Did the writers here even read up on the basic fundamental tenets of libertarianism?
Man trapped in self-driving car going in circles:
https://www.youtube.com/watch?v=tWPG8alh2Bs
"We are paying to be part of a human experiment."
Reminds me of this
https://www.youtube.com/watch?v=-trd_f6j3eI
Great show.
"Okay, why is this happening to me on a Monday?"
[Puts hand to brow to avert eyes from The Most Important Thing.]
If he gets up to about 200 mph, the pull chains on his garage door become self-tying.
People can get irrational when they have to go to the airport.
The fact that the customer service rep does not have an easy way to shut down the car when it is malfunctioning is an issue.
Who is liable when a driverless car causes damage?
The owner?
The manufacturer of the vehicle?
The company that developed the software?
The company that developed the sensors?
The programmers?
Answer those questions and get back to me.
(and look up the videos of gangs stepping in front of a driverless car and abusing the occupants, because there is no way to make the car run away)
I also wonder how well the tech holds up in bad weather. Snow and heavy rain do shut down existing automatic safety features because the sensors cannot "see".
It really needs to be a two-tiered system and Reason needs to stop being retarded about this. Autonomous vehicles can't, don't, and shouldn't replace human automobile drivers. They *should* replace commuter train and bus drivers.
But then, you don't get the same amount of techno-futurist, gaia-worship, virtue points for not Sperg Dunking on every last Luddite- adjacent person as hard or harder than humanly possible.
(and look up the videos of gangs stepping in front of a driverless car and abusing the occupants, because there is no way to make the car run away)
Implicit in this situation: does a car with no driver or passengers have a legal *right* of way? A runaway cart without a horse is one thing, physics is physics. But if, for me and my 16-yr.-old son, driving is a privilege, does a physical object have or attain the same 'privilege' simply by existing/being manufactured?
Also, it'll be interesting to see what happens when, one way or another, a self-driving vehicle turns up with a bag of weed or other drugs and no passengers, or even undocumented passengers.
Edit: [scratches chin] Maybe the Red SUV *was* guilty.
I definitely wonder about people using it as a delivery service. I suppose the effectiveness of that is reliant upon internal cameras and how the door locks function.
You beat me to it.
Will at least one of the passengers in a driverless car have to have a driver's license?
Will that person be held accountable if there is an incident?
Who is liable when a conventional car with a driver causes damage
The owner?
The driver?
The manufacturer of the vehicle?
The company that developed the software (of some component in the vehicle)?
Someone else?
The answer is "it depends". A few decades back, Toyota was liable when the root cause was malfunctioning brakes. Often, it's the driver for doing something stupid or careless but sometimes it's the owner (on the theory that ownership is provable via license plates and that the owner took the risk of letting the other person drive).
The real point is that courts are reasonably good at sorting it all out and there's little reason to think they won't be able to do the same thing for liability in driverless car accidents.
Wow - I hadn't even thought of that issue! No way to run over perps to escape! Yikes!
Have none of you ever lived in a rural environment?
Greenhut? Does Oxnard count?
No, they long for the Caves of Steel.
I really wonder how those vehicles handle roads with bizarre signage and lines. Construction areas seem like they'd be difficult for it to navigate. I also don't picture the self-driving car navigating the ice-covered back mountain road I was just on. How does it identify where the road even is, much less react to a loss of traction?
Very clear and sharp white lines at the edge of the road.
Which is why you find the phrase "on compliant roads" in the fine print.
Which won't work with roads covered in ice or snow.
Those Waymo cars don't do well with construction. They've been testing them here in Austin, and I've seen them get stuck when a road was closed for construction. I've even seen one stuck in the middle of a one lane road with a city bus behind it that couldn't get around because of the construction.
They will do horribly. The second most dangerous job is the people who work on the road around temporary construction sites with bad roads, temporary signs, and lots of unusual traffic mix - with cops required to override permanent lights when necessary. Those deaths/injuries are entirely driver caused. Those are the people who are training the AV's.
In addition, the cities where AVs are being trained are the most dangerous for pedestrians—high death/injury, fewer sidewalks, intersections that are car-friendly, kids are chauffeured everywhere, etc. SF is the only exception—but I suspect the AV companies will use SF as the source of 'too cautious' error.
In any event, Americans (incl AV companies) don't give a shit about safety of those outside the vehicle. Which is why our kids are fat and spend their childhood inside the car. And why the AV companies choose places like Phoenix for their testing
BTW the most dangerous job is lumberjack/tree trimmer and the third is roofer.
Let's see. Dirt roads, no cell signal, erroneous "roads" in the onboard map database, and unpredictable conditions. What could go wrong?
Wasn't there a fun meme a few years ago about people renting cars that required an app to open and start, and when they got out of cell range and parked, they could not get back in (or contact the rental company for assistance)?
This article screams "I read an op-ed in a technical magazine, and will now repeat the retard I just read."
The sub-headline is nonsensical; how is yoking a horse to a car the same as having a driver?
That, and it automatically guarantees top-down control of people's movement.
It seems Greenhut is a tech simp with no concept of second-order effects. Aka a Reason writer.
Indeed.
Laws requiring a "driver" in driverless cars make as much sense as requiring a horse to be yoked to the front of an automobile, just in case.
I'm sure that analogy made a lot more sense in your head.
Jeffy gets it, but wonders who is responsible if there is a bear in the trunk of the driverless car.
The horse can be held responsible for any accidents.
Laws requiring a "driver" in driverless cars make as much sense as...
The longshoremen have the same benefit with the dock cranes. The shit is automated, but they require a $98 hr operator to be (sleep) in the cab while the crane is loading.
And a dozen union guys on "stand-by" in the break room?
42,000 Americans don't die in collisions of the type that autonomous vehicles in slower, urban environments are likely to prevent. Of course a vehicle that drives like an old person, but with the vision and reflexes of a teenager, is safer in those urban wastelands.
Sorry, Steven, that was unthinkingly stupid to say.
Laws requiring a "driver" in driverless cars make as much sense as requiring a horse to be yoked to the front of an automobile, just in case.
Both a car driven by a human and one led by a horse will avoid horrors that the best AI will not. I mean, Balaam's Ass, right, from 2500 years ago.
28 And the Lord opened the mouth of the ass, and she said unto Balaam, What have I done unto thee, that thou hast smitten me these three times?
29 And Balaam said unto the ass, Because thou hast mocked me: I would there were a sword in mine hand, for now would I kill thee.
30 And the ass said unto Balaam, Am not I thine ass, upon which thou hast ridden ever since I was thine unto this day? was I ever wont to do so unto thee? and he said, Nay.
Dost thou not suspect my place? Dost thou not suspect
my years? O that he were here to write me down an ass!
But, masters, remember that I am an ass. Though it be not
written down, yet forget not that I am an ass. No, thou villain,
thou art full of piety, as shall be proved upon thee by
good witness. I am a wise fellow; and which is more, an officer;
and which is more, a householder; and which is more,
as pretty a piece of flesh as any is in Messina, and one that
knows the law, go to! and a rich fellow enough, go to! and a
fellow that hath had losses; and one that hath two gowns
and everything handsome about him. Bring him away. O that
I had been writ down an ass!
If self-driving cars become common, they'll become mandatory...with government overrides.
Hmm, so government-driven cars with user requests for travel?
I recall Ron Bailey was really enthused about autonomous cars justifying and resulting in a ban on all human driven cars outside of closed courses for wealthy people who enjoy playing "race car driver"
For starters, you should really drop the fear mongering hyperbole... you might as well have just headlined the article "OMG - Save the Children!".
While I agree that bureaucrats have a tendency to slow down progress and overregulation is a huge problem, this is one of the areas that does need a lot of discussion before it is ramped up to a national scale.
A few years ago there was a question posed of an engineer who oversaw a self-driving car program (don't remember exactly, but this was the gist): Say a car is driving at speed and pedestrians run in front of the car. The car cannot stop in time and only has two choices: a) hit, and likely kill, the pedestrians; b) swerve into a concrete barrier, likely killing the car's passengers. What choice should the car make? Does the age of the pedestrians matter (e.g. school kids)? Does the number of pedestrians vs. the number of people in the car matter? Does who has the legal right of way matter?
I don't know that the answers to the question matters as much as the discussion itself, that there is transparency behind the decisions, and that we determine WHO exactly should be able to make these kinds of moral and ethical decisions.
I haven't seen any of that yet.
Maybe they will program the AI to make choices, like if the only choices are to hit an elderly lady or an infant, it will choose to hit the old lady?
IIRC infants were worth a lot more in Death Race 2000...
Someday Reason.com articles will be written by robots. Don't be in such a hurry for AI to take over.
Some Reason writers need to re-read Jack Williamson's "With Folded Hands."
Are you sure this article wasn't written by a robot?
I tend to believe reinsurers, as they have a strong commercial incentive to get things right rather than to push an agenda.
'A recent news story gave the anti-self-driving car pearl-clutchers some ammunition, although it was more comical than tragic. In West Hollywood last month, a goofy sidewalk-delivery robot collided with a robotaxi at around 4 miles per hour. There obviously were no injuries and neither vehicle was damaged, although writers had a field day depicting this as a preview of the coming dystopian war between robots.'
Just wait until owners start arming their robots.
https://www.youtube.com/watch?v=y3RIHnK0_NE&t=7s
In West Hollywood last month, a goofy sidewalk-delivery robot collided with a robotaxi at around 4 miles per hour.
Auditioning for a role in a live-action version of Futurama, perhaps
And you're auditioning for a role as turd's replacement?
I didn't have to listen to the driver's political theories.
What a stupid-ass, elitist thing to say. "I like to put my ear to the ground with the local taxi driver" is basically a CEO cliche to remain grounded.
I remember being treated to an hour long lecture about the joys of Brexit on the way out to Gatwick, by a native Brit (white non-immigrant, believe it or not).
I remember many taxi rides in NYC learning about Pakistan or Morocco or wherever (immigrants, believe it or not).
See "A taxi driver writes" here: https://en.wikipedia.org/wiki/Recurring_jokes_in_Private_Eye
FWIW it goes in the other direction as well. I know a minor Hollywood producer/director who drives an Uber and engages his passengers as a way to get stories, because you can no longer go into a bar and strike up conversations - everyone's on phones.
And you are incapable of intelligent posts.
Complete bullshit.
The ruling elites couldn't care less if we peasants die in a car crash, and this is a good example.
Indeed, I'm willing to wager the ruling elitist filth intentionally make laws to ensure our untimely demise.
We've already lived in a Mad Max dystopia these past four years.
We must continue to do things the same way we do them now!
Citing a big looking number without putting it into perspective is argumentum ad passiones (appeal to emotions fallacy).
The 2020 census recorded the population at 331,449,281. More than 42,000 deaths (I translated that to 43,000 for the math), is only about 0.013% of the 2020 population; a little over 1% of 1% of the population.
Cry me a fucking river.
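[For what it's worth, the arithmetic in the comment above checks out. A minimal sketch, using the commenter's own figures (42,000 deaths, 2020 census population of 331,449,281):]

```python
# Check the fatality-rate percentage cited above.
deaths = 42_000
population = 331_449_281

rate_pct = deaths / population * 100  # deaths as a percentage of population
print(f"{rate_pct:.4f}%")  # ~0.0127%, i.e. a little over 1% of 1%
```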
^+1.
The pinhead bureaucrats can't help themselves. When it comes to regulation, they're like a dog with a new bone. Just can't get enough. Logic means nothing when you have the power to make others dance to your tune.
We live in the epicenter of the autonomous vehicle development; when it started, RE was cheap here for parking lots and the traffic was low-density for testing. The efforts were (15 years ago?) laughable; an AV would pull up behind a double-parked vehicle and sit there until it moved.
As someone involved in systems engineering, it became obvious that the folks involved in this had badly underestimated both the cognitive abilities and the decision-making requirements. And, after 15 (?) years, I would still not get in one, nor do I trust the response of one at an intersection.
You can trim a GA aircraft and enjoy a box lunch for half an hour with little attention; city-driving a vehicle requires *serious* decisions likely several times a minute, including deciding what *that* driver is going to do.
Failure. Continued failure.
Seems like you would have to test the hardware/software of an AV to the same level that aerospace handles safety critical systems.
Are they doing that? I don't think they are. You need hard RTOSes, which I don't think auto companies utilize.
Steven Greenhut, please tell us when fusion energy will be available.
"Laws requiring a "driver" in driverless cars make as much sense as requiring a horse to be yoked to the front of an automobile, just in case."
One of the most idiotic examples of 'false equivalence' ever seen. Really! Truly fucking STUPID.
As a fucking idiot, explain how horses are capable of interpreting what horses coming into an intersection from a different direction are intending to do and how any possible interaction of same would in any way approximate the damage of what automobiles might do approaching an intersection at, oh, 30MPH.
Were you hired to be the imbecilic intern with regard to AVs, or are you simply an ignoramus?
Driverless/autonomous cars are complete idiocy designed to take away the freedom of the open road.
As others have said, much like EVs, as soon as there's rain or snow or cold, they'll be a disaster. They probably have some limited use, such as the driverless subway transit at some airports. But those are trains in a tunnel, not cars on the open road.
They might work for, say, Disney, or a golf course that wants to REALLY strictly enforce "cart path only." But again, those aren't cars, because autonomous cars are f**king stupid garbage that WILL eventually be used to take away your right to drive yourself and control your own life.
And then the remote tech will let the government and oligarchs lock the doors and drive you off a bridge into a river when your social credit score is too low.
So, nope. DO NOT WANT.
Ban them now, before it's too late.
We don't want driverless cars in our neighborhood. Many children and pets. We see Lyft drivers being attacked and perverted passengers riding where there are humans. Let's not give away the last deterrent to evil riders. If you can't find a human to drive you, maybe you are the type that humans avoid.