Josh Hawley's Anti–Driverless Cars Policy Would Kill a Lot of People
Tens of thousands of people die each year in crashes where human error was the cause or a contributing factor.

In a September 4 speech at the National Conservatism Conference, Sen. Joshua Hawley (R–Mo.) bemoaned the rise of artificial intelligence, declaring, among other complaints, that "Only humans ought to drive cars and trucks." This sentiment isn't just anti-innovation; it's a dangerous line of thinking given the realities of road safety.
Each year, more than 40,000 Americans die in auto accidents. State-by-state statistics show that nearly 1,000 Missourians—Hawley's constituents—died in auto accidents in 2023. The vast majority of these accidents are caused by human error. Although the exact number varies, as accidents often have multiple causes, studies over the years have found that human error caused or contributed to between 90 percent and 99 percent of auto accidents. Of course, many people operate vehicles safely, but this significant death toll caused by human drivers ought to make us open to safer solutions.
While much of our discourse around artificial intelligence (AI) has focused on generative AI products like ChatGPT, autonomous vehicles represent one of the most exciting and often underappreciated applications of this emerging technology. AI is important not only for the development of the fully driverless cars that Hawley seems concerned about but also for many of the technologies we now expect in cars, such as lane-departure warnings and automatic emergency braking.
Unlike human operators, autonomous vehicles don't get drunk, drowsy, or distracted. A 2024 study published in Nature Communications found that "[accidents involving] vehicles equipped with Advanced Driving Systems generally have a lower chance of occurring than [those involving] Human-Driven Vehicles in most of the similar accident scenarios."
Autonomous vehicles are already a reality, and millions of miles of testing have proven their safety. Like any technology, they are not without error, and there have been headline-grabbing examples of malfunctions or accidents, but data show these are rare.
As of March 2025, Waymo, one of the leading companies in autonomous vehicles, had operated its vehicles for more than 71 million miles without a human driver. According to Waymo's released safety data, those vehicles reduced accidents by more than 78 percent in every category examined, for both passengers and other road users like pedestrians and cyclists. Applied at a national scale, a reduction of that magnitude would translate into more than 30,000 American lives saved each year.
This potential safety benefit has led many policymakers to look for ways to embrace, rather than discourage, this technological development. On the same day as Hawley's speech, Secretary of Transportation Sean P. Duffy announced plans to modernize safety standards with autonomous vehicles in mind, recognizing the important role of transportation innovation. Across the nation, both red and blue states have sought to encourage the testing and development of autonomous vehicles: 29 states have enacted legislation related to autonomous vehicles, and a further 11 have acted on the issue through executive orders. These policymakers recognize that technology can assist and improve transportation safety, encouraging human flourishing and a better transportation ecosystem.
When autonomous vehicles get into accidents, it makes the national news because it is so rare. Meanwhile, human-operated vehicle accidents are so frequent that we consider them a sad but normal occurrence. Hawley may nearly have it backwards: It's humans, not AI, who have proven to be dangerous operators of cars and trucks. We should consider the costs of the lives lost to human error if we deter the development of this important technology. Instead of trying to restrict the potential application of AI technology, we should be excited about its potential.
"Tens of thousands of people die each year in crashes where human error was the cause or a contributing factor."
...especially when the drivers are drunk or high.
or senile
or dicking around with a cell phone
Or an illegal, with a faked license... this is fun, let's do more.
"Hawley may nearly have it backwards: It's humans, not AI, who have proven to be dangerous operators of cars and trucks."
It's a little early in the technology development and rollout stages to be making this sort of comment.
Consider where the autonomous vehicles are operated: in controlled environments, even if they are on public roadways.
The majority, however, are mine sites with autonomous dump trucks, which is not really acceptable evidence.
Who the fuck hires you people to write for Reason? Continued...
I see that your rhetorical style is appeal to emotion (a logical fallacy) using big, scary-looking numbers.
Don your critical thinking cap and put those numbers into perspective.
The population of the United States, in 2024, was approximately 340 million people. Forty thousand of that is 0.000124% of the population. It is statistically insignificant.
The population of Missouri in 2023 was approximately 6.2 million people. One thousand of that is about 0.000163%, nominally higher than the national value.
Both values are marginally over one ten-thousandth of a percentage point. For reference, 1% of 340m is 3.4 million and 1% of 6.2m is 62,000.
FIFY
For reference, Waymo avoids conditions outside of its Operational Design Domain (ODD), which include heavy rain, heavy snow, and ice. These conditions impair its sensors, affecting their ability to detect lane markings, traffic, and road surfaces accurately. Its vehicles will pull over to a minimal-risk condition, return to base, or reroute to avoid unsafe situations caused by such weather events.
Again, don your critical thinking cap. Just because there have been millions of safe miles in tested conditions doesn't mean they've been tested in every condition or terrain.
To just accept, "whelp, they're safe," based on their say-so and on performance in cherry-picked conditions, is simply an appeal to authority (another logical fallacy).
[Aside: How many people will die because a robocar won't drive them to the ER or Urgent Care because it's raining outside? Often, driving in bad weather is simply unavoidable.]
Again, reference above. Waymo avoids conditions outside of its ODD.
Again, put on your critical thinking cap.
Any accident reduction was within Waymo's Operational Design Domain (i.e. cherry-picked conditions).
Applying the same math above, 30,000 Americans represent, based on 2024 population, only approximately 0.0000882% of the population. That is eight hundred-thousandths of one percent.
To my opening point, your title is deceptive and wrong.
First and foremost, prohibiting driverless cars will not kill anybody. Sure, it won't help reduce accidents and traffic fatalities like robocars marginally can in limited, optimal conditions. But Hawley's policy won't actually kill anyone.
Second, the verbiage and hyperbole ("Kill a Lot of People") are a blatant (and false) appeal to emotion. The number, while scary sounding -- ooooh! thousands of people -- is certainly not "a lot." It is a vanishingly small number of people.
"If it saves just one life!!" Right? Fuck off.
Note, I don't disagree with the overarching premise: government ought not deter innovation for stupid reasons. However, you fail to persuade with click-bait headlines, hyperbole, and logical fallacies (appeal to emotion and appeal to authority).
See.Less, in addition to being emotional, you are also innumerate. Your calculations are all off, understating the percentages by two orders of magnitude.
Oooh! Clever ad hominem!
More ad hominem.
Ah. Fast to ascribe a failure of education or understanding rather than simply note a procedural error. Another ad hominem! That's a hat trick!
Yes. You are correct. I neglected to multiply my quotient by 100. As such, correcting my numbers:
* 40k deaths nationwide divided by 340m Americans, then multiplied by 100, yields a result of approximately 0.012% (nominally over one one-hundredth of one percent).
* 1k deaths in Missouri (2023) divided by 6.2m Missourians, then multiplied by 100, yields a result of approximately 0.016% (nominally over one one-hundredth of one percent).
* 30k lives predicted to be "saved" by robocars divided by 340m Americans, then multiplied by 100, yields a result of approximately 0.001% (one one-thousandth of a percent).
As a percentage of the populations (national and Missouri), the numbers are still very, very tiny. Minuscule, even.
Therefore, my conclusions stand. While the numbers, as raw data points, may look scary, the reality is that they are very, very small values.
Now, do you have anything interesting or substantive in response to the content of my post?
Go try a Tesla self drive.
See less is just what you are doing, and you did it again. 30,000 is .0088% of 340,000,000. At least you are only off by a factor of 8.8 this time. If these numbers are very, very small values, what do you think of the much smaller average of 7,208 US deaths per year between 1965 and 1972 in the Vietnam War?
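For what it's worth, the arithmetic being disputed in this thread is easy to check with a few lines of Python. The population figures (340M US in 2024, 6.2M Missouri in 2023) and death counts are taken from the comments above as given; this is just a sanity check of the percentages, not a verification of the underlying statistics:

```python
# Check the fatality figures as percentages of population.
# Figures assumed from the thread: 340M US (2024), 6.2M Missouri (2023).
US_POP = 340_000_000
MO_POP = 6_200_000

def pct(part: int, whole: int) -> float:
    """Return `part` as a percentage of `whole`."""
    return part / whole * 100

print(f"US road deaths (40k):      {pct(40_000, US_POP):.4f}%")   # ~0.0118%
print(f"Missouri road deaths (1k): {pct(1_000, MO_POP):.4f}%")    # ~0.0161%
print(f"Projected lives saved:     {pct(30_000, US_POP):.4f}%")   # ~0.0088%
```

The last figure agrees with the 0.0088% correction, not the 0.001% claimed earlier in the thread.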
Speaking of logical fallacies:
"...If these numbers are very, very small values, what do you think of the much smaller average of 7,208 US deaths per year between 1965 and 1972 in the Vietnam War?..."
Misdirection is a hallmark of a dishonest POS; congratulations!
You are correct when you say Waymo does not operate under all weather conditions. The Waymo Operational Design Domain lists snow, ice and hail as not part of the current domain, but Waymo also states that it is testing operation under these conditions. I expect that there are other weather related conditions where they would also decide to not operate, e.g. hurricanes and flooding. Also they only operate in areas where they have mapped. I believe that they are currently mapping in Dallas.
"...but Waymo also states that it is testing operation under these conditions..."
So let's leave operations under those conditions out of the comparisons?
Given you are a propagandist, I'm sure you have a weasel-out to exclude those from any consideration, right?
Unless I missed it, the article also IGNORED total miles. When analyzed per mile driven, autonomous vehicles have a significantly worse safety record than humans.
Do you have a source for that?
Who pays you?
Waymo advertises 100M miles a year.
American drivers are estimated at 3.2T miles a year.
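Taking those two figures at face value (both are assumptions from this thread: roughly 100 million Waymo driverless miles per year against roughly 3.2 trillion total US vehicle-miles per year), the autonomous share of driving works out to a tiny fraction of one percent:

```python
# Rough share of annual US vehicle-miles driven autonomously,
# using the (assumed) figures cited in this thread.
WAYMO_MILES_PER_YEAR = 100e6   # ~100 million
US_MILES_PER_YEAR = 3.2e12     # ~3.2 trillion

share_pct = WAYMO_MILES_PER_YEAR / US_MILES_PER_YEAR * 100
print(f"Autonomous share of US miles: {share_pct:.4f}%")  # ~0.0031%
```

That is well under the 1% threshold mentioned elsewhere in the thread, which is why per-mile comparisons between the two fleets carry large uncertainty.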
Oh, and do you have any data supporting your (implied) claims, or (like the low-watt bulbs promoting EVs) hoping you can skate by requiring others to supply as yet unknown specifics?
"...According to released safety data, the vehicles had reduced accidents by more than 78 percent in every category examined, for both passengers and other road users like pedestrians or cyclists.
Again, put on your critical thinking cap..."
I doubt she has experience with driverless cars; if she did, she'd at least attempt to account for what those of us who do regularly resort to: simply stopping and letting that idiot-driven vehicle get out of the way entirely before proceeding.
Well, my Tesla is a safer driver than I am. That’s all I care about.
In which case, please notify the public when you're driving. Or quit driving. That's all I care about.
The vast majority of these accidents are caused by human error. Although the exact number varies, as accidents often have multiple causes, studies over the years have found that human error caused or contributed to between 90 percent and 99 percent of auto accidents.
Up to 90-99% of automobile deaths up until Mar. of 2020 and then back up to 90-99% again after Jan. 7, 2021. Trust the experts.
Until autonomous vehicles account for at least 1% of total miles driven, that is actually a condemnation of autonomous vehicle "safety".
Are you assuming the remainder of the 90-99% were the fault of vehicle autonomy?
We're assuming propagandists like you have no data to back your (implied) claims.
Prove me wrong.
When I was driving a semi truck, my trucker's GPS consistently wanted me to make U-turns. These turns were legal, but as we have seen, they can be very dangerous. It takes a real live human being to weigh the risk. As far as I know, autonomous trucks are limited to the interstates and nearby distribution centers. But big trucks go everywhere, from shithole cities to dirt roads in the mountains and deserts. And sometimes it rains. And sometimes it snows, a lot. It's simply not possible to maintain a perfect driving environment for these things. And if they all park on the shoulder waiting for spring, we'll have bigger problems.
Gee, Mark U and aajax have been challenged on their bullshit and seem to have wandered off to lick their wounds.
Fuck the both of you; those who post here are not imbeciles. We can spot paid propagandist shits for some well-funded tech a mile off.
Fuck off and die.