Trump Wants To Ease Rules for Self-Driving Cars
Regulations have made these vehicles less safe and more expensive.

After railing against self-driving cars on the campaign trail, President-elect Donald Trump plans to establish federal guidelines for autonomous vehicles, according to Bloomberg. Onerous regulations have stalled the deployment of these vehicles while making them less safe and more expensive. Updating the standards would allow more Americans to enjoy the benefits that boundary-pushing autonomous vehicles offer.
Despite significant investment from manufacturers like Tesla and General Motors (GM), self-driving cars have been stifled by federal rules requiring that these vehicles have pedals and a steering wheel. While the National Highway Traffic Safety Administration (NHTSA) permits the deployment of 2,500 self-driving vehicles per year under granted exemptions, receiving an exemption is a long and costly process. After waiting more than two years for one, GM was forced to pull the plug on its Cruise model, which had no steering wheel or pedals.
"There's a strong case to be made that in the future, you might want to remove all of these human controls for the expressed purpose of building a robotaxi," says Marc Scribner, senior transportation policy analyst at the Reason Foundation, the nonprofit that publishes this magazine. If robotaxis are safer than human drivers, as manufacturers argue, then removing these controls would make the cars safer. "They would also, in theory, make them cheaper," according to Scribner, which would make them available to a larger swath of people.
Less burdensome regulations would make it easier for Elon Musk to launch his recently unveiled self-driving taxi, the Cybercab, which he expects to hit the market in large quantities before 2027.
Scribner tells Reason that although federal guidelines would streamline the self-driving car approval process by creating a uniform standard, they may not be necessary, depending on the results of ongoing NHTSA investigations.
NHTSA is currently investigating Zoox, Amazon's self-driving technology company. In 2022, Zoox claimed that its autonomous vehicles (which have no pedals or steering wheel) were "self-certified" to federal motor vehicle safety standards. The investigation began after two of the company's vehicles unexpectedly stopped, causing two motorcyclists to crash.
"If [Trump's] regulators find that Zoox's self-certification was valid, that means that we may not need to do all sorts of updates to Federal Motor Vehicle Safety Standards," says Scribner. Companies like Tesla would be able to follow Amazon's lead and self-certify their cars to meet the same safety standards as cars with steering wheels and pedals even if they don't have these installed.
With support from the presidential transition team, Scribner is "cautiously optimistic" about what a second Trump administration could mean for self-driving vehicles. Given the industry's emerging interest, it is time to reduce the regulatory barriers that hold this promising technology back from its full potential.
" . . . two of Amazon's vehicles unexpectedly stopped, causing two motorcyclists to crash."
I wrote code for 45 years, but I could never make an object outside of the computer change behavior.
Any chance the motorcyclists were following too close or anything?
How about pretending you are a reporter.
It should be noted that no ethically-trained software engineer would self-certify software with a destroyBaghdad procedure. Basic professional ethics would instead require him to write a destroyCity procedure, to which Baghdad could be passed as a parameter.
At this point, I trust Silicon Valley about as far as any one of them could pull a cybertruck tied to their junk.
Seriously, haven't we gone through this a couple of times with a couple of different train software suites? Where, if the software loses connection, the train stops, and nobody thought: "You know, if one train loses connection and another doesn't, or if the connection is re-established while one train is rolling and the other is stopped, that's going to wind up with two trains occupying the same space."
A lengthy investigation into assumed collusion between Trump and Musk regarding this will be called Elongate.
It could be the big one.
So long as the autonomous driver (company) is held liable for all damages like a real driver would be .... it'll probably work itself out.
stifled by federal rules that require these vehicles have pedals and a steering wheel
What we really need is a Section 230, the 1A of the internet, to provide qualified immunity good Samaritan protection to driverless vehicle manufacturers in order to protect their free speech right to deny previously promised or reasonably assumed service to their users, in order to lock them in and program cars to run over people and make it look like an accident that came from a wet market. Also, if we could prosecute more ICE CEOs for programming their cars to game emissions standards, because EVs are just really popular, that'd be great.
Nice Try.
Section 230 doesn't protect company-made (programmed) threats. In your deceitful example case, Section 230 would just prevent you from suing GM because a careless driver ran you over.
You'd have to sue the careless driver (poster), as it should be, unless you can make a case that the company itself is responsible for the accident and not the careless driver.
"...it's time to reduce the regulatory barriers..."
Regulation is a tax that destroys wealth.
The legal system provides an avenue for stopping dangerous tech.
The bureaucracy is political, parasitic, self-serving. For example, finished solar and wind power plants sit idle for YEARS waiting for final permits. People die and overpay while polluting fossil-fueled plants that should be closed stay open. Politics as usual. The politics YOU paid taxes for and thereby granted your moral approval.
If you think about it, there's no reason those self-driving taxis need a windshield. A camera and a video display could work just as well.
If a self-driving taxi makes an improper u-turn, does the police officer get to beat up the rider?