
Self-Driving Cars Will Make Most Auto Safety Regulations Unnecessary

Simplifying the rules could save lives on the highway.



Federal auto safety regulations fill nearly 900 pages with standards that determine everything from rear-view mirror and steering wheel placement to the shape of vehicles and the exact placement of seats. Many of the rules don't make sense in the coming era of self-driving cars. Autonomous vehicles don't need rear-view mirrors, or (eventually) steering wheels. Their ideal physical form is still a work in progress.

But an even bigger rethink is in order. As motor vehicles become essentially computers on wheels, software, not hardware, will soon be paramount for safety. This will make most government regulation unnecessary, and, to the extent that it slows innovation, could even cost lives on the highway.

"Basically, the entire vehicle code can be boiled down to be safe and don't unfairly get in the way of other people," says Brad Templeton, an entrepreneur and software architect, who has worked as a consultant with Google on its self-driving car project. (He also blogs regularly on the topic.)

One difference between self-driving cars and traditional automobiles is that companies will have every incentive to fix safety problems immediately. With today's cars, that hasn't always been the case. Templeton cites General Motors' 2014 recall of 800,000 cars with faulty ignition switches. The company had known about the flaw for over a decade but didn't act on the information because recalls are so costly. The company's inaction had dire consequences: 124 deaths were linked to the ignition defect.

But the safety problems of the future will primarily be bugs in software, not hardware, so they'll be fixed by sending ones and zeros over the internet, without the need for customers to return hundreds of thousands of vehicles to the manufacturer. "Replacing software is free," Templeton says, "so there's no reason to hold back on fixing something."

Another difference is that when hardware was all that mattered for safety, regulators could inspect a car and determine if it met safety standards. With software, scrutiny of this sort may be impossible because the leading self-driving car companies (including Waymo and Tesla) are developing their systems through a process called machine learning that "doesn't mesh in with traditional methods of regulation," Templeton says.

Machine learning systems develop organically from training data, so even their creators have a limited understanding of how they actually work. And that makes governments nervous. Regulations passed by the European Union last year ban so-called unknowable artificial intelligence. Templeton fears that our desire to understand and control the underlying system could lead regulators to prohibit the use of machine learning technologies.

"If it turns out that [machine learning systems] do a better job [on safety] but we don't know why," says Templeton, "we'll be in a situation of deliberately deploying the thing that's worse because we feel a little more comfortable that we understand it."

John Simpson from the California-based nonprofit Consumer Watchdog, who advocates stringent regulation of autonomous vehicles, says extreme caution is warranted because software in cars isn't like software in other contexts. "It's one thing to test Gmail in beta mode," he told Reason, "and software that when it glitches actually might kill you."

"That these need to be perfect before we can allow them on the road is a mindset that has affected a lot of urban planners and…the pro-regulation set," says Marc Scribner, a senior fellow at the free-market think tank the Competitive Enterprise Institute. Since the alternative to allowing imperfect self-driving cars on the highways is the status quo—100 Americans die every day in automobile crashes—perfection shouldn't be the standard.

"Delaying self-driving car technology," Scribner says, "means we're going to see additional deaths that we simply could have avoided if we allowed these vehicles on the road that are not perfectly safe but safer than the cars we have today."

The computerization of driving also affects how we build and maintain highways. Some transportation planners say that to prepare for self-driving cars we need to construct smart infrastructure that communicates directly with vehicles, an approach exemplified by the Obama-era vehicle-to-vehicle safety mandate that the Trump administration overturned last week.

Templeton says that as driving becomes a software-based industry, the physical highways should mimic the internet.

The internet used to be called the "stupid network" because it ran on a few simple protocols and pushed complexity to the edges. Similarly with cars, we want "stupid roads and smart cars," Templeton says. Then you "put the intelligence in the cars, [which] are being built by lots of competing companies." And that's where you get the internet's "pace of innovation," which "shook the world."

"If we need to install all of this electronic infrastructure on highways," Scribner says, "and we're having problems filling potholes, just from a pure budgetary and management perspective, you're dreaming."

Templeton says self-driving cars provide an opportunity to make our roads even dumber, and he's been promoting an idea called "virtual infrastructure," in which "you take what infrastructure you do need to have and you move it into the cloud." An example is the virtual maps Waymo has built for its self-driving cars.

"The impact that computers will have on society is really, really big [with self-driving cars]," Templeton told Reason. "You need a world where everyone is free to come up with innovations that everyone else thinks are crazy."

Shot, written, edited, and produced by Jim Epstein. Filmed at the 2017 Automated Vehicles Symposium.

"Detour" by Gunnar Olsen.

"The Key" by The Tall Pines is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 License.

"Stranger in the City" by Marcos Bolaños is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 License.

"Hills" by Riot

"Boogie Pt. 1" by The Tall Pines is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 License.

"Realness" by Kai Engel is licensed under a Creative Commons license.
