Self-Driving Cars Will Make Most Auto Safety Regulations Unnecessary
Simplifying the rules could save lives on the highway.
Federal auto safety regulations fill nearly 900 pages with standards that determine everything from rear-view mirror and steering wheel placement to the shape of vehicles and the exact placement of seats. Many of the rules don't make sense in the coming era of self-driving cars. Autonomous vehicles don't need rear-view mirrors, or (eventually) steering wheels. Their ideal physical form is still a work in progress.
But an even bigger rethink is in order. As motor vehicles become essentially computers on wheels, software, not hardware, will soon be paramount for safety. This will make most government regulation unnecessary, and, to the extent that it slows innovation, could even cost lives on the highway.
"Basically, the entire vehicle code can be boiled down to be safe and don't unfairly get in the way of other people," says Brad Templeton, an entrepreneur and software architect, who has worked as a consultant with Google on its self-driving car project. (He also blogs regularly on the topic.)
One difference between self-driving cars and traditional automobiles is that companies will have every incentive to fix safety problems immediately. With today's cars, that hasn't always been the case. Templeton cites General Motors' 2014 recall of 800,000 cars with faulty ignition switches. The company had known about the safety flaw for over a decade but didn't act on the information because recalls are so costly. The company's actions had dire consequences: 124 deaths were linked to the ignition defect.
But the safety problems of the future will primarily be bugs in software, not hardware, so they'll be fixed by sending ones and zeros over the internet, without the need for customers to return hundreds of thousands of vehicles to the manufacturer. "Replacing software is free," Templeton says, "so there's no reason to hold back on fixing something."
Another difference is that when hardware was all that mattered for safety, regulators could inspect a car and determine if it met safety standards. With software, scrutiny of this sort may be impossible because the leading self-driving car companies (including Waymo and Tesla) are developing their systems through a process called machine learning that "doesn't mesh in with traditional methods of regulation," Templeton says.
Machine learning systems are developed organically, so humans have limited understanding of how they actually work. And that makes governments nervous. Regulations passed by the European Union last year ban so-called unknowable artificial intelligence. Templeton fears that our desire to understand and control the underlying system could lead regulators to prohibit the use of machine learning technologies.
"If it turns out that [machine learning systems] do a better job [on safety] but we don't know why," says Templeton, "we'll be in a situation of deliberately deploying the thing that's worse because we feel a little more comfortable that we understand it."
John Simpson of the California-based nonprofit Consumer Watchdog, who advocates stringent regulation of autonomous vehicles, says extreme caution is warranted because software in cars isn't like software in other contexts. "It's one thing to test Gmail in beta mode," he told Reason, "and [another to test] software that when it glitches actually might kill you."
"That these need to be perfect before we can allow them on the road is a mindset that has affected a lot of urban planners and…the pro-regulation set," says Marc Scribner, a senior fellow at the free-market think tank the Competitive Enterprise Institute. Since the alternative to allowing imperfect self-driving cars on the highways is the status quo—100 Americans die every day in automobile crashes—perfection shouldn't be the standard.
"Delaying self-driving car technology," Scribner says, "means we're going to see additional deaths that we simply could have avoided if we allowed these vehicles on the road that are not perfectly safe but safer than the cars we have today."
The computerization of driving also has an impact on how we build and maintain highways. Some transportation planners say that to prepare for self-driving cars we need to construct smart infrastructure that communicates directly with vehicles, an approach embodied in the Obama-era vehicle-to-vehicle safety mandate that was withdrawn by the Trump administration last week.
Templeton says that as driving becomes a software-based industry, the physical highways should mimic the internet.
The internet used to be called the "stupid network" because it runs on a few simple protocols and pushes complexity to the edges. Similarly with cars, we want "stupid roads and smart cars," Templeton says. Then you "put the intelligence in the cars, [which] are being built by lots of competing companies." And that's where you get the internet's "pace of innovation," which "shook the world."
"If we need to install all of this electronic infrastructure on highways," Scribner says, "and we're having problems filling potholes, just from a pure budgetary and management perspective, you're dreaming."
Templeton says self-driving cars provide an opportunity to make our roads even dumber, and he's been promoting an idea called "virtual infrastructure," in which "you take what infrastructure you do need to have and you move it into the cloud." An example is the virtual maps Waymo has built for its self-driving cars.
"The impact that computers will have on society is really, really big [with self-driving cars]," Templeton told Reason. "You need a world where everyone is free to come up with innovations that everyone else thinks are crazy."
Shot, written, edited, and produced by Jim Epstein. Filmed at the 2017 Automated Vehicles Symposium.
"Detour" by Gunnar Olsen.
"The Key" by The Tall Pines is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 License.
"Stranger in the City" by Marcos Bolaños is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 License.
"Hills" by Riot
"Boogie Pt. 1" by The Tall Pines is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 License.
"Realness" by Kai Engel is licensed under a Creative Commons license.