Faced with innovation that operates outside of their control, many lawmakers can't stop themselves from grabbing the steering wheel. Sometimes literally.
"So, I'm in the Tesla," Sen. Ben Nelson (D–Fla.) told a March hearing of the Committee on Commerce, Science, and Transportation. "And we're coming back across the Anacostia River and getting up on—on the bridge then to get on to the ramp on to 395. And I'm instructed in the driver's seat: 'Engage the autonomous switch.' I click it twice. 'Take your hands off the wheel.' And so all of a sudden the car is speeding up and they say: 'It automatically will go with the flow of the vehicles in the front and back.' But now we are approaching the on-ramp on to 395 and it is a sharp turn. And the vehicle is still speeding up. And they said, 'Trust the vehicle.'"
But the senator could not quite manage to do that. "As we approach the concrete wall," he said, "my instincts could not resist. And I grabbed the wheel, touched the brake and took over manual control. I said, 'What would have happened?' They said, 'If you'd left your hands off of the wheel, it would have made that sharp turn and come on around.' And so I am here to tell you that I am glad I grabbed the wheel."
Making the senator's panic even more metaphorically apt was the title of the Senate hearing: "Hands Off: The Future of Self-Driving Cars." While many lawmakers at the hearing expressed considerable enthusiasm about autonomous vehicles, most members of the committee made it clear that they'll be hard-pressed to stick to a hands-off approach to regulating driverless cars. As Nelson put it, "In the federal government we have a critical role to make sure that the regulatory environment and legal environment in which American business does business is able to develop and manufacture these vehicles. And also it means that we're going to have to—in our case—exercise responsible oversight."
Nelson's words should strike fear in the hearts of those who are looking forward to playing Scrabble, applying makeup, or reading a Kindle on the freeway. Leading automakers such as Ford, Toyota, and Volvo, plus contending tech companies like Tesla and Google, forecast that self-steering cars will be widely available by 2020, with fully autonomous vehicles arriving less than five years later. Driverless cars have the power to make us richer, less stressed, more independent, and safer.
Unless lawmakers and regulators manage to screw everything up.
Robots, Take the Wheel
"We should be concerned about automated vehicles," University of South Carolina law professor Bryant Walker Smith told the Associated Press in March. "But we should be terrified about today's drivers."
Self-driving cars aren't perfect, and they never will be. They are, however, significantly better pilots than today's distracted, maladroit, and sometimes drunk human beings. In 2015, some 38,000 Americans died in traffic crashes. The National Highway Traffic Safety Administration (NHTSA) estimates that 94 percent of automobile accidents in the U.S. are due to human error. A Virginia Tech Transportation Institute study commissioned by Google estimated in January 2016 that human-driven vehicles crash 4.2 times per million miles traveled, whereas current self-driving cars crash 3.2 times per million miles—a safety record that's likely to keep improving as robocars gain more real-world experience on the roads.
California, unfortunately, didn't get the memo about comparative human/robot safety records. The Golden State's Department of Motor Vehicles (DMV) wants to mandate that all self-driving cars have steering wheels, pedals, and a licensed, specially trained driver in the front seat, according to the 22 pages of draft regulations Sacramento released last December.
As the autonomous vehicle pioneer Brad Templeton wryly summarizes: "We don't understand this, so let's not allow it until we do understand it." There's a cost to such pre-emptive prohibitions. "Once you ban something, it is hard to unban it," says Templeton, a software engineer who sits on the board of the Electronic Frontier Foundation and formerly consulted on Google's self-driving car project. "A regulator doesn't want to be the regulator who unbans something and then something goes wrong."
To Templeton's point, a February open letter from a consortium of California tech business associations to the secretary of California's State Transportation Agency decried the "draft regulations that explicitly prohibit the operation of fully autonomous vehicles in California."
Google's vehicle division, in a December statement, declared that it was "gravely disappointed" with the draft regulations, especially the requirement for a licensed driver to be inside the car. In a public blog post, the company's self-driving car chief Chris Urmson urged, "Instead of putting a ceiling on the potential of self-driving cars, let's have the courage to imagine what California would be like if we could live without the shackles of stressful commutes, wasted hours, and restricted mobility for those who want the independence that the automobile has always represented."
But regulators are busy slapping shackles on this technology before it has a chance to deploy. Testifying at the March Senate hearing, Urmson noted that in the past two years 53 bills aiming to regulate self-driving cars have been introduced in 23 states. Five states have already adopted laws regulating autonomous vehicles, he said, and "none of those laws feature common definitions, licensing structures, or sets of expectations for what manufacturers should be doing." He added that unless a unified approach is crafted, these disparate state regulatory schemes "will significantly hinder safety innovation, interstate commerce, national competitiveness, and the eventual deployment of autonomous vehicles."
"What we have found in most places," he said, "is that the best action is to take no action. And that in general the technology can be safely tested today on roads in many states."
Send in the Feds?
All of the industry participants at the Senate Commerce Committee hearing strongly urged Congress to pre-empt the developing patchwork of state regulations on self-driving vehicles. "We propose that Congress move swiftly to provide the Secretary of Transportation with new authority to approve life-saving innovation," Urmson said. "This new authority would permit the deployment of innovative safety technologies that meet or exceed the level of safety required by existing federal standards, while ensuring a prompt and transparent process."
In contrast with California regulators, the feds at the NHTSA responded positively in February to Google's request that the agency legally recognize the company's on-board computerized self-driving system (SDS) as the driver in a self-driving vehicle. "If no human occupant of the vehicle can actually drive the vehicle, it is more reasonable to identify the 'driver' as whatever (as opposed to whoever) is doing the driving," the NHTSA letter declared. Even so, the agency noted that it would have to embark upon a rulemaking to figure out how to make sure that the SDSs' sensors can monitor and operate such federally mandated safety features as turn signals, headlight dimmers, rear visibility, and so forth.
At the North American International Auto Show in Detroit in January, Secretary of Transportation Anthony Foxx announced that his agency would develop within six months federal guidance on the safe deployment and operation of autonomous vehicles and devise a model state policy on autonomous vehicles outlining a path to a consistent national scheme. In addition, the Obama administration proposed in its 2017 budget allocating nearly $4 billion to funding various unspecified autonomous vehicle pilot projects.
Not So Fast
So is federal intervention the solution? Not exactly.