Bad News: The Government Wants to 'Help' Driverless Car Companies

Google's parent company, Alphabet, revealed in December that an unaccompanied blind man had successfully traveled around Austin, Texas, in one of the company's cars, which had neither a steering wheel nor floor pedals. That same month, Alphabet announced that it is spinning off its self-driving vehicle technology into a new division called Waymo. Also in December, Uber launched an experimental self-driving ride-sharing service in San Francisco.
The future is rushing toward us. Unfortunately, the government wants to help.
In the case of Uber, the California Department of Motor Vehicles (DMV) was so eager to help that it ordered the company to shut down its service, declaring that its regulations "clearly establish that an autonomous vehicle may be tested on public roads only if the vehicle manufacturer, including anyone that installs autonomous technology on a vehicle, has obtained a permit to test such vehicles from the DMV."
Anthony Levandowski, head of Uber's Advanced Technology Group, responded by observing that "most states see the potential benefits" of self-driving technology and "have recognized that complex rules and requirements could have the unintended consequence of slowing innovation." By refraining from excessive regulation, added Levandowski, these jurisdictions "have made clear that they are pro technology. Our hope is that California, our home state and a leader in much of the world's dynamism, will take a similar view." Uber moved its self-driving fleet to Arizona.
The U.S. Department of Transportation (DOT) likewise wants to "accelerate the next revolution in roadway safety"—so in September, naturally, the agency issued a 116-page Federal Automated Vehicles Policy that outlines a 15-point design and development checklist applicable to the makers of automated cars. In case that was not enough help, the agency then issued a 392-page Notice of Proposed Rulemaking to mandate that all new light-duty vehicles talk to each other using a very specific vehicle-to-vehicle (V2V) technology.
Instead, these rules are likely to slow down innovation and development. Compliance with the agency's 15-point safety assessment is supposedly voluntary, but woe betide any company that fails to file the proper paperwork. Even more worrying, the DOT is calling for a shift from the current regime, in which automakers self-certify that their vehicles meet safety standards, to a system where the agency tests and approves the product before it can go to market. This would bring all of the speed and efficiency of the federal government's drug approval process to the auto industry.
Plus, as Competitive Enterprise Institute researcher Marc Scribner points out, the safety benefits of the V2V mandate "will be trivial for the next 15 years, at which point far superior automated vehicle technology may be deployed to consumers." Self-driving cars equipped with autonomous collision avoidance technologies will likely provide all of the supposed benefits of V2V communications—and do it sooner. If the incoming Trump administration really wants to help, it'll get Washington out of the way and withdraw these two proposed rules.
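For readers wondering what the V2V mandate would actually oblige each car to do, the core requirement is simple: broadcast a short "basic safety message" (position, speed, heading, and so on) to every vehicle in radio range several times per second. The sketch below is a loose, purely illustrative rendering of that idea in Python; the field names, the UDP broadcast, and the port number are invented stand-ins, not the DSRC radios or SAE J2735 message set the proposed rule actually specifies.

# Illustrative only: a toy stand-in for the kind of periodic
# "basic safety message" broadcast the V2V proposal contemplates.
# Field names, transport (UDP broadcast), and port are invented.
import json
import socket
import time
from dataclasses import dataclass, asdict

@dataclass
class BasicSafetyMessage:
    vehicle_id: str       # the real proposal uses temporary, rotating IDs
    latitude: float
    longitude: float
    speed_mps: float
    heading_deg: float
    timestamp: float

def broadcast(sock: socket.socket, msg: BasicSafetyMessage,
              addr=("255.255.255.255", 47001)):   # hypothetical port
    """Send one message to everything listening nearby."""
    sock.sendto(json.dumps(asdict(msg)).encode(), addr)

if __name__ == "__main__":
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
    while True:   # roughly ten messages per second, as the proposal envisions
        broadcast(sock, BasicSafetyMessage(
            vehicle_id="demo", latitude=30.2672, longitude=-97.7431,
            speed_mps=13.4, heading_deg=90.0, timestamp=time.time()))
        time.sleep(0.1)

Even at that level of abstraction, Scribner's point stands: a car that can sense and avoid obstacles on its own gets the safety benefit without waiting for every other vehicle on the road to be chattering in the same format.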
This article originally appeared in print under the headline "Bad News: The Government Wants to 'Help' Driverless Car Companies."
Someone wanted an FDA-like agency for security of the Internet of Things. I said that's the surest way to cripple them, and his response was "Thalidomide". I mentioned people who died from lack of access to drugs approved in Europe but unavailable here due to FDA NIMBYism, and his response was "Thalidomide". I said the FDA moves so slowly that drug patents expire before approval; if we'd had an FDA-like agency for computers, we'd just now be getting Pentiums which can't count and WiFi would be a fantasy, and his response was "Thalidomide".
Ugh. These yokels are not content with the precautionary principle (don't do anything the first time) for themselves; they insist that everybody be stuck with it, regardless of how little it affects anybody else. I don't care if he wants an FDA-approved computer; he can live with 20-year-old concepts and technology, but I want the future, not the past.
A different aspect of the same thing is traffic laws. Punish everybody with arbitrary limits when everybody is different. So much simpler to simply say, if you are in an accident and were drunk, driving faster than everybody else, no lights on, yakking on your cell phone, etc -- then you are presumptively at fault. Eventually some autonomous cars would be recognized as safer than others, including human drivers, and it would be the same thing. But that's too damned simple and leaves no scope for bureaucrats.
And if you pointed out that thalidomide was found, decades later, to slow the growth of malignant tumors, opening up the field of anti-angiogenic drugs, their answer would still be "thalidomide".
You may want to point out to your "friend" that thalidomide WAS approved by the government for human use. It had been tested in mice and rats and did not show any signs of problems. In fact, even today, many drugs are approved by the government, but are later recalled when broader use in humans reveals problems.
The real tragedy is that he thought his friend was arguing with him by repeating "Thalidomide", when in reality he was suffering an aphasia indicating an oncoming stroke.
I genuinely hope that government regulation cripples this amazingly stupid idea. The enthusiasm otherwise sensible people exhibit for the notion of driverless cars baffles me. The idea that software written by humans will somehow be better at driving than humans strikes me as magical thinking. The possibilities inherent in hundreds of vehicles in the same traffic jam developing the same bug leave me breathless. The only thing I can think of that would be worse is a centralized system, which would be hackable and also ensure that an entire commute's worth of cars made the same mistake at one time.
Maybe I'm just a Luddite, but driverless cars are an idea that, to my mind, violates the engineer's rule of KISS: Keep It Simple, Stupid.
Whoops! So much for principles. Once your ox is gored, you can't get too much of Big Brother coming to bat on your side, damn the consequences. Yes, if your arguments fall on the deaf ears of ordinary people, drag the nanny into it to make them do what's good for you.
Fuck off, slaver.
Did I ever say I was a Libertarian? I'm not. I'm a Crank. I happen to think that one of the few legitimate purposes of government is building and maintaining the roads and, at least to some degree, regulating what is allowed to use them. I don't want people driving vehicles with no lights, signals, or brakes. Similarly, I don't want vehicles controlled by imbeciles, and all computers are imbeciles, no matter how brilliant their programmers may be.
I don't desire to enslave you; I'd have to feed you. I do want you restrained from doing something ridiculously dangerous in my vicinity.
Not Amish.
??? Driverless cars, once the software has gone through a few more iterations, will be *safer* than human driven cars. Humans get distracted. Humans are sometimes intoxicated. Humans like to push the limits. Humans tailgate and weave in and out of traffic or move along at a crawl and any of a number of other dangerous actions on the road. Software has no such problems.
Not that it's perfect. Nothing is. Software can only react to things it's written to react to. But the longer it is used, the better it becomes.
My worries are summed up by the adage "To err is human. To repeat the error 10,000 times a second requires a computer." I keep hearing that computer-driven cars will be safer than human-driven ones. To me this sounds an awful lot like magical thinking. A computer-driven car is a car being driven by a human through several removes. That doesn't strike me as an improvement.
But the longer [software] is used, the better it becomes.
Clearly someone who is unfamiliar with WinZip.
The problem with humans is variance: our skills run the gamut from fantastic to awful depending on any number of factors. Computer-driven cars will solve this problem by being predictably terrible, mostly because of liability issues and the programmers' frames of reference (the people Google et al. will hire to tell the cars what to think are generally not the types whose driving inspires confidence among their fellow motorists).
Most of the laws regarding driving are draconian because it's impractical to expect rigid enforcement. The computers will adhere to them anyway, regardless of what their fellow motorists expect. This makes them hazardous regardless of what the law might say.
When an autonomous car crashes, millions of cars get updates and drive safer. When a car driven by a person crashes, people just complain about the traffic backup.
When an autonomous car crashes there will be a panic update, which will hang, stranding thousands. Also, why think only ONE car will crash? Why won't it be hundreds, as a bug surfaces?
When was the last time you updated your iPhone in the middle of a phone call? Why would your car be any different?
You're right that it would take hundreds of crashes before the bug is even identified, much less patched. Certain publishers and manufacturers will patch flaws more quickly than others, and a decent number of bugs will remain in place indefinitely (this happens with every piece of complex software; just compare the list of reported bugs in each Polycom HDX firmware release to the fixed issues in each update).
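On the mid-drive-update point: over-the-air updates are generally staged in the background and gated on vehicle state before anything is installed. A purely hypothetical sketch of that gating logic follows (every name here is invented; it describes no actual automaker's system).

# Hypothetical sketch: an over-the-air updater that downloads in the
# background but refuses to install until the vehicle is parked.
class VehicleState:
    def __init__(self, speed_mps: float = 0.0, in_park: bool = True):
        self.speed_mps = speed_mps
        self.in_park = in_park

class OTAUpdater:
    def __init__(self, vehicle: VehicleState):
        self.vehicle = vehicle
        self.staged_version = None

    def stage(self, version: str) -> None:
        # Download and verify while driving; nothing is applied yet.
        self.staged_version = version

    def try_install(self) -> bool:
        # Only touch the running software when the car is parked and still.
        if (self.staged_version and self.vehicle.in_park
                and self.vehicle.speed_mps == 0.0):
            print(f"installing {self.staged_version} while parked")
            self.staged_version = None
            return True
        return False

if __name__ == "__main__":
    car = VehicleState(speed_mps=20.0, in_park=False)
    updater = OTAUpdater(car)
    updater.stage("fix-2017.01")
    assert not updater.try_install()        # deferred: car is moving
    car.speed_mps, car.in_park = 0.0, True
    assert updater.try_install()            # applied once parked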
You need to drive around where I do. The fact that some of these people can *start* a car impresses me. Bring on driverless cars - be much safer, IMO.
It can be done, maybe in 1,000 years...
Driverless cars cannot become mainstream until someone(thing) defines who gets to be sued when damage is caused. The owner, even if not in the car? The company that built the car? The manufacturer of the specific part that failed? The software company that supplied the code? The specific programmer(s) who developed the code (oh God, I only wish!!)? The QA testers who said the code was ready? The designer who wrote the specifications?
All further complicated by the lack or presence of "human override" capability; brake pedals, steering wheel, etc.
Because after all, lawyers have to make a living. It would be unfair to legislate that accidents happen.
Philosophical thought for the day: Who/what is in control of an autonomous vehicle if the vehicle has no controls for the riders?
Tesla is already offering a maintenance and insurance package with its cars in other countries and has plans to do the same in the US. If the manufacturer is supplying the insurance, it would bypass this problem.
In the finale of Grand Tour, Clarkson points out that he's driving a GTI down the freeway using GPS, lane assist and adaptive cruise control - self driving cars are already here.
V2V is desired by Gov't for one very specific Orwellian need... to stop vehicles from working at the whims of enforcement (be it unpaid registration or detainment of passengers).
Gov't will not tolerate autonomous vehicles that cannot be stopped by wireless communication from enforcement (and hackers). With Gov't in the mix, the future is bleak.
Bureaucrats never miss an opportunity to tax and regulate something. Autonomous cars will be a HUGE boon to state tax coffers. From 'different' licenses, to 'different' registrations and inspections and insurance, and titling and property tax and ticketing and law enforcement....state and local governments are DROOLING at the prospect of new and creative ways to suck money out of car owners. It will be an Oklahoma land-rush.
^ This is the reason why governments want autonomous vehicles. They will be autonomous with regard to the passenger, but you can bet that they will be required to allow the government to take full control, to see everywhere the vehicle has been operated, where it is at any time, and the total number of miles it has driven, in order to tax it. I would not be surprised to see governments decide that these vehicles cannot be owned by individuals, but instead must be owned by a giant govt-chartered corporation that will take full responsibility for their purchase, maintenance, and repair.
Think autonomous vehicles plus Uber, run by the government. It solves ALL of the progressive problems, including the elimination of traffic jams by clever "planning" by giant govt departments. You get to go on vacation when there are slots available, and the govt will tell you when that will be. It allows full control of both supply and demand.
By 2050, non-autonomous cars will only be allowed to be operated by special, limited licenses.
Anytime the government gets involved in private enterprise, failure is imminent.
The biggest problem I have with self-driving cars is that I love to drive. The ultimate goal is to eliminate human-driven cars and car ownership, so that you won't be able to just decide where you want to go and when, or take a leisurely Sunday ride somewhere and pull over at any time if you see something interesting.