"Human beings are idiots." That’s what we think whenever we see a story with the “LOLCARS” tag. It's also what we think when anything happens in Florida. (God forbid I ever have to move there.) It’s probably what Greg Tracy was thinking as I was telling him about the time I tried to express my inner Bob Wollek. And that’s what all of Jalopnik was thinking when I wrote this.
With those thoughts in mind, companies such as Google and General Motors have been building and testing driverless cars, expecting that humans will eventually buy them. DARPA has even held races for autonomous cars. But I'm not so sure it'll happen. And not because of the crashes or car enthusiasts loudly complaining.
It won't happen because humans are idiots. So, as I usually do, I wrote a list of reasons why we'll never have autonomous cars.
Author's Note: Special thanks to Greg Tracy for inspiring this list after his experience doing the Hot Wheels loop. (I'll get to why it's so relevant in the list. Though if you saw the JFF/DRIVE panel videos, you'll already know why.) And even more special thanks to Mr. Tracy for putting up with my stories. Even I realize they were tedious.
1. Trust Issues
First off, no one will trust autonomous cars. Automakers will try to convince customers that their safety systems work. Unfortunately, product demonstrations of those safety systems have failed spectacularly. Who can forget the time a Volvo S60 crashed into the back of a truck during a City Safety demonstration? (The broken radiator was the icing on the cake.) Or the time those two guys decided to run in front of their Volvos to see if the system worked? (Once again, human beings are idiots.)
And who can forget the time three Mercedes-Benz S-Classes were damaged during a demonstration of the PRE-SAFE automatic braking system? In fact, Mercedes had so little confidence in its own product that the system was switched off and the driver (a journalist) was told to stop the car at a certain point. Unfortunately, the braking point was missed and Mercedes had a PR disaster on its hands. (On a side note, that journalist was fired over the incident.)
In the end, if we have a hard time trusting even radar-based cruise control and semi-autonomous braking systems that only sometimes work, people aren't ready for driverless cars. They've already seen computer-operated systems get more than a few things wrong.
2. Hacking Fears
The very idea of hacking makes me afraid. Especially when a company like Tesla can remotely update the software on its cars. I would fear the day I got into a driverless BMW 5-Series and someone at BMW decided to eliminate me for the vitriol I've given it. Or basically any manufacturer on this list, for that matter.
And a cyberattack targeting cars could take down all kinds of transport if someone managed to insert a string of code that turns a Tesla Model S into an expensive brick. Mainly because that Model S would end up blocking the carpool lane in California, which results in flashing high beams and motorcyclists hitting your car as they lane split. Not to mention numerous calls for Elon Musk to start focusing on the Hyperloop and SpaceX instead.
3. Infrastructure and Regulation
Let me remind you that driverless cars will have to abide by certain regulations so that the Kardashians can stop driving their fleet of G-Wagens all over Hollywood. So that Lindsay Lohan can't order her car to crash into a Prada boutique on Rodeo Drive. And so Justin Bieber can have that chrome-wrapped Fisker taken away from him.
But determining those regulations will be impossible. Mainly because automotive industry lobbyists will say things like "We don't need pedestrian safety regulations. It's their fault for not working hard enough to get a self-driving car." Not to mention car dealer associations that will proclaim a customer must pay them hundreds of dollars just for a software update, and will try to tilt the regulations in their favor.
And we have a Congress that would rather shut down the government to get its way than pass bills. Congressional hearings would be a nightmare, with the audience having to hear long-winded stories from congresspeople about that one time they rode in a driverless car and felt safe, only for a Senator to follow up with "I preferred my 1956 Buick Super over these confounded driverless doohickeys." Meanwhile, we'll be exasperated that people with no enthusiasm for cars will determine whether or not we'll have autonomous cars. Maybe it's time to start a Jalopnik PAC. So we can at least make sure speed limits are reasonable. (Reasonable being 150 mph, hopefully.)
Furthermore, roads and other infrastructure would have to be improved to accommodate autonomous cars. And that won't happen, for reasons that involve schools, police officers, firefighters, prisons, and what amounts to political suicide these days: raising taxes. Not to mention the meetings about the issue that'll culminate in one person yelling "Impeach Obama! He's from Kenya!"
4. Liability Issues
Murphy's Law will apply to anything and everything. Meaning accidents. And now I shall spell out a scenario:
A Porsche 999 GT6 RS E-Hybrid Autonomen Ferdinand Piëch Edition crashes into a Bugatti Pacific Grand Sport Autonome Legend Ferdinand Piëch. Both cars exchange information over an instant Bluetooth connection, which promptly fails after three seconds. An error report then gets sent to the manufacturer and the safety bureaus.
Both drivers will blame the algorithms in their cars. The owners will say the cars crashed because one car’s processors had the reaction times of Pastor Maldonado. And the other will blame lines of code as redundant as poor Mark Webber once Sebastian Vettel decided to ignore “Multi 21.” But the safety bureau will have to determine which car had inferior machinery and coding and place blame on the owner of that vehicle. Leading to many a Kafkaesque situation.
Insurance companies will only be able to blame the machinery. Meaning manufacturers will be on the hook for everything. For instance, remember that tremendous Toyota recall? And how they said cars were accelerating for no reason? Well, that'll be even worse when an autonomous car gets something wrong. Especially the most mundane of things. Like picking up a venti Starbucks latte instead of a grande. Playing Miley Cyrus instead of Taylor Swift because she was rated higher on Spotify. Taking you to McDonald's instead of In-N-Out because Zagat thought it was better.
5. We, as humans, will always think we're better than the machine... and we are.
This last one is especially true of car enthusiasts. Particularly Miata owners compelled to change every part on their car in order to place tenth on the autocross course. And E30 and E36 guys who strip out the interior of their cars just so they can say "because race car." And Dodge Viper drivers who gladly give John Hennessey thousands of dollars for more power, only to promptly crash their cars into a tree. I'll stop now.
But back to Greg Tracy. Before he did the Hot Wheels Double Loop Dare, a drone car was sent through to test whether a car could actually make the loop. That test was unsuccessful: the drone car dropped straight down from the top of the loop. Which did not bode well for a human attempt.
Yet two human drivers were able to drive the loop in front of a crowd without a horrible accident. And Greg Tracy was able to talk about the experience and how he could outdrive a drone, showing that driverless cars are not the best idea.
That spells good riddance to even the best software algorithms. Meaning no driverless cars. At least not until one can perform a Triple Loop Dare, set the record for the world's largest car jump, win the Formula One championship, and get me a reservation for two at Per Se. Then I'll be all for driverless cars.
This post originally appeared on my Kinja blog BecauseCAR.
Image credits: Wikimedia Commons, YouTube, and Hot Wheels Media.