Self-driving cars will still be safer in the long run

A demonstration of the Tesla Model S Roadster at Tesla Design Studios in Hawthorne, California, U.S., on Wednesday, July 31, 2013. Photographer: Patrick T. Fallon/Bloomberg

According to Wikipedia, around 32,000 people die each year in auto accidents in the United States. That's roughly 90 per day, give or take. To date, as far as we know, one of those people has died in a Tesla using its "autopilot" mode.

Naturally, people are furiously debating self-driving cars and automation. The National Highway Traffic Safety Administration (NHTSA) is investigating the crash.

The information we have right now suggests that the driver was pushing the envelope: he had made videos with the autopilot engaged, and may even have been watching a DVD at the time of the accident. (The "autopilot" is, as I understand it, not meant to be fully self-driving, and the driver is supposed to remain engaged at all times.) The accident seems to have been a perfect storm: the car's software may have interpreted the tractor trailer as an overhead sign, the driver of the tractor trailer appears to have been mostly at fault, and the autopilot simply failed to correct for the situation.

Everything about this situation was predictable. We know that the software for these cars will be imperfect. It will improve drastically over time, but it will always be imperfect because humans are imperfect and the real world is damn hard to predict anyway.

Writing software that handles all of the world's possible date formats is already super-difficult, and that has to be a much simpler problem than handling all the possible road conditions, driver errors, equipment failures, and just outright weird stuff that can happen on the road.

That someone would have an accident partially or fully attributable to the self-driving features is predictable. Driving is dangerous. We tend to forget that, or at least keep it hidden in the back of our minds because it's scary, but driving is a dangerous activity. Actually, living on this planet is a dangerous activity, but we tend to forget that too.

And, of course, it's predictable that the story would be deemed newsworthy and that people would be focused on this instead of all the other things that cause people to die while driving.

It's right that we discuss it, and there are some hard problems around it that have yet to be solved. Once again, technology has leapfrogged society's ability to cope with it. We don't have consensus on some important questions raised by self-driving vehicles, like "if the software has to choose between the life of the driver and the life of a third party, which does it choose?" (I haven't seen this come up yet, but people are going to be really pissed when it turns out that the driving software will likely choose to hit animals that dart into the road — including beloved pet Fluffy — rather than endanger the driver at all.)

And the legal liability questions are going to be playing out for decades. We still feel a need when something bad happens to point a finger somewhere. It has to be someone's fault because we're really, really bad at accepting that we aren't immortal.

In the long run, we're going to be safer with self-driving vehicles. Probably better off in a lot of ways, as we might move away from the individual expense of owning cars to systems of shared vehicles and much safer commutes. I hope, someday, to be able to just program a destination into a car and sit back and read a book while it takes me there.

But to get there, I have to accept some risk. A different set of risks than the ones I accept now when I get into a car and drive to the grocery store. I'll have to put my faith in the vehicle's systems and those of other vehicles on the road, and less in my own ability to drive. But I have already accepted that my skill as a driver is only a partial factor in my ability to avoid accidents. Twice in the last year, other drivers have hit my car while I was driving — once from the side, once while I was stopped at a stoplight. Maybe autonomous systems would have prevented those accidents, as they were both caused by the other driver's error.

Getting there means that people will die, though. People will die because of these systems. Other people will live because of autonomous autos, but we'll largely ignore those incidents because we're generally bad at recognizing near-misses and saves, unless they're really unusual. Keep an eye on the statistics, though. My money says that, on average, people will be safer in self-driving vehicles in the long run.

Author: Joe Brockmeier

Joe Brockmeier is a long-time participant in open source projects and a former technology journalist. Brockmeier has worked as the openSUSE Community Manager, is an Apache Software Foundation (ASF) member, and participates heavily in the Fedora Cloud Working Group. Brockmeier works for Red Hat in the Open Source and Standards (OSAS) group and manages the community team.
