Can Self-Driving Cars Ever Really Be Safe?

Analysts estimate that by 2030, self-driving cars and trucks (autonomous vehicles) could account for as much as 60 percent of US auto sales. That’s great! But autonomous vehicles are basically computers on wheels, and computers crash all the time. Besides that, computers get hacked every day. So you gotta ask, “Can self-driving cars ever really be safe?”

The short answer

No. Self-driving cars can never really be safe. But they will be safer than human drivers – so much safer that it’s worth a few minutes to understand why.

Humans are very dangerous

First and foremost, according to the National Highway Traffic Safety Administration (NHTSA), 90 percent of all traffic accidents can be blamed on human error. Next, according to the AAA Foundation for Traffic Safety, nearly 80 percent of drivers expressed significant anger, aggression, or road rage behind the wheel at least once in the past year. Alcohol-impaired driving accounted for 29 percent of all vehicle traffic fatalities in 2015. And finally, of the roughly 35,000 annual traffic fatalities, approximately 10 percent (3,477 lives in 2015) were caused by distracted driving.

Remove human error from driving and you will not only save a significant number of lives but also dramatically reduce the number of serious injuries associated with traffic accidents – there were over 4.4 million such injuries in the United States during 2015.

Data begins to make the case

In May 2016, a 40-year-old man named Joshua Brown died behind the wheel of a Tesla cruising in Autopilot mode on a Florida divided highway. He was the first person killed in a crash while a car was driving itself.

Rage against the machine quickly followed, along with some valid questions about whether Tesla had pushed this nascent technology too fast and too far. Everyone expected the accident to be the fault of a software glitch or a technology failure, but it was not.

The NHTSA investigation found that “a safety-related defect trend has not been identified at this time and further examination of this issue does not appear to be warranted.” In other words, the car didn’t cause the crash. But there was more to the story. The NHTSA’s report concluded, “The data show that the Tesla vehicles crash rate dropped by almost 40 percent after Autosteer installation.” In reality, while Mr. Brown’s death was both tragic and unprecedented, the investigation highlighted a simple truth: semi-autonomous vehicles crash significantly less often than vehicles piloted by humans.

What do you mean by “safe”?

The same NHTSA report mentioned that automakers representing 99 percent of the US auto market had agreed to make Automatic Emergency Braking (AEB) systems standard in all new cars, with the goal of preventing 28,000 crashes and 12,000 injuries by 2025. The AEB program addresses only rear-end crashes, but there are a host of other semi-autonomous features in the works, and by the numbers, all of them will make us safer.

That said, this is very new technology, and regulators will need to define what they mean by “safe.” Must our autonomous vehicles drive flawlessly, or do they just need to be better at it than we are? The RAND Corp. think tank says, “A fleet of 100 cars would have to drive 275 million miles without failure to meet the safety standards of today’s vehicles in terms of deaths.” For perspective: at the time of the fatal May 2016 crash, Tesla owners had logged 130 million miles in Autopilot mode.
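To put RAND’s figure in perspective, here is a quick back-of-the-envelope calculation. It is a rough sketch only: the fleet size comes from RAND’s example, and the annual mileage per car is my assumption, loosely based on typical US driving averages.

```python
# Rough scale of the RAND benchmark: how long would a 100-car test
# fleet need to drive to log 275 million failure-free miles?

FLEET_SIZE = 100                 # cars in RAND's hypothetical fleet
TARGET_MILES = 275_000_000       # RAND's failure-free benchmark
MILES_PER_CAR_PER_YEAR = 13_500  # assumed average annual mileage (US-typical)

years = TARGET_MILES / (FLEET_SIZE * MILES_PER_CAR_PER_YEAR)
print(f"Years of continuous driving required: {years:,.0f}")
# -> roughly 204 years, which is why real-world fleet data
#    (like Tesla's 130 million Autopilot miles) accumulates so slowly
```

In other words, no test fleet can prove safety by brute force alone; the statistical case has to be built from the whole deployed fleet.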

The transition to fully autonomous vehicles

In April 2016, Ford, Alphabet, Lyft, Volvo Cars, and Waymo established the Self-Driving Coalition for Safer Streets to “work with lawmakers, regulators, and the public to realize the safety and societal benefits of self-driving vehicles.” They have their work cut out for them.

In January 2017, Elon Musk tweeted that a software update featuring Shadow mode was being pushed to all Teslas with HW2 Autopilot capabilities. This enabled the car’s autonomous driving AI to “shadow” its human driver and compare the decisions it (the AI) would have made to the decisions the human driver actually made. Think of it as self-driving AI in training (see the sketch below).

The auto industry and several tech giants are working as fast as they can to make autonomous vehicles mainstream. To speed the process, they may need to share some data. Will they? My guess is: absolutely.
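Conceptually, shadow mode is a compare-and-log loop: run the AI on the same inputs the human saw, act on none of its outputs, and record the disagreements. Here is a minimal sketch of that idea – the `Decision` structure, the `ai_model.predict` interface, and the thresholds are all hypothetical illustrations, not Tesla’s actual implementation:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Decision:
    steering_angle: float  # degrees; positive = steering right
    braking: float         # 0.0 (no braking) to 1.0 (full braking)

def shadow_compare(sensor_frame, human: Decision, ai_model) -> Optional[dict]:
    """Run the AI on the same inputs the human driver saw, without
    actuating anything, and flag frames where the two diverge."""
    ai: Decision = ai_model.predict(sensor_frame)  # hypothetical model interface
    steering_gap = abs(ai.steering_angle - human.steering_angle)
    braking_gap = abs(ai.braking - human.braking)
    if steering_gap > 5.0 or braking_gap > 0.3:  # illustrative thresholds
        # Disagreements are the valuable training signal: they get logged
        # for offline analysis instead of being acted on by the car.
        return {"frame": sensor_frame, "human": human, "ai": ai}
    return None
```

The design point is that disagreement frames, not agreement frames, are the scarce signal; logging only those keeps the data volume manageable while the AI learns from exactly the cases where it would have behaved differently.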

Hacks and crashes

In September 2016, Chinese researchers discovered “security vulnerabilities” in the Tesla Model S and remotely hacked into the car – notable because it was the first time anyone had remotely hacked a Tesla. We have a thesis here at The Palmer Group: “Anything that can be hacked will be hacked.” Is this going to be an issue? Yes, but it’s also going to be an arms race. I’m betting on the good guys, though to be fair, hacking across every digital touchpoint is a never-ending battle. We will do our best to combat the bad guys.

As for computer crashes, yes, it is possible for the computer that runs your self-driving car to crash, but it will happen so infrequently that, by the numbers, you will be significantly safer in an autonomous vehicle than if you were driving yourself.

Fear and assessment of risk

Some people are afraid to fly. When you point out that flying is the safest form of travel by several orders of magnitude, the response is always some version of, “But when a plane crashes, everyone dies.” Human beings are not very good at assessing risk. If you don’t have a gas pedal, a brake pedal, or a steering wheel, and your car crashes, you will feel helpless and out of control. And you may die. But, by the numbers, tens of thousands of people will not die or be injured, because semi-autonomous driving and ultimately fully autonomous driving will be much safer than pure human driving. Some will counter that it’s cold comfort if you’re the one who is killed or injured, no matter how rare it is. I agree. But if you were making a policy decision for society at large, you would have to agree that saving tens of thousands of lives and preventing millions of injuries is a worthy endeavor.

(BTW: Please do not bring up the absurd “Why Self-Driving Cars Must Be Programmed to Kill” scenario where “One day, while you are driving along, an unfortunate set of events causes the car to head toward a crowd of 10 people crossing the road. It cannot stop in time but it can avoid killing 10 people by steering into a wall. However, this collision would kill you, the owner and occupant. What should it do?” If you had situational awareness and time to consider all of the outcomes posited by this nonsense hypothetical, you’d have time to step on the brake. If you didn’t have time to consider all of the potential actions and outcomes, the AEB would have engaged to prevent the car from hitting what was in front of it – the people you would have killed while you were thinking about what to do.)
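For what it’s worth, the core of a generic AEB trigger really is that simple: a time-to-collision check. The sketch below is illustrative only – the threshold value and the single-obstacle model are my assumptions, not any automaker’s implementation:

```python
def aeb_should_brake(distance_m: float, closing_speed_mps: float,
                     ttc_threshold_s: float = 1.5) -> bool:
    """Generic automatic-emergency-braking trigger: brake if the
    time-to-collision with the object ahead drops below a threshold.

    distance_m:        range to the nearest forward obstacle, in meters
    closing_speed_mps: how fast the gap is shrinking, in meters/second
    """
    if closing_speed_mps <= 0:
        return False  # gap is steady or growing; no intervention needed
    time_to_collision = distance_m / closing_speed_mps
    return time_to_collision < ttc_threshold_s

# A car closing on an obstacle at 15 m/s (~34 mph) from 20 m away has a
# TTC of ~1.3 s -- the system brakes long before a human could deliberate.
print(aeb_should_brake(distance_m=20.0, closing_speed_mps=15.0))  # True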

A prediction

I’m pretty sure that before 2030, if you are under the age of 25 or over the age of 70, you are going to need a special permit to manually drive a car. I’m also pretty sure that you will not be allowed to manually drive on certain streets and highway lanes, because you will pose too great a threat to the caravans of autonomous vehicles on those roads.

With any luck, the fear-mongers and bureaucrats will get out of the way, and we will all be much safer sooner.

Shelly Palmer is Fox 5 New York's On-air Tech Expert (WNYW-TV) and the host of Fox Television's monthly show Shelly Palmer Digital Living. He also hosts United Stations Radio Network's, ...

Comments

Gary Anderson 7 years ago Contributor's comment

Tell me how it will be safer if autonomous cars have no power to merge into traffic. How will it be safer when autonomous cars cannot even detect a ball in the street, or a motorcycle? And this is what will happen: people will pay extra for the technology but will soon become disenchanted with it. As a researcher said, autonomous cars will never, ever be able to navigate construction zones. So “fully” and “autonomous” are mutually exclusive terms.

David M. Green 7 years ago Member's comment

Never say never. Didn't Bill Gates once famously say computers would never need more than 640K of RAM? Technology will continue to improve, and so will autonomous cars.

Gary Anderson 7 years ago Contributor's comment

I never thought I would see the day when so many scientific types would become so gullible. But Self Driving technology has spawned a vast army of gullibles.

Mike Stollenwerk 7 years ago Member's comment

Shelly Palmer's prediction that "before 2030, if you are under the age of 25 or over the age of 70, you are going to need a special permit to manually drive a car" is fatuous. Palmer apparently believes that driver age correlates with higher-level driving skill – it doesn't. Driving skills improve by way of the learning curve. 15-to-18-year-olds are less skilled drivers than older drivers because they have less experience, but they learn faster and retain the skills they earn by doing better than older beginning drivers do. If you delay allowing people to drive until they are 25 or older, they must still go through the learning curve to gain skills, but because they are older, those skills are harder to acquire and take longer to master – just think about those older drivers you see from countries without a driving culture; their driving skills are much lower – frankly, terminally lower – than those of drivers who learn as teenagers. Palmer also underestimates the fact that the USA has a very strong car culture, associated with freedom and other American values such as self-reliance. State legislatures are not going to just roll over and attack America's car culture. Nor would it make sense to do so.

Ayelet Wolf 7 years ago Member's comment

I think you are reading into his statement too much. I don't think he is saying that 25 year old and under are bad drivers - I think he's saying everyone older will already have a driver's license, but that for new drivers, they'll have a much harder time getting licensed.

For the elderly however, many simply can't drive on their own. Worse, many think they can, but unfortunately likely shouldn't be behind the wheel.

Gary Anderson 7 years ago Contributor's comment

This is what really will happen: cars will cost more, but after one slip-up, human or otherwise, people will not use the technology. This self-driving car thing is a scam, and it is all about selling technology. Ayelet, you can't even merge into traffic with this technology. And an expert has said you will never be able to negotiate construction zones with self-driving technology. Also, do you want to buy a car that will crash you into a wall if you make an error and put others in danger? Skeptics are growing daily. The push for these cars is a money grab, only.

Craig Newman 7 years ago Member's comment

While I don't have confidence in these vehicles yet, we'll get there eventually. Even now I'd trust a self driving car more than I'd trust a car driven by a texting teen or a senile and/or half blind senior citizen.

Trinity Sinclair 7 years ago Member's comment

My sole fear lies with hackers. As the author said, anything can be hacked. Imagine if terrorists managed to hack or upload a virus to self-driving cars en masse. The death and destruction would be devastating, as every car could become a killing machine, crashing into buildings, crowds of people, and other cars. Roads, transportation, and infrastructure would grind to a standstill while first responders and hospitals all over the country were overwhelmed. I find the thought terrifying.

Susan Miller 7 years ago Member's comment

While I know that, technically speaking, self-driving cars are safer than those driven by people, I don't think I'll ever totally feel safe in one. Yes, the computers can think and sense faster than us, but I fear such cars wouldn't be able to handle unusual circumstances. They are limited to their programming.

Kurt Benson 7 years ago Member's comment

I don't see the difference between using a self driving car or getting a lift with a friend or a cab driver. People text, use cell phones, get distracted, etc. A self-driving car does none of these things. While I might trust my own instincts over a car's, I'd much rather trust my life to a self-driving car than another person.

Gary Anderson 7 years ago Contributor's comment

Before you lock in that position, Kurt, you may want to read this: www.talkmarkets.com/.../top-ten-reasons-self-driving-cars-are-useless

Kurt Benson 7 years ago Member's comment

Thanks, somehow missed this article.