Blame Autopilot: who’s responsible for preventing autonomous car crashes?

There’s no question that self-driving vehicle technology is the hot topic in the automotive industry. Every major manufacturer seems to be designing its future models with self-driving functionality. The tech industry is also developing and testing its own solutions, including Waymo (Google/Alphabet’s self-driving car project), Uber, Apple, Aptiv, Zoox, and many others.

The applications are exciting and will eventually be life-changing. Not only will travelers and commuters be able to work, sleep, or watch a movie while their cars drive them to their destinations, but, even more strikingly, children, the elderly, and people with disabilities that preclude them from driving will gain a new degree of freedom.

Autopilot is driver assistance, not self-driving

At the forefront of this charge is Tesla, the first manufacturer to make semiautonomous vehicle technology available on all of its cars. That amounts to the most real-world testing any company has been able to do so far.

But there’s a difference between semiautonomous and self-driving. Tesla’s Autopilot feature is the former—the car makes some decisions on its own, but it’s intended to operate with a licensed driver sitting in the driver’s seat, alert and aware of what’s happening on the road ahead, ready to grab the wheel and intervene. In its current iteration, Autopilot is essentially an extremely advanced form of cruise control and lane keep assist. Tesla refers to it as a “driver assistance system.”

As we’ve seen in the news, Autopilot is still learning; it is far from infallible across all of the scenarios a car can encounter. But human operators often use it as though it were self-driving, with disastrous results.

One of the first things many of these drivers say to the police is “it was on Autopilot.”

So who is responsible, when a semiautonomous vehicle crashes?

Incidents and accidents

One assumption several Tesla owners have made is that Autopilot is the perfect designated driver. They were mistaken.

Just ask the guy police found passed out on the SF Bay Bridge in his stopped Tesla. And then arrested for having a blood alcohol level twice the legal limit.

Or the inebriated Tesla employee in Morgan Hill, California, who ended up in a creek bed in his Model 3.

A Tesla employee’s Model 3 in a creek bed near San Jose, CA.

For those who were not drinking, the consequences of not paying attention while using Autopilot were just as bad. (Especially considering that a seemingly disproportionate number of them crashed into fire trucks. And one police car.)

In January, a Tesla driver in Culver City, California, crashed his Model S into a fire truck at 65 miles per hour while Autopilot was engaged. (In an earlier fatal Autopilot crash involving a semitrailer, the NTSB determined the cause to be a combination of factors, including the limitations of the system and the actions of both the Tesla driver and the semitrailer’s driver.)

The Culver City driver is now suing Tesla.

In Utah, a driver whose Model S plowed into the back of a stopped fire truck at 60 miles per hour in May told investigators that she had been relying on Autopilot and not paying attention to the road at all because she was busy looking at her phone. The car’s data shows that she hadn’t touched the wheel for at least 80 seconds before the crash and that she hit the brakes a fraction of a second before impact. Now she’s suing Tesla and Service King, claiming that Tesla salespeople told her Autopilot would stop the car if necessary to avoid an accident.

And in San Jose, a Tesla owner’s Model S slammed at 65 miles per hour into a fire truck that was stopped on a highway while responding to an accident. The driver told first responders that he thought he had Autopilot engaged. (He was also arrested at the scene for driving under the influence.)

Then there was the Model S owner in Laguna Beach who hit (and totaled) a parked police SUV while Autopilot was engaged; the car apparently veered into the police vehicle. The crash happened in May and is still under investigation.

A Tesla Model S after it crashed into a Laguna Beach Police SUV.

Laws and responsibility

Whatever shortcomings the current version of Autopilot might have, the above crashes illustrate that the primary problem is human, not machine: people aren’t following Tesla’s instructions for proper use of Autopilot.

After (and presumably in response to) the numerous Autopilot-related accidents, Tesla made the system more aggressive about keeping the driver alert and engaged. As of June, the latest version of the software visually prompts the driver to place hands on the wheel after 30 seconds of having them off, issues an audible warning at 45 seconds, and at 60 seconds turns Autopilot off altogether until the car is restarted.

Tesla’s Autopilot prompts a driver to “hold steering wheel.”
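To make that escalation concrete, here is a minimal sketch of a hands-off-wheel timer using the thresholds reported above (30, 45, and 60 seconds). It is purely illustrative and is not Tesla’s actual implementation; the function and alert names are invented for this example.

```python
# Illustrative sketch only -- not Tesla's actual code. It models the
# escalation described above: a visual prompt after 30 seconds with
# hands off the wheel, an audible warning at 45 seconds, and
# disengagement of Autopilot at 60 seconds (until the car is restarted).

from enum import Enum


class Alert(Enum):
    NONE = "no alert"
    VISUAL = "show 'hold steering wheel' prompt"
    AUDIBLE = "sound warning chime"
    DISENGAGE = "turn Autopilot off until restart"


def escalation(seconds_hands_off: float) -> Alert:
    """Map time since hands were last detected on the wheel to an alert level."""
    if seconds_hands_off >= 60:
        return Alert.DISENGAGE
    if seconds_hands_off >= 45:
        return Alert.AUDIBLE
    if seconds_hands_off >= 30:
        return Alert.VISUAL
    return Alert.NONE


if __name__ == "__main__":
    # Sample times with hands off the wheel, in seconds.
    for t in (10, 35, 50, 65):
        print(f"{t:>3}s hands off -> {escalation(t).value}")
```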

What about laws governing the use of Autopilot and similar technologies? As with so many new technologies (the internet, digital music, ecommerce), legislation is still evolving to catch up. A number of states have enacted at least some laws pertaining to autonomous vehicle technologies, though these largely apply to manufacturer testing of self-driving (or “driverless”) vehicles, rather than to consumers using semiautonomous systems like Tesla’s Autopilot.

What’s the answer?

Complex technologies always carry with them complex ethical questions, and semiautonomous systems are no different; there don’t seem to be any simple answers to the Autopilot conundrum. Semiautonomous driving is the first major step toward self-driving cars, which are arguably the future of car-based transportation. And any type of artificial intelligence needs as many data points as possible to improve. Having Autopilot in consumers’ cars accelerates that exposure and learning process, and helps the technology advance more quickly than relying on manufacturer test vehicles alone.

But with the advantages of such an advanced technology comes responsibility. People who are fortunate enough to have access to Autopilot have to remember that, just like advanced cruise control, it’s a feature of their car, and they are still responsible for driving it. They need to understand Autopilot’s limitations and use it according to Tesla’s instructions. And common sense.
