The Robot Did It: Autopilot on Trial
The question of who's liable for any injuries or deaths that result from a car crash can be fairly easy to answer. Assigning fault is pretty straightforward when a driver is drunk, distracted, or just really bad at driving — but what if the car's driver wasn't actually driving?
Electric vehicle maker Tesla is expected in court this September and October for two separate cases involving two separate deaths. The first occurred in 2019, when a man's Model 3 veered off a highway, struck a palm tree, and burst into flames, killing the driver and seriously injuring his two passengers (one of whom was an 8-year-old boy who was “disemboweled”).
The second was another 2019 crash, this one just outside of Miami, Florida, in which another Model 3 drove under the trailer of a big rig with enough force to shear off the roof of the vehicle. The driver, Stephen Banner, was killed.
These two incidents have a few things in common. Both took place in 2019. Both involved Tesla Model 3s. Both led to the destruction of the cars and the deaths of their drivers. But the big one is that both cars were being controlled by Tesla's Autopilot feature at the time of the accidents.
Autopilot or “Autopilot”?
Tesla has denied liability for both accidents. The company maintains that — contrary to its name — its Autopilot feature does not constitute actual self-driving functionality, and is instead meant to be monitored by the human behind the wheel while in use. According to Tesla, users of Autopilot are told explicitly that their cars will not actually drive themselves, and that human monitoring is required to ensure safe and proper operation.
Tesla has pointed to a number of safety features built into the Autopilot system to make sure drivers remain alert and attentive during operation. Sensors and cameras inside the cars keep an electronic eye on drivers when Autopilot is active, for one, and each vehicle comes with a whole set of audio alerts, visual cues, and physical warnings intended to remind drivers to always keep their hands on the wheel.
Add Tesla's explicit messaging about Autopilot's limits to that slew of alerts and other safety measures, and a reasonable defense emerges. Autopilot (and Tesla's newer “Full Self-Driving” mode, currently in beta testing) may have a misleading name, but it's hard, though not impossible, to argue that Tesla is wrong when it claims that any accidents or deaths are the result of human error.
There are a few holes in Tesla's seemingly airtight defense. For one, these are hardly the first incidents in which Autopilot has allegedly failed to operate as intended and caused injury or property damage as a result.
Consider the National Highway Traffic Safety Administration's ongoing scrutiny of Autopilot. The agency has been investigating crashes involving Teslas using Autopilot dating back to 2016, particularly a series of incidents in which Teslas on Autopilot crashed into emergency vehicles stopped on the side of highways. Of particular concern is the “Full Self-Driving” software Tesla has been testing, which was apparently so prone to breaking traffic laws that the NHTSA forced Tesla to recall nearly 363,000 vehicles.
Regulators are also concerned by reports that recent software updates have allowed some drivers to use Autopilot for extended periods without being told to put their hands back on the steering wheel, changes that were seemingly applied to random vehicles without prompting or warning. Those updates are too recent to have been a factor in the 2019 crashes, but the same leadership was running Tesla then, which raises the possibility that Autopilot's capabilities and safety features have always been less rigidly defined than Tesla asserts.
Maybe It's Just Broken?
Both lawsuits surrounding the fatal 2019 crashes assert that Autopilot, not the drivers, was at fault. The lawsuit filed by the victims of the California crash asserts that Tesla knew Autopilot and its safety systems were defective when it sold the car to the deceased. The lawsuit filed by the wife of the deceased in the Miami crash seeks damages and answers about what, if anything, Tesla knew about possible defects in the car or the software.
It may be difficult to prove fault in either case, but both suits represent important steps toward defining the parameters by which we judge fault in cases involving self-driving cars and other automated or semi-automated machines. It's anyone's guess how it will all turn out. Only one thing's for sure: Elon's going to stay rich no matter what happens.