Tesla Model D's Autopilot: What Are the Legal Implications?
In an interview with Bloomberg TV, Tesla CEO Elon Musk was asked about the legal implications of autopilot on the Tesla Model D, which will roll out over the course of next year.
Musk was very careful to point out that there's a difference between "autonomous" driving and "autopilot." The former sounds more like what Google's driverless cars are seeking: "You can go to sleep and wake up at your destination," Musk explained. Autopilot, he said, is "what they have in airplanes. For example, we use the same term they use in airplanes, where there's still an expectation there will be a pilot. The onus is on the pilot to make sure that the autopilot is doing the right thing."
Autonomous Cars = Who Knows?
Last month, we blogged about the legal implications of Google's driverless cars. The upshot is that no one's quite sure what the implications of autonomous driving will be. Musk said he doesn't expect "true autonomous driving" for five to six years, plus two to three more years for regulators to confirm that the cars are safe.
Autopilot, on the other hand, is something we're familiar with: Ships and airplanes have had autopilot for years. Because there's some expectation of operator interaction with an autopilot system, there's always the possibility that the operator could be at fault. That was the case in a recent lawsuit against an autopilot manufacturer over a collision between two barges. In that case, the court found that "it [was] at least equally probable that this accident was caused by the failure of Penn's captain to properly operate the autopilot."
Tesla drivers would presumably be expected to know how to operate the autopilot, and if an accident occurred while the autopilot was engaged, one issue would be whether the driver should have intervened and, if so, whether he or she acted correctly.
Driver Assist
Right now, we also have cars that warn us about dangers and augment our senses, whether through collision detection or backup cameras. The legal landscape for such enhancements is also fairly clear: If there's an accident, you can go after the manufacturer of the car, or of the collision detection system. "[T]he driver is presumed to be in control of his or her vehicle, but if the driver feels that there have been some facts supporting the notion that the equipment caused in whole or in part the accident, that driver would probably bring in the manufacturer of the equipment," David Snyder, vice president and associate general counsel at the American Insurance Association, told Wired.
Even so, the driver remains in control and could be found partially responsible even if an "active safety feature" failed. Drivers are still expected to drive safely and not rely exclusively on the computer to keep a safe distance or watch for obstacles.
So even if you do get your Model D next year, you still have to pay attention. Musk has warned you: "We're not asserting that the car is capable of driving in the absence of driver oversight."
Related Resources:
- The Important Details of Tesla's Model D and How Autopilot Works (Gigaom)
- Video: This Is What It's Like to Have Your Tesla Drive You Around by Itself (Boy Genius Report)
- The Internet of Insecure Things: Hacking 'Smart Devices' (FindLaw's Technologist)
- Apple Can't Decrypt Data for Law Enforcement; Is It Enough? (FindLaw's Technologist)