Are Driver-Assistance Cars Safe?
On June 15, a federal agency released the first-ever reports measuring car accidents involving driver-assistance technology.
The National Highway Traffic Safety Administration (NHTSA) provided firm numbers: 392 crashes involving vehicles equipped with driver-assistance systems and another 130 involving driverless vehicles, over a 10-month period. But instead of providing answers about the safety of the technology, the reports have mostly sown confusion.
Are cars equipped with Advanced Driver Assistance Systems (ADAS) safer than those without them? Or do they make our roads more dangerous? The reports provide no answers, even though vehicles using this technology in various forms are already on our roads.
The reports make no claims that these numbers are anything more than a tool to help the agency detect defects as it considers regulations. Still, NHTSA appears to be responding to criticism in some quarters that it has not been assertive enough in regulating driver-assistance technology.
Tesla has been dominant in this area for years with its Autopilot system. Crashes involving Tesla vehicles have made headlines since the first Autopilot-related fatality in 2016. To date, NHTSA has investigated 35 crashes in which Autopilot was suspected of being in use, including nine that resulted in the deaths of 14 people. Only three of those investigations concluded that Autopilot was not to blame.
NHTSA took a broader step in June 2021 with an order requiring manufacturers and operators to report crashes involving ADAS technology. This year's reports detail the responses to that order.
The 5 Levels of Driver-Assistance Technology
NHTSA is primarily interested in a driver-assistance category called SAE Level 2, one of five levels of driving automation defined by the Society of Automotive Engineers. This category includes Tesla's Autopilot.
- Level 1 systems provide a single assistance feature, such as adaptive cruise control, which helps drivers maintain a safe following distance.
- Level 2 systems can take full control of acceleration, braking, and steering, but the driver must be behind the wheel and ready to intervene if the system is not responding properly.
- Level 3 systems can control the vehicle by themselves, although a driver must be present to intervene if necessary. In May, Mercedes-Benz became the first automaker in the world to sell Level 3 cars after Germany gave it the green light. Mercedes-Benz says it is working with regulators in California and Nevada and hopes to be selling Level 3 cars there by the end of this year.
- Level 4 and Level 5 cars require no human driver at all; Level 4 vehicles can operate only within limited areas and conditions, while Level 5 vehicles can operate anywhere. Driverless taxis are considered Level 4 vehicles, and California regulators gave a go-ahead on June 2 for Cruise (a company owned by General Motors) to operate driverless cabs in one area of San Francisco during late-night hours. Competitor Waymo has already been providing limited driverless taxi service in San Francisco and a few other locations, but with a backup driver present.
Why Are Automakers Pushing These Systems?
The auto industry has been pursuing driver-assistance technology for years. Doubters say there's no good reason for it, but automakers and many American politicians point to improved safety as the goal. Market demand is also a big part of the reason: consumers want the systems and make buying decisions based on their availability.
In the end, many predict, all vehicles on the road will be driverless. The assumption is that the vast majority of the roughly 6 million car accidents in this country every year result from human error.
Leaving the job to machines will make it safer for us all. Or so the argument goes.
But we don't know for certain that an entirely driverless fleet of vehicles would be that safe. Will these cars see what human drivers see? Will they make the snap judgments drivers learn from experience, like slowing down when a deer emerges from nearby woods, or recognizing that a ball bouncing into the roadway could mean a child will follow? And what about technical bugs?
At the Intersection of Humans and Machines
Until that day comes, we must determine how the interaction between human drivers and these automated systems is working out. That is why so much attention is now focused on vehicles with Level 2 systems.
Many of the headlines following NHTSA's reports suggested that they cast doubt on automakers' promises of improved safety in vehicles using the new technology. Others, however, contend that 392 recorded crashes is a remarkably low number when you consider there are nearly 6 million total crashes annually.
The problem with the reports is that they provide no basis for comparison. NHTSA identified Tesla as the worst offender, accounting for roughly two-thirds of the SAE Level 2 accidents. But Tesla also apparently has more of these vehicles on the road than other automakers, around 830,000 of them, and the reports don't say how many comparable cars from other companies are in use.
The reporting methods also differ. Tesla reports crashes automatically through vehicle telematics, while other automakers often rely on unverified customer claims.
All Eyes on Tesla
Tesla has taken a hit, which might not be fair, since its cars may be more numerous and its reporting more dutiful. But NHTSA has already had reason to investigate a series of accidents in which Autopilot-enabled Teslas plowed into police cars, fire trucks, and other emergency vehicles. Those collisions resulted in 17 injuries and one death.
Meanwhile, other studies have found troubling flaws in Teslas. Consumer Reports engineers found that Autopilot's optional lane-change feature was dangerous and that the system could be "tricked" into operating without anybody in the driver's seat.
One of the biggest arguments against driver-assistance technology is that these systems may create greater highway danger by lulling drivers into inattention. Last year, an MIT study concluded that drivers do pay less attention to the road when Tesla's Autopilot is on.
Safety experts argue that those drivers are then unprepared to take action if the system malfunctions or a situation emerges that requires their attention.
Tesla's Response
Despite naming the system Autopilot, Tesla tells drivers plainly that it is not a true autopilot. "Autopilot is a hands-on driver assistance system that is intended to be used only with a fully attentive driver," the company tells prospective purchasers. "It does not turn Tesla into a self-driving car nor does it make a car autonomous."
Still, Tesla's advertising, which has included the phrase "Full Self-Driving," has drawn the attention of lawmakers who think it dangerously promises prospective buyers more than the system can deliver. Last August, Democratic Sens. Richard Blumenthal of Connecticut and Edward Markey of Massachusetts asked the Federal Trade Commission to investigate Tesla for deceptive marketing and unfair trade practices. On June 9, FTC Chair Lina Khan told Reuters that the issues raised in that letter are "on our radar."
It might be worth keeping in mind that in 2016 the FTC made Volkswagen pay $9.8 billion to misled buyers over unjustified claims about the environmental performance of its diesel cars.
The Road Ahead
When it comes to driver-assistance technology, there is a long way to go before we know how safe these systems are.
No doubt there will be more cases like the one now pending in Los Angeles, in which a Tesla driver ran through a red light while his car was on Autopilot, killing two people in a Honda. The driver, who faces manslaughter charges, blames Tesla and Autopilot. A trial is upcoming, and Tesla is sure to point to the disclaimer it gives all purchasers: Autopilot requires a fully attentive driver.
So, what can we glean from all this confusion? Maybe this: Driver-assistance technologies may provide enhanced safety, but you're still the driver. And drivers have serious responsibilities.
Related Resources:
- Find a Personal Injury Lawyer Near You (FindLaw's Lawyer Directory)
- More Traffic Deaths Despite Pandemic and Fewer Drivers (FindLaw's Law and Daily Life)
- Road Rage Shootings Skyrocket (FindLaw's Law and Daily Life)
- Should Sidewalk Robots Have Legal Rights as Pedestrians? (FindLaw's Law and Daily Life)
Facebook Post
A federal report measuring accidents involving "driver assistance" technology, like Tesla's Autopilot, raises more questions about safety than it answers. As automakers roll out more and more of these systems, how safe — or unsafe — should motorists feel?