The push to bring self-driving cars to US roads got a significant boost on Thursday when the nation’s chief car safety regulator essentially cleared Tesla Motors’ Autopilot system of fault in a fatal 2016 crash.
The US National Highway Traffic Safety Administration found that the owner of a Tesla Model S sedan that drove itself into the side of a truck in May had ignored the manufacturer’s warnings to maintain control even while using the driver-assist function. The agency said it found no defect in the vehicle and wouldn’t issue a recall.
“The auto industry just let out a giant sigh of relief,” said Dave Sullivan, an analyst at consultancy AutoPacific. “This could have started a snowball effect of letting automakers and suppliers become liable for human error.”
The finding concludes the traffic safety authority’s first investigation into the role played by automated driving systems in a fatal crash. It was a win not only for Tesla but for companies from General Motors to Google that have invested billions of dollars in what they see as the future of personal transportation. Safety regulators, too, have backed the nascent industry, giving it the flexibility to develop products that they think could greatly reduce highway deaths.
Tesla CEO Elon Musk called the authority’s report “very positive.” In a Twitter message he highlighted data showing the company’s vehicle crash rate dropped by 38% after it installed its Autosteer system.
“We appreciate the thoroughness of the report and its conclusion,” Tesla said in an e-mailed statement.
Some car-safety advocates have criticised Tesla for what they said was a premature introduction of its Autopilot system and said the safety authority could have taken stronger action.
“If a vehicle could not distinguish between a white truck and the sky, that to me would seem to be a defect,” said Joan Claybrook, traffic safety administrator under President Jimmy Carter and a car safety advocate.
Stephanie Brinley, a senior analyst at IHS Markit’s automotive group, cautioned that it’s too early to draw too many conclusions about self-driving vehicles from the findings.
“This decision does not in and of itself tell us what will happen down the road,” Brinley said. “It’s really too soon.”
Self-driving risks
The US National Highway Traffic Safety Administration didn’t completely absolve self-driving technologies. The agency made several observations about the limits of automated driver aids and the risks associated with how drivers use them.
Automatic braking systems like the one on the Model S, and those increasingly available on other new vehicles, can’t address all crash scenarios, traffic safety administration spokesman Bryan Thomas said. The crash in May that killed Joshua Brown, a former Navy SEAL and Tesla enthusiast, in Florida is an example of that, Thomas said.
The Model S’s sensors couldn’t distinguish the white trailer against a bright sky as the truck made a left turn across the highway. Auto-braking systems are best at preventing rear-end collisions, not the cross-traffic collision that led to Brown’s death, Thomas said.
So-called level 2 automated driver systems like Tesla’s Autopilot, which provide automated driving functions in limited circumstances, continue to require a driver’s “full attention”, Thomas said.
Car makers must anticipate that some drivers will fail to do so and design their systems with the “inattentive driver in mind,” he said. He also signalled that car makers will be expected to provide clearer warnings about the limitations of automated driver aids, saying the traffic safety administration believes “strongly” that “it’s not enough to simply put it in an owner’s manual”.
Since the accident, Tesla has added protections to its software that shut off Autopilot if the system detects the driver isn’t paying attention. The software also emphasises radar over cameras, and Musk has said that change would have made it easier for the car in the crash to detect the truck and might have saved Brown’s life.
The California-based group Consumer Watchdog said Tesla should have been held accountable for the accident.
The traffic safety administration has “wrongly accepted Tesla’s line and blamed the human, rather than the Autopilot technology and Tesla’s aggressive marketing”, John Simpson, the group’s privacy project director, said in an e-mailed release. “The very name Autopilot creates the impression that a Tesla can drive itself. It can’t.”
The National Transportation Safety Board, an independent agency that has no regulatory power, is conducting a parallel investigation of the accident. The safety board is planning to issue its conclusions by early summer, spokesman Christopher O’Neil said.
In spite of the crash, the Tesla Autopilot system appears to have improved the safety of its vehicles overall. Crash rates in Tesla vehicles fell by 38%, to 0,8 per million miles, after installation of the Autosteer system, the traffic safety administration said in its report.
Thomas said Tesla was “fully” cooperative and provided data on what he estimated were “dozens” of Tesla Model S and Model X crashes in which Autopilot was active during the crash or in the 15 seconds prior.
Tesla was able to pull crash data directly from its vehicles, providing the agency access that “would not have been possible just a few years ago or with other automakers”.
Tesla advanced as much as 4,3% in US trading and closed up 2,3% at US$243,76/share, its highest since 28 April. Shares got an early boost from a Morgan Stanley upgrade. — (c) 2017 Bloomberg LP