Federal authorities say a “critical safety gap” in Tesla’s Autopilot system has contributed to at least 467 crashes, 13 of them resulting in fatalities and “many more” resulting in serious injuries.

The findings come from a National Highway Traffic Safety Administration analysis of 956 crashes in which Tesla Autopilot was believed to have been in use. The results of the nearly three-year investigation were released Friday.

Tesla’s Autopilot design “has led to predictable misuse and avoidable crashes,” the NHTSA report said. The system does not “sufficiently ensure driver attention and correct use.”

The agency also said it is starting a new study on the effectiveness of the software update Tesla previously released as part of the December recall. This update was intended to correct Autopilot defects that NHTSA identified as part of the same investigation.

The voluntary recall, carried out via an over-the-air software update, covered 2 million Tesla vehicles in the U.S. and was meant specifically to improve driver monitoring systems in Teslas equipped with Autopilot.

NHTSA suggested in its report Friday that the software update was likely inadequate, as more Autopilot-related crashes continue to be reported.

In one recent example, a Tesla driver in Snohomish County, Washington, struck and killed a motorcyclist on April 19, according to records obtained by CNBC and NBC News. The driver told police he was using Autopilot at the time of the collision.

NHTSA’s findings are the latest in a series of reports by regulators and watchdogs questioning the safety of Tesla’s Autopilot technology, which the company has promoted as a key differentiator from other car companies.

On its website, Tesla says Autopilot is designed to reduce driver “strain” through advanced cruise control and self-steering technology.

Tesla had not issued a response to the NHTSA report as of Friday and did not respond to a request for comment sent to its press inbox, its investor relations team and Lars Moravy, the company’s vice president of vehicle engineering.

Following the release of the NHTSA report, Senators Edward J. Markey, D-Mass., and Richard Blumenthal, D-Conn., issued a statement urging federal regulators to require Tesla to limit its Autopilot feature “to the roads it was designed for.”

In the owner’s manual on its website, Tesla warns drivers not to use Autopilot’s Autosteer feature “in areas where bicyclists or pedestrians may be present,” among a host of other warnings.

“We urge the agency to take whatever action is necessary to prevent these vehicles from endangering lives,” the senators said.

Earlier this month, Tesla settled a lawsuit brought by the family of Walter Huang, an Apple engineer and father of two who died in a crash after his Tesla Model X, with Autopilot features engaged, hit a highway barrier. Tesla sought to keep the terms of the settlement from public view.

In the face of these developments, Tesla and CEO Elon Musk signaled this week that they are betting the company’s future on autonomous driving.

“If someone doesn’t believe that Tesla is going to solve autonomy, I think they shouldn’t be an investor in the company,” Musk said during Tesla’s earnings call on Tuesday. He added: “We will and we are.”

Musk has promised customers and shareholders for years that Tesla would be able to turn its existing cars into self-driving vehicles with a software update. However, the company only offers driver assistance systems and has not yet produced self-driving cars.

He has also made safety claims about Tesla’s driver assistance systems without allowing third-party review of the company’s data.

For example, in 2021, Musk claimed in a social media post: “A Tesla with Autopilot on is now approaching 10 times less chance of an accident than the average vehicle.”

Philip Koopman, an automotive safety researcher and associate professor of computer engineering at Carnegie Mellon University, said he views Tesla’s marketing and claims as “autonowashing.” He also said, in response to the NHTSA report, that he hopes Tesla takes the agency’s concerns seriously going forward.

“People are dying because of misplaced trust in Tesla’s Autopilot capabilities. Even simple steps could improve safety,” Koopman said. “Tesla could automatically restrict Autopilot use to intended roads based on map data already in the vehicle. Tesla could improve monitoring so drivers can’t routinely become absorbed in their cellphones while Autopilot is in use.”

A version of this story was published on NBCNews.com.

https://www.cnbc.com/2024/04/26/tesla-autopilot-linked-to-hundreds-of-collisions-has-critical-safety-gap-nhtsa.html