The National Highway Traffic Safety Administration (NHTSA) has concluded an investigation into Tesla’s Autopilot driver assistance system after reviewing hundreds of crashes, including 13 fatal accidents that resulted in 14 deaths. The agency determined that these accidents were due to driver misuse of the system.
However, NHTSA also found that “Tesla’s weak driver engagement system was not appropriate for Autopilot’s permissive operating capabilities.” In other words, the software doesn’t do enough to keep the driver’s attention on the road. Drivers using Autopilot or the company’s Full Self-Driving technology “were not sufficiently engaged,” because Tesla “did not adequately ensure that drivers maintained their attention on the driving task.”
The agency investigated nearly 1,000 crashes that occurred between January 2018 and August 2023, which accounted for a total of 29 deaths. NHTSA found there was “insufficient data to make an assessment” for about half (489) of those crashes. In some incidents, the other party was at fault or the Tesla drivers weren’t using Autopilot at the time.
The most serious incidents were 211 crashes in which “the frontal plane of the Tesla struck a vehicle or an obstacle in its path,” and these were often linked to Autopilot or FSD. These crashes resulted in 14 deaths and 49 serious injuries. In 78 of those accidents, the agency found that drivers had enough time to react but failed to do so, neither stopping nor steering to avoid the hazard despite having at least five seconds to act.
This is where the complaints about the software come in. NHTSA says drivers simply became too complacent, assuming the system would handle any hazards, and by the time they needed to react it was too late. “Crashes with no or late evasive action attempted by the driver were found across all Tesla hardware versions and crash circumstances,” the agency wrote. The mismatch between driver expectations and Autopilot’s actual operating capabilities created a “critical safety gap” that led to “predictable misuse and avoidable crashes.”
NHTSA also took issue with Autopilot’s branding, calling it misleading and suggesting it leads drivers to assume the software is in full control. Competing companies tend to use branding with terms like “driver assistance,” whereas Autopilot implies, well, an autonomous pilot. California’s attorney general and the state’s Department of Motor Vehicles are also investigating Tesla over misleading branding and marketing.
Tesla, for its part, says it warns customers to pay attention while using Autopilot and FSD, according to The Verge. The company says the software features regular indicators that remind drivers to keep their hands on the wheel and eyes on the road. NHTSA and other safety groups have said those warnings don’t go far enough and are “insufficient to prevent misuse.” Despite these warnings, CEO Elon Musk recently promised that the company will keep going “balls to the wall for autonomy.”
The findings may represent only a small fraction of the actual number of crashes and accidents involving Autopilot and FSD. NHTSA stated that “gaps in Tesla’s telematic data create uncertainty regarding the actual rate at which vehicles operating with Autopilot engaged are involved in crashes.” This is because Tesla only receives data from certain types of crashes, with NHTSA claiming the company collects data on only about 18 percent of the crashes reported to police.
With all this in mind, the agency has opened another probe into Tesla. This one addresses a recent over-the-air software fix issued in December after two million vehicles were recalled. NHTSA will evaluate whether the Autopilot recall fix Tesla implemented is actually effective enough.
https://www.engadget.com/nhtsa-concludes-tesla-autopilot-investigation-after-linking-the-system-to-14-deaths-161941746.html?src=rss