U.S. auto safety regulators said Friday that an investigation into Tesla’s Autopilot identified at least 13 fatal crashes involving the feature. The investigation also found that the electric-car maker’s claims about the system did not match reality.
The National Highway Traffic Safety Administration (NHTSA) said Friday that its three-year Autopilot safety investigation, opened in August 2021, identified at least 13 Tesla crashes in which one or more people died, and many more in which people were seriously injured. The agency said that “foreseeable misuse of the system by the driver clearly played a role” in those crashes.
It also found evidence that “Tesla’s weak driver engagement system was not adequate for Autopilot’s permissive operating capabilities,” resulting in “significant safety gaps.”
NHTSA also expressed concern that Tesla’s Autopilot name “could lead drivers to believe that the automation has better capabilities than it actually does, leading drivers to place undue trust in the automation.”
In December, Tesla issued its largest-ever recall, covering 2.03 million U.S. vehicles, or nearly all of its cars on U.S. roads, to better ensure that drivers pay attention when using the company’s advanced driver-assistance system.
After concluding the initial investigation, regulators opened a new one to determine whether the recall had adequately addressed Autopilot’s safety shortcomings.
NHTSA said it launched the second investigation after identifying concerns arising from crashes that occurred in vehicles that had already received the recall software update, as well as from “NHTSA’s preliminary testing of remedied vehicles.”
According to NHTSA, the recall investigation includes Model Y, X, S, 3, and Cybertruck vehicles equipped with Autopilot and manufactured in the United States from the 2012 to 2024 model years.
The agency said Tesla has issued software updates to address issues that appear related to its concerns, but has not made those updates part of the recall “or otherwise determined to remedy a defect that poses an unreasonable safety risk.” NHTSA also cited Tesla’s statement that a portion of the remedy requires owners to opt in and allows drivers to readily reverse it.
Tesla said in December that Autopilot’s software system controls “may not be sufficient to prevent driver misuse” and could increase the risk of a crash.
Tesla did not immediately respond to a request for comment.
Consumer Reports, a nonprofit that evaluates products and services, said in February that its testing of Tesla’s Autopilot recall update found the changes did not adequately address many of the safety concerns raised by NHTSA, and it urged the agency to require further steps from the automaker. Tesla’s recall, the organization said, “addresses minor inconveniences rather than fixing the real problems.”
Tesla’s Autopilot is intended to let the car steer, accelerate, and brake automatically within its lane, while enhanced Autopilot can assist with lane changes on highways; neither is meant to make the vehicle self-driving.
One of Autopilot’s components is Autosteer, which maintains a set speed or following distance and works to keep your vehicle within its lane of travel.
Tesla said in December that while it disagreed with NHTSA’s analysis, it would deploy an over-the-air software update that would “incorporate additional controls and alerts to those already existing on affected vehicles to further encourage the driver to adhere to their continuous driving responsibility whenever Autosteer is engaged.”
Ann Carlson, then NHTSA’s acting administrator, said in December that the investigation found more needed to be done to ensure drivers remain engaged when using Autopilot. “One of the things we determined is that drivers are not always paying attention when that system is on,” Carlson said.
NHTSA began investigating Autopilot in August 2021 after identifying more than a dozen crashes in which Teslas collided with stationary emergency vehicles.
Separately, since 2016, NHTSA has opened more than 40 special investigations into Tesla crashes in which driver systems such as Autopilot were suspected of being in use; 23 crash deaths have been reported to date.
Tesla’s recall includes more prominent visual alerts, disabling Autosteer if drivers fail to respond to inattentiveness warnings, and additional checks upon engaging Autosteer. Tesla said it would restrict Autopilot use for one week if serious misuse is detected.
In October, Tesla disclosed that the U.S. Department of Justice had issued subpoenas related to its Full Self-Driving (FSD) and Autopilot features. Reuters reported in October 2022 that Tesla was under criminal investigation.
In February 2023, Tesla recalled 362,000 U.S. vehicles to update its FSD Beta software after NHTSA said the vehicles did not adequately obey traffic safety laws and could cause crashes.