The National Highway Traffic Safety Administration (NHTSA) launched an investigation Thursday after Tesla's Full Self-Driving (FSD) software was reportedly involved in four crashes, including one that killed a pedestrian.
The agency said it was investigating whether the system fails to detect and respond appropriately in circumstances where it cannot operate properly.
According to NHTSA, the four reported crashes occurred when FSD was engaged and the vehicle encountered reduced visibility, such as sun glare, fog, or airborne dust. In one of the incidents, a pedestrian was struck and killed.
The investigation covers approximately 2.4 million Tesla vehicles, including most model years of the Model S, Model X, Model 3, Model Y, and Cybertruck.
Tesla's FSD software is intended to let the car operate with “minimal driver intervention,” but the electric-car maker's website states that the feature does not make the vehicle fully autonomous.
The investigation comes a week after Tesla CEO Elon Musk unveiled the company's long-awaited robotaxi. The billionaire presented a fully self-driving vehicle with no steering wheel or pedals at a heavily staged event held at Warner Bros. Studios.
Musk argued that the robotaxis would be safer than human drivers, since the artificial intelligence behind them has been trained on data from numerous vehicles and has accumulated more driving experience than any human.
Tesla did not immediately respond to a request for comment on the NHTSA investigation.