Trouble for Tesla’s Self-Driving System
The National Highway Traffic Safety Administration (NHTSA) has opened an investigation into nearly 2.9 million Tesla vehicles equipped with the “Full Self-Driving” (FSD) feature, citing concerns that the technology may violate traffic laws and cause accidents. The agency has cataloged 58 reports in which Teslas allegedly ran red lights, drifted into the wrong lane, or crashed at intersections. Of these incidents, 14 involved collisions and 23 resulted in injuries.
Issues with Traffic Signals and Railroad Crossings
One concerning pattern involves six reports of Teslas running a red light and colliding with another vehicle. A Houston driver said the FSD system “didn’t recognize the lights,” reporting that his car stopped at a green light but drove through a red one. He added that Tesla acknowledged the issue during a test drive but did not fix it. Separately, new complaints allege that FSD-equipped Teslas struggle to navigate railroad crossings safely, in some cases nearly colliding with trains.
Ongoing Regulatory Scrutiny
This isn’t the first time Tesla has faced regulatory inquiries. The company is already the subject of several investigations into its Autopilot and FSD systems. In one notable case, a California jury ordered Tesla to pay $329 million over a fatal crash involving Autopilot. Another investigation examined the limited robotaxi service in Austin, Texas, where riders reported erratic driving and speeding even with a human safety driver present. Meanwhile, Tesla is in a legal battle with the California DMV over allegations of misleading advertising; the agency contends that marketing the software as fully autonomous is deceptive because it requires continuous driver oversight. Tesla recently renamed the feature “Full Self-Driving (Supervised)” to better reflect how it works.
Potential for Future Crashes
Tesla released a new FSD software update shortly before the investigation began. Even so, NHTSA has stated that the system already often “induces vehicle behavior that breaches traffic safety laws.” The investigation is still in its early stages and, depending on its findings, could lead to a recall if the self-driving software is found to pose a safety risk.
What This Means for Drivers
For drivers using Tesla’s FSD, caution is essential. Despite the name, the system is not genuinely autonomous. It’s advisable to:
- Keep both hands on the wheel and stay alert.
- Take manual control when approaching intersections or railroad crossings.
- Regularly check for software updates for potential safety enhancements.
- Report unsafe FSD behavior to NHTSA.
This situation is a reminder that, for now, “self-driving” still means supervised driving.
Key Takeaways
Tesla’s ambitions for a fully autonomous future are encountering challenges. With heightened scrutiny from safety regulators and ongoing lawsuits, the company’s forthcoming actions will be critical in shaping public confidence in AI-enabled transportation. Nonetheless, efforts to advance automation continue.