
Safety Concerns Are Mounting About Elon Musk’s Tesla ‘Full Self-Driving’ System

Tesla's “full self-driving” system has come under intense scrutiny after accidents and experts raised questions about its safety and readiness for widespread use, even as Elon Musk promotes the company's cars as the robot taxis of the future.

AP News reports that Tesla's vaunted “Full Self-Driving” (FSD) system, currently used by roughly half a million Tesla owners, has raised concerns about its safety and its ability to navigate autonomously. The system, which Tesla claims can get from point A to point B with minimal human intervention, has been involved in several accidents that have attracted the attention of federal regulators.

William Stein, a technology analyst at Truist Securities, tested the latest version of FSD three times in the past four months, and each time, he reported that the vehicle behaved unsafely or illegally, leaving his 16-year-old son “terrified” during a recent test drive. Following Stein's experience and a fatal crash involving an FSD-equipped Tesla in the Seattle area, the National Highway Traffic Safety Administration (NHTSA) launched an investigation into the system.

Tesla CEO Elon Musk has made bold predictions about FSD's capabilities, suggesting it could be safer than human drivers by the end of this year or next. But experts in the autonomous vehicle field are increasingly skeptical that the system can work safely at scale, and many are wondering whether Tesla is close to deploying self-driving robotaxis as Musk predicted.

One of the main concerns experts cite is that Tesla's FSD system relies on cameras and computers that may not be able to accurately detect and identify objects, especially in bad weather or low light. Most other companies developing autonomous vehicles, including Waymo and Cruise, use a combination of cameras, radar and laser sensors to more accurately perceive the environment.

Missy Cummings, a professor of engineering and computing at George Mason University and a prominent Tesla critic, stresses that a car cannot be driven safely by vision alone: even systems incorporating laser and radar technology do not operate reliably in all conditions, raising questions about their overall safety.

Another problem is the lack of common sense and narrow learning capabilities of machine learning systems like FSD. When self-driving cars encounter situations they are not trained to handle, they are prone to crashes, explains Phil Koopman, a professor at Carnegie Mellon University who studies self-driving car safety.

Tesla is facing declining electric vehicle sales, and Musk has urged investors to view the company as a robotics and AI business, but the safety and effectiveness of FSD remain under intense scrutiny. The NHTSA is evaluating information about a fatal crash in Washington state and is investigating whether a recent Tesla recall aimed at improving the driver monitoring system for its self-driving software was successful.

Some Tesla enthusiasts have shared videos of their cars driving themselves without human intervention, but those examples are not a comprehensive representation of the system's long-term performance. Alain Kornhauser, who leads self-driving car research at Princeton University, suggests Tesla might start by launching a small-scale ride-hailing service in areas where it can guide its cars with detailed maps.

The Associated Press contributed to this report.

Read more from AP News here.

Lucas Nolan is a reporter for Breitbart News covering free speech and online censorship.

