Safety experts argued that Tesla did not adequately safeguard against the misuse of its autopilot system.
In a Miami courtroom, Mary “Missy” Cummings, an engineering professor at George Mason University, questioned Tesla’s measures to prevent improper use of its Autopilot system. Her testimony is part of a federal trial over a fatal 2019 crash involving a Tesla vehicle in Key Largo, Florida.
Cummings noted that the owner manuals for Tesla vehicles, which contain crucial warnings about the Autopilot system, are not easily accessible to drivers. She testified that Tesla knew it was dealing with a driver who had disregarded warning notifications generated prior to the crash. She also pointed out that Tesla has not adopted geofencing, a technology that limits where driver-assistance features can be engaged and that other carmakers already use.
When plaintiffs’ lawyer Brett Schreiber asked about Tesla’s decision not to implement geofencing in 2019, Cummings said, “I believe they are using it as a way to sell more cars.” Cummings, who previously served as a senior adviser at the National Highway Traffic Safety Administration (NHTSA), is expected to face further questioning from Tesla’s legal team when she returns to the stand.
The trial, expected to last three weeks, is the first federal case to challenge the safety claims made by Tesla CEO Elon Musk. The timing is particularly significant for Tesla as the company prepares to launch its Robotaxi service, which depends heavily on autonomous driving technology.
The lawsuit was brought on behalf of Naibel Benavides Leon, who was killed, and Dillon Angulo, who suffered serious injuries, when a Tesla ran through a T-intersection in Key Largo and struck them. The plaintiffs’ lawyers argue that Tesla’s driver-assistance technology is flawed and that the company failed to adequately warn users about its limitations. Tesla, for its part, contends the crash was caused by driver error, a defense the company has used successfully in two previous California cases involving Autopilot-related accidents.
George McGee, the Model S driver involved in the crash, had been using the driver-assistance system but had dropped his phone and was searching for it on the floor instead of watching the road. The plaintiffs’ attorneys have shown a clip from the car’s camera that demonstrates the system’s ability to identify the edge of the road, stop signs, parked vehicles, and nearby pedestrians.
Tesla argues that the technology available in 2019 was not capable of preventing all crashes, and asserts that McGee alone was negligent, having pressed the accelerator and thereby overridden the vehicle’s adaptive cruise control.
Questioned about a letter Tesla sent to NHTSA, Cummings cited the company’s claim that Autopilot has “the most robust warning against misuse and abuse by drivers of features ever deployed in the automotive industry.”
Cummings has faced backlash from Tesla supporters, and Musk labeled her “very biased against Tesla” when she was appointed as a safety adviser to NHTSA in 2021. She has previously served as an expert witness in at least two other lawsuits against Tesla concerning the Autopilot system.
The professor noted that McGee apparently believed the car was acting as his co-pilot and would handle obstructions on the road. This belief is common among Tesla drivers, she said, many of whom feel they can rely on Autopilot while engaging in other tasks, such as checking their phones.
