Tesla Worker Killed in Crash May Be First Official ‘Full Self-Driving’ Fatality

Evidence suggests that Tesla’s advanced driver-assistance system, Full Self-Driving (FSD), was engaged in the fatal 2022 crash that killed Tesla employee Hans von Ohain in Colorado. If this proves accurate, it would be the first known fatality caused by Elon Musk’s “self-driving” software.

According to a Washington Post report, on May 16, 2022, Hans von Ohain died when his Tesla Model 3 crashed into a tree and burst into flames in Evergreen, Colorado. Von Ohain worked as a recruiter at Tesla and was a big fan of CEO Elon Musk. His passenger, Erik Rossiter, survived the accident.

Rossiter told a 911 dispatcher that von Ohain had activated the Tesla’s “self-driving feature,” causing the car to veer off the road on its own. Rossiter later said in an interview that he believed von Ohain was using the Full Self-Driving feature at the time of the accident.

Full Self-Driving is Tesla’s most advanced driver-assistance technology, designed to guide vehicles along roads from quiet suburbs to crowded cities with little input from the driver. More than 400,000 Tesla owners have access to the FSD software, which is currently in beta testing.

If Rossiter’s account is true, this would likely be the first known fatal accident involving Full Self-Driving. In late 2021, federal regulators began requiring automakers to report crashes involving driver-assistance systems. Since then, the agency has recorded more than 900 crashes involving Tesla EVs, including at least 40 with serious or fatal injuries. Most of those crashes involved Tesla’s simpler Autopilot system.

According to the police report, there were no skid marks at the scene of the Colorado crash, suggesting von Ohain did not apply the brakes before the collision. The car continued to power its wheels after hitting the tree, indicating that an advanced driver-assistance system was active at the time.

An autopsy revealed that von Ohain’s blood alcohol level was more than three times the legal limit. Experts say this level of intoxication would have severely impaired his ability to maintain control of the vehicle. However, von Ohain believed the advanced self-driving features were working, and he may have placed too much confidence in the car’s ability to correct itself.

Tesla has faced mounting complaints about unreliable behavior by its driver-assistance software, including sudden steering and braking. Lawsuits allege that Tesla should also be held liable when its technology causes a crash or fails to prevent one. So far, Tesla has avoided responsibility by insisting that drivers must remain alert and in control at all times.

Breitbart News has reported on dangerous situations caused by Musk’s “full self-driving” software, including a video of a Tesla driving itself through a red light. In another case, a Tesla caused an eight-car pileup on the San Francisco Bay Bridge.

Read more at the Washington Post here.

Lucas Nolan is a reporter for Breitbart News covering free speech and online censorship issues.
