Tesla’s 2019 Model S is not a fully automated self-driving vehicle, but two men, aged 59 and 69, who died in a crash on Saturday night appear to have been either under that impression or simply experimenting.

Evidence at the scene in Spring, Texas, suggests that “no-one was driving the vehicle at the time of impact,” said Mark Herman of Harris County Precinct 4, who added that the case was still under investigation.

It was not clear whether the automated driving system was engaged at the time of the crash.

The Tesla hit a tree and caught fire, according to reports. One victim was found in the front passenger seat, the other in the rear passenger seat.

A New York Times report said police indicated that the men’s wives heard them talking about the vehicle’s Autopilot feature just before they left.

Tesla sells driver-assistance systems called Autopilot and Full Self-Driving (FSD), but the names may give customers a false impression of how much the systems can do on their own.

Despite the names, Autopilot and FSD cannot fully control a vehicle under all normal driving conditions.

In Germany, according to a BBC report, Tesla is barred from using the terms “autopilot inclusive” and “full potential for autonomous driving” in its advertising materials.

Tesla chief executive Elon Musk recently tweeted about the company’s self-reported first-quarter data:

“Tesla with Autopilot engaged now approaching 10 times lower chance of accident than average vehicle.” 

Tesla itself has issued such warnings. Multiple reports referenced a letter to the California DMV, sent by Tesla lawyers in late 2020, stating that the Autopilot and FSD systems are not meant to be fully self-driving.

From the Tesla owner’s manual: “The currently enabled features require active driver supervision and do not make the vehicle autonomous.”
