Tesla won a victory on Tuesday in a lawsuit over its driver-assistance software. A California jury ruled that Tesla’s Autopilot feature was not at fault for a 2019 crash that killed a Tesla driver and injured two passengers, according to a Reuters report.
The lawsuit claimed Tesla’s driver-assistance software caused the Model 3 that Micah Lee was driving to veer off a California highway and crash into a tree, where it burst into flames. Lindsay Molander and her son Parker Austin, who were both severely injured in the crash, filed the lawsuit, accusing Tesla’s software of causing the steering wheel to turn abruptly.
“They sold the hype and people bought it,” Molander’s attorney, Jonathan Michaels, said in his opening arguments to jurors, The Washington Post reported last month. Tesla “made a decision to put their company over the safety of others.”
Tesla’s attorney argued that the company’s software didn’t cause the crash, calling it instead “classic human error” and describing the Autopilot feature as “basically just fancy cruise control,” the outlet reported.
An autopsy showed Lee had alcohol in his system at the time of the crash, though not enough to exceed California’s legal intoxication limit. Tesla attorney Michael Carey told jurors: “When someone gets in a crash after they have had a few drinks, we don’t accept their excuses,” The New York Times reported. He also argued that it is unclear whether Autopilot was engaged at the time of the crash.
The attorney for Molander and her son argued that Tesla was aware of a software defect that could cause the car to veer without warning and despite human intervention; the company countered that the software wasn’t capable of turning the car as sharply as it had in the accident.
The 12-member jury deliberated for four days and found, by a 9-3 vote, that the vehicle did not have a manufacturing defect and that Tesla was not at fault for the fatal crash.
The company won a similar trial in April, in a lawsuit claiming its Autopilot system was at fault when a Model S swerved into a curb in Los Angeles, injuring the driver.
Tesla’s Autopilot and Full Self-Driving (FSD) systems continue to face legal challenges. Critics argue that the technologies’ names mislead drivers, who may believe the vehicles don’t require human monitoring when the systems are engaged.
Tesla received a subpoena from the U.S. Department of Justice last week demanding all documents pertaining to its Autopilot and FSD features, after the company did not comply with the DOJ’s January request for documents. Separately, the National Highway Traffic Safety Administration opened investigations into Tesla’s autonomous driving software in response to reports that its vehicles crashed into emergency vehicles while in Autopilot mode.