Tesla Autopilot Wrongful Death Lawsuit Filed Over Fatal Crash Involving Self-Driving Software
A family has filed a wrongful death lawsuit against Tesla, alleging that the company misrepresented the safety of its Autopilot feature, which they claim caused a 33-year-old California man's vehicle to crash into the back of a fire truck, resulting in fatal injuries.
Tesla’s Autopilot is an advanced driver assistance system (ADAS) designed to automate certain vehicle functions to enhance convenience and safety. These features include traffic-aware cruise control, which adjusts the car’s speed based on surrounding traffic; Autosteer, which assists with maintaining lane position; Navigate on Autopilot, which provides route guidance; and Auto Lane Change, which handles lane switches on highways.
Despite these capabilities, Autopilot operates as a Level 2 system on the SAE (Society of Automotive Engineers) scale of autonomous driving, which means it provides only partial automation. The system requires continuous human oversight, with drivers expected to keep their hands on the steering wheel and remain ready to assume control at any moment.
In a complaint (PDF) filed in the Superior Court for the State of California in October 2024, plaintiffs Caleb Mendoza, Eduardo Mendoza and Maria Mendoza seek damages on behalf of themselves and their deceased relative, Genesis Giovanni Mendoza Martinez, claiming that Tesla engaged in deceptive marketing practices by suggesting that its Autopilot and Full Self-Driving capabilities were safer than human drivers.
The lawsuit asserts that Tesla’s misrepresentations prompted Giovanni Mendoza Martinez to trust the Autopilot feature, which failed to detect an emergency vehicle, resulting in a fatal collision that also caused injuries to others at the scene. Tesla removed the lawsuit to the U.S. District Court for the Northern District of California on December 4, 2024.
Tesla Wrongful Death Lawsuit
According to the complaint, Genesis Giovanni Mendoza Martinez had been driving his Tesla Model S in Autopilot mode at approximately 71 mph for about 12 minutes on Interstate 680 in California on February 18, 2024, when the vehicle collided with a parked fire truck blocking his lane.
The lawsuit claims that Giovanni frequently relied on the Autopilot feature during freeway travel, influenced by Tesla’s advertising, which led him to believe Autopilot was a safe and fully autonomous driving system.
“[A]t all times relevant to this complaint, Tesla has marketed its ADAS technology under various names, including ‘Autopilot,’ ‘Enhanced Autopilot,’ and/or ‘Full Self-Driving Capability,’ all of which falsely—and intentionally—imply that the vehicles equipped with such software can operate at SAE Levels 3, 4, and 5, when in reality they are SAE Level 2 at best,” the Mendozas’ lawsuit claims.
The Mendozas’ complaint names Tesla Inc. as the defendant, alleging claims of strict product liability, negligent product liability, negligent misrepresentation, fraudulent misrepresentation, concealment, negligent infliction of emotional distress and wrongful death.
The lawsuit seeks not only economic and noneconomic damages but also punitive damages for Tesla’s alleged reckless disregard for safety, willful misrepresentation of its Autopilot capabilities, and conscious decision to prioritize profits over consumer protection, resulting in preventable injuries and fatalities.
In response to the Mendozas’ complaint, Tesla filed a notice of removal (PDF) to the U.S. District Court for the Northern District of California on December 4. The removal was based on diversity of citizenship, with Tesla headquartered in Texas and the plaintiffs residing in California, as well as the plaintiffs’ requested damages exceeding the minimum amount required for federal jurisdiction.
This removal from state court to federal court does not alter the substance of the Mendozas’ complaint; it only changes the venue where the trial may ultimately take place.
Tesla Self-Driving Issues
In October 2024, the U.S. National Highway Traffic Safety Administration (NHTSA) launched an investigation into Tesla’s self-driving software following a series of pedestrian accidents allegedly linked to the company’s Autopilot system. The investigation was prompted by reports of at least four incidents involving pedestrians, raising concerns over the safety of the software, which is installed in more than 2.4 million Tesla vehicles.
This inquiry followed a 2023 NHTSA announcement of a recall affecting over 2 million Tesla vehicles due to issues with an “autosteer” feature that could increase the risk of accidents. The recall was prompted by mounting evidence that Tesla’s Autopilot system was associated with a significantly higher number of crashes than previously disclosed.
NHTSA data showed that Tesla Autopilot features were involved in at least 736 accidents between 2019 and 2023. Prior to that report, it was believed the technology had been involved in fewer than 300 crashes.