Tesla finds itself in legal hot water: two upcoming trials will examine whether its Autopilot system played a role in fatal accidents. The outcomes could have ripple effects on public trust in self-driving technologies.

Tesla’s Autopilot and Full Self-Driving technologies still have gray areas that need to be addressed

The first lawsuit goes to trial this September in California. It centers on the death of Micah Lee, whose Tesla Model 3 veered off a highway, struck a palm tree, and caught fire. The plaintiffs, who include injured passengers and Lee’s estate, argue that Tesla knew about Autopilot’s flaws but failed to address them.

Tesla counters that Lee had been drinking and that it is unclear whether Autopilot was engaged at the time of the 2019 crash.

A second case, set for trial in Florida in October, adds fuel to the fire. It involves another fatal accident, in which Stephen Banner’s Model 3 drove under a truck’s trailer, shearing off the car’s roof. The plaintiffs assert that Autopilot failed to take evasive action, and internal documents suggest that Elon Musk and his team were aware of Autopilot’s issues but did not rectify them. Tesla has consistently defended its technology, asserting that Autopilot is safe when supervised by a human driver.

The public, however, is skeptical. Critics argue that the terminology—“Autopilot” and “Full Self-Driving” (FSD)—is misleading. Though Musk has repeatedly promised fully autonomous driving, the reality has fallen short: Tesla’s Autopilot and FSD remain Level 2 driver-assistance features, not fully autonomous systems.

The upcoming trials are more than a legal obstacle for Tesla; they’re a moment of reckoning for the self-driving tech industry. While Tesla pushes forward with its FSD Version 12 software, claiming it enables autonomous driving without human input, the question remains: How safe is “safe enough” when human lives are at stake?
