Judge Claims Tesla Knew About Autopilot Danger

A Florida court has allowed a lawsuit to proceed that accuses Tesla of gross negligence and intentional misconduct in connection with a fatal crash involving its Autopilot driver-assistance system.

Stephen Banner, a Tesla owner, was killed in a 2019 crash north of Miami when his Model 3, operating on Autopilot, drove under the trailer of an 18-wheeler that had pulled onto the road, shearing off the car's roof.

Judge Reid Scott said he found evidence suggesting that Tesla "engaged in a marketing campaign that presented the goods as autonomous" and that Elon Musk's public statements about the technology "had a major influence on the belief about the capabilities of the products."

Kim Banner, Banner's wife, accuses Tesla of intentional misconduct and gross negligence in her lawsuit. In his ruling, Scott called the crash "eerily similar" to Tesla's first Autopilot fatality in 2016.

In that 2016 crash, a Model S drove under a semi-truck, shearing off the car's roof and killing the driver. In a blog post about the accident, Tesla said that neither Autopilot nor the driver had recognized the white side of the tractor-trailer against the brightly lit sky, so the brakes were never applied. Because of a rare combination of circumstances, including the trailer's high ride height and its position across the road, the Model S passed under the trailer and the trailer's underside struck the car's windshield.

Last week, however, a California court found that Tesla's driver-assistance software was not at fault in a crash that killed the driver and seriously injured two passengers.

The Palm Beach case has not yet been rescheduled for trial.

Tesla has also drawn criticism over a marketing video published in October 2016 that showed an Autopilot-equipped car navigating a city and claimed the vehicle was "driving itself," because the video was essentially staged.

A Washington Post analysis published in June found that Teslas running Autopilot had been involved in 736 crashes since 2019, 17 of them fatal.