
Tesla Motors is being sued by the family of a 50-year-old Florida man who died in a crash while using the company’s Autopilot advanced driver assistance system. The family of Jeremy Beren Banner is suing Tesla for wrongful death while asking for damages of more than $15,000. A family attorney announced the lawsuit on Thursday, though it has apparently not yet been filed with the Palm Beach County Clerk.

Mr. Banner is the fourth known driver to die while using Tesla’s Autopilot, and his family is the second to sue Tesla over a fatal crash involving the technology.

In May, Tesla was sued by the family of 38-year-old Wei Huang, who died in 2018 after his Model X crashed into an off-ramp divider with Autopilot engaged.

Banner was killed in March of this year while driving along a Florida highway at 68 miles per hour. His Tesla Model 3 collided with a tractor-trailer that was crossing the car’s path, a collision that tore the roof off the car. The vehicle ultimately came to a stop about 1,600 feet from the point of impact.

Tesla’s account of the crash differed slightly. The company said it told the National Transportation Safety Board (NTSB) and the National Highway Traffic Safety Administration that the vehicle’s data logs showed Banner “immediately removed his hands from the wheel.”

This would mean Banner didn’t comply with the company’s instructions that drivers keep their hands on the wheel while using Autopilot.

But the NTSB’s language leaves room for the possibility that Banner had his hands on the wheel when he crashed. Autopilot users often receive a warning to apply pressure to the wheel even when they’re already gripping it, and so the exact order of events remains up in the air.

The NTSB also said that “[n]either the preliminary data nor the videos indicate that the driver or the ADAS executed evasive maneuvers.”

The NTSB’s full investigation is likely to take another year to complete. A lawyer for Banner’s family said that Tesla has video of the crash from the car’s onboard cameras, but it’s unclear whether the family has been given access to that footage.

Tesla also often reminds drivers that they need to supervise Autopilot at all times, though the company still markets and sells an Autopilot package it calls “full self-driving.” Tesla CEO Elon Musk has said in the past that serious crashes involving Autopilot are often the result of the “complacency” of “inexperienced user[s].”

“They just get too used to it. That tends to be more of an issue. It’s not a lack of understanding of what Autopilot can do. It’s [drivers] thinking they know more about Autopilot than they do,” Musk said in 2018.

The circumstances of Banner’s crash very closely resemble those of the first high-profile fatality involving Autopilot. In 2016, 40-year-old Joshua Brown collided with a tractor-trailer that was crossing his path on a Florida highway. Brown was also using Autopilot at the time of his death.

Tesla said in 2016 that its camera system failed to recognize the white broadside of the truck against the bright sky. The NHTSA eventually concluded that Brown was not paying attention to the road, though the NTSB said a lack of safeguards contributed to his death.

The car Brown was driving used a completely different version of Autopilot, one that relied on technology from the Israeli company Mobileye. But the similarities between the two crashes suggest that Tesla still hasn’t addressed Autopilot’s difficulty in recognizing a crossing tractor-trailer, regardless of any fault on the drivers’ part.