On the eve of a jury trial expected to delve deeply into Tesla’s controversial “Autopilot” technology, the company has settled a lawsuit filed by the family of an Apple engineer killed on Highway 101 in Mountain View, according to court filings Monday.

Walter Huang, a married father of two from Foster City, died in 2018 after Autopilot steered his Tesla Model X SUV into a freeway barrier while he played a video game on his phone, a federal investigation found.

His wife, Sevonne, sued Tesla in 2019 on behalf of herself and the couple’s two children, seeking unspecified damages. The lawsuit in Santa Clara County Superior Court claimed Huang’s 2017 Tesla “lacked a properly designed system for crash avoidance,” and “was a vehicle that could and would strike and collide with ordinary and foreseeable roadway features in Autopilot mode.”

The case was to go before a jury this week, but Tesla said in court filings that it had reached a deal with Huang’s wife and kids. According to a filing by the electric car maker led by CEO Elon Musk and a lawyer for the Huang family, the settlement amount is confidential. No other terms of the deal were disclosed in the filings. Tesla did not immediately respond to a request for comment.

Tesla’s 2015 deployment of its Autopilot software — standard on every vehicle — has led to a litany of regulatory and legal problems for the company formerly headquartered in Palo Alto and now based in Texas. The basic version provides cruise control and steering assistance, while an enhanced version adds navigation, automated lane changes, and automated freeway exiting.

Central to lawsuits and investigations are questions around whether Tesla’s marketing and the name of Autopilot encourage drivers to take their hands off the wheel and their attention off the road. The system’s ability to recognize stopped emergency vehicles and take appropriate action has also been called into question.

The National Highway Traffic Safety Administration has been investigating Autopilot since August 2021, initially looking into 17 incidents in which a Tesla on Autopilot ran into a parked emergency vehicle on a highway. In 2022, the agency said it had broadened the probe “to explore the degree to which Autopilot and associated Tesla systems may exacerbate human factors or behavioral safety risks by undermining the effectiveness of the driver’s supervision.”

In March of last year, the agency said it was launching a special team to investigate a fatal crash the previous month in which a Tesla Model S sedan collided with a ladder truck from the Contra Costa County Fire Protection District. The driver of the Tesla was killed, a passenger was critically injured, and four firefighters suffered minor injuries.

A 2019 survey by the Insurance Institute for Highway Safety found that Tesla’s Autopilot, more than any other manufacturer’s driver-assistance systems, led people to overestimate system capabilities, with 48% saying they thought it would be safe to take their hands off the wheel while using it.

In 2022, the California Department of Motor Vehicles filed an administrative complaint claiming Tesla deceptively advertised Autopilot and its “full self-driving” system in ways that contradict its own warnings that the features require active driver supervision.

Tesla, on its website, warns that Autopilot is “intended for use with a fully attentive driver, who has their hands on the wheel and is prepared to take over at any moment.”

The first lawsuit to blame Autopilot for a deadly crash — filed by the family of the deceased driver, Micah Lee, and by two passengers seriously injured in the Southern California crash — went to trial last year, with the jury siding with Tesla. Several similar cases are headed for trial this year.

In the Huang case, the software engineer turned on Autopilot about 19 minutes before he crashed into the barrier known as a “gore point,” Tesla said in a March court filing. “He set the speed at 75 mph and (Autopilot) remained engaged until impact,” the filing said. Huang’s hands “were not detected on the steering wheel the last 6 seconds before the crash,” the filing said. “He had received two visual and one audible alerts reminding him to put his hands back on the steering wheel during the drive.”

Tesla acknowledged that its data and analysis showed that four times in the 35 days before Huang’s crash, “Autopilot steered slightly to the left at this same gore area,” and Huang each time “almost immediately corrected the steer by turning the wheel back to the right.” But the company claimed that its data showed that in the days before the incident, Huang had been increasing the time he spent on Autopilot with his hands off the wheel, reaching a point of “extraordinary misuse” of the system.

Tesla in a court filing said it settled the Huang case “to end years of litigation” and that it sought to keep the settlement amount secret because publicizing it could make settling future cases harder for the company.

Source: www.mercurynews.com