A 2017 Tesla Model S sedan in autopilot mode suddenly accelerated on its own as it headed toward a highway offramp, ran off the road and crashed into a tree, according to a lawsuit filed by the driver.
Tesla’s autopilot mode, which the electric carmaker claims allows its vehicles to steer, accelerate and brake automatically in their lanes, is “at best a work in progress,” the lawsuit says.
Christopher Hinze, of Washington, D.C., is seeking an unspecified amount of damages from Tesla for liability, negligence and breach of warranty.
His federal suit says Hinze suffered “catastrophic” injuries, including shattered and fractured vertebrae and chest pain, in the June 20, 2020, accident while driving his friend’s Tesla. The injuries required emergency spinal fusion surgery and weeks of hospitalized recovery time.
“This isn’t an isolated incident,” David Wright, an attorney with the Southern California law firm McCune Wright Arevalo LLP, said in an interview Thursday.
The suit alleges that, unlike some other Tesla crashes involving the autopilot feature, Hinze was “actively and consciously maintaining active supervision of the vehicle, including keeping his hands on the steering wheel,” as Tesla recommends, when the car veered off the roadway.
Hinze activated a turn signal and the car merged into an exit lane, heading toward an interchange from Interstate 495 onto Route 123, which includes a “significant curve,” the suit says.
“Based on Tesla’s representations regarding the Autopilot feature and its performance during the journey so far, (Hinze) reasonably expected that the vehicle would be able to successfully navigate the transition road between the two freeways and proceed with the trip,” the suit says.
But in “a split second at the beginning of the curve,” Hinze recognized the Tesla was not going to reduce speed and make the turn.
Though Tesla says autopilot does not mean “autonomous,” and that drivers must actively supervise the vehicle, the company’s website says the product “enables your car to steer, accelerate and brake automatically within its lane.”
“While fully autonomous driving may still be aspirational, Tesla designs, manufactures, and markets features on the Model S as technologically-advanced, if interim, steps on the road to fully computerized driving,” the suit adds.
“Even the most successful and sophisticated computer companies in history — Microsoft and Apple among them — regularly release computers and software with bugs, glitches, and unanticipated problems that cause their computers to unexpectedly crash, malfunction, or work differently than intended,” the suit continues.
But software and hardware bugs or glitches “are magnified exponentially when a computer controls a half-ton moving machine capable of accelerating from 0 to 60 miles per hour in under 4 seconds,” the suit continues.
“Our primary concern, as always, is making sure the vehicles on our roads are safe, and that they perform the way consumers expect them to perform and the way Tesla represents that they’ll perform,” Wright said.
Tesla didn’t immediately respond Thursday to an email requesting comment.
The complaint claims that many car crashes and “perhaps more than 20 deaths are attributable” to Tesla’s autopilot system, and notes several instances in which Tesla drivers have been killed when the system was engaged, including in the Bay Area.
One of the more widely publicized cases involved a San Mateo man who was driving his Tesla Model X with autopilot engaged on Highway 101 in Mountain View when the car veered left and struck a damaged crash attenuator at roughly 70 mph, killing the driver. The car’s battery also ignited after the crash.
After investigating that crash, the National Transportation Safety Board criticized the driver for likely playing a game on his cell phone with his hands off the steering wheel when the car crashed. It also faulted Tesla for designing an autopilot system with “ineffective monitoring of driver engagement, which facilitated the driver’s complacency and inattentiveness.”
This month’s lawsuit says Tesla CEO Elon Musk has boasted about the autopilot’s capabilities, stating at a recent conference that “essentially complete autonomy” could be accomplished “with the hardware that is in Tesla today” with some software improvements.
“The dark side of the system, however, is that Tesla’s Autopilot system is at best a work in progress,” the suit adds, “and it has a history of dangerous and even fatal consequences for its users.”
Source: www.mercurynews.com