Tesla Autopilot killed father of two in Santa Clara County crash, family claims

Jason Bolton turned on Tesla's controversial "Autopilot" system while driving his Model 3 from his home in Fresno to a business meeting in the Bay Area. But just as he reached the highest point of Route 152, the system failed and the car crashed fatally, according to a lawsuit filed against Tesla by his wife and two daughters.

A 2023 Tesla Model 3 is seen in a parking lot after a fatal crash on State Route 152 in Santa Clara County, California, July 26, 2023. The photo was filed on July 16, 2024, as part of a wrongful death lawsuit against Tesla in Santa Clara County Superior Court in San Jose, California, by the family of the deceased man, Jason Bolton of Fresno. (Photo from Santa Clara County Superior Court file)

The black 2023 sedan landed on its roof and then rolled back onto its wheels, according to the lawsuit filed Tuesday in Santa Clara County Superior Court by Bolton's widow, Linda, and their daughters, Rowan and Willow. Photos attached to the lawsuit show the car's roof dented and the windshield and windows blown out.

Bolton, a 49-year-old benefits administrator, "suffered gruesome and ultimately fatal injuries" in the July 2023 crash, the wrongful death lawsuit says.

The family claims in the lawsuit that the electric-car maker released the "Autopilot" system before it was ready for public use out of greed for profit and a "blind pursuit of market dominance." Tesla, led by CEO Elon Musk, used "deceptive marketing tactics" to "intentionally misrepresent" the capabilities and limitations of its technology and "manipulate consumers into believing that ('Autopilot') enables hands-free driving."

Jason Bolton, 49, a married father of two from Fresno, was killed in a Tesla crash on Route 152 in July 2023 (courtesy of MLG Attorneys at Law)

The lawsuit accuses Tesla of “incredible disregard for basic ethical principles and consumer safety.”

The lawsuit follows numerous other lawsuits and state and federal investigations related to the "Autopilot" system, which, despite its name, does not make a car self-driving. The basic system provides cruise control and steering assistance. An advanced version adds navigation and automatic lane changes and exits.

Tesla did not immediately respond to requests for comment on the lawsuit and the regulatory investigations. Owner's manuals advise Autopilot users to keep their hands on the wheel at all times and to "pay attention to road conditions, surrounding traffic and other road users."

In April, on the eve of a jury trial that would have taken a close look at the Autopilot system, Tesla reached an undisclosed settlement in a lawsuit brought by the family of Apple engineer Walter Huang, a married father of two from Foster City who died on Highway 101 in Mountain View in 2018 after Autopilot steered his Tesla Model X SUV into a freeway barrier. Federal investigators had cited Huang's "over-reliance" on Autopilot while he was distracted, likely by a game on his phone. His family's lawyers have noted that there was no conclusive finding that he was playing a game.

At the center of the Autopilot lawsuits and investigations are questions about whether Tesla's marketing and the name "Autopilot" encourage drivers to take their hands off the wheel and their attention off the road. The system's ability to detect stationary emergency vehicles and take appropriate action has also been called into question.

A 2019 survey by the Insurance Institute for Highway Safety found that Tesla's Autopilot, more than any other manufacturer's driver-assist system, led drivers to overestimate its capabilities: 48% of respondents said they thought it was safe to use the system hands-free.

The National Highway Traffic Safety Administration has been investigating Autopilot since August 2021, initially looking into 17 incidents in which a Tesla using Autopilot collided with a parked emergency vehicle on a highway. In 2022, the agency said it had expanded the investigation "to examine the extent to which Autopilot and related Tesla systems may exacerbate human factors or behavioral safety risks by undermining the effectiveness of driver oversight."

In March 2023, the agency announced it had opened a special investigation into a fatal crash the previous month in which a Tesla Model S sedan collided with a Contra Costa County Fire Department ladder truck. The driver of the Tesla was killed, a passenger was seriously injured, and four firefighters suffered minor injuries.

In 2022, the California Department of Motor Vehicles filed an administrative complaint alleging that Tesla misleadingly advertised "Autopilot" and its "Full Self-Driving" system in a way that contradicted its own warnings that these features require active driver supervision.

Tesla recalled nearly all of its 2 million vehicles in the U.S. in December after federal authorities spent two years investigating about 1,000 crashes that occurred while Autopilot was in use. The recall consisted of an over-the-air software update designed to give drivers more warnings if they were inattentive while using Autopilot's Autosteer feature.

Image credit: www.mercurynews.com