US investigators want to know whether Tesla's Autopilot recall did enough to make sure drivers pay attention

The U.S. government's auto safety agency is investigating whether last year's recall of Tesla's Autopilot driving system did enough to make sure drivers pay attention to the road.

The National Highway Traffic Safety Administration says in documents posted on its website Friday that Tesla has reported 20 additional crashes involving Autopilot since the recall. The crashes, along with agency testing, raised concerns about the effectiveness of the recall fix. The recall covered more than 2 million vehicles, nearly all of the vehicles Tesla had sold at the time.

The agency pressed the company into the recall after a two-year investigation into Autopilot's driver monitoring system, which mainly measures torque on the steering wheel from a driver's hands. As part of the investigation, the agency examined multiple cases in which Teslas operating on Autopilot struck emergency vehicles parked along highways.

The recall fix involves an online software update to increase warnings to drivers. However, the agency said in the documents that it found evidence of crashes after the fix, and that Tesla tried to address problems with additional software updates sent out after the recall fix. Those updates may not have worked.

“This investigation will consider why these updates were not a part of the recall or otherwise determined to remedy a defect that poses an unreasonable safety risk,” the agency wrote.

A message seeking comment was left with Tesla early Friday.

NHTSA said Tesla reported the 20 crashes in vehicles that had received the recall software fix. The agency requires Tesla and other automakers to report crashes involving partially and fully automated driving systems.

NHTSA said it will evaluate the recall, including the “prominence and scope” of Autopilot's controls, to address misuse, confusion and use of the system in areas it is not designed for.

It also noted Tesla's statement that owners can decide whether they want to opt in to parts of the recall, and that the company allows drivers to reverse parts of it.

Safety advocates have long expressed concern that Autopilot, which can keep a vehicle in its lane and maintain a distance from objects ahead of it, was not designed to operate on roads other than limited-access highways.

The investigation comes just a week after a Tesla that may have been operating on Autopilot struck and killed a motorcyclist near Seattle, raising questions about whether the recent recall went far enough to ensure that Tesla drivers using Autopilot pay attention to the road.

After the April 19 crash in a suburb about 15 miles (24 kilometers) northeast of Seattle, the driver of a 2022 Tesla Model S told a Washington State Patrol trooper that he was using Autopilot and looking at his cellphone while the Tesla was driving.

“The next thing he knew was a bang and the vehicle spun forward, accelerated and collided with the motorcycle in front of him,” the trooper wrote in a probable cause document.

The 56-year-old driver was arrested for investigation of vehicular homicide “due to admitted inattention while driving in autopilot mode and distracted use of the cell phone while driving forward, trusting the machine to drive for him,” the affidavit says.

The motorcyclist, Jeffrey Nissen, 28, of Stanwood, Washington, was pronounced dead at the scene, authorities said.

Authorities said they had not yet independently verified whether Autopilot was in use at the time of the crash.

On Thursday, NHTSA closed its investigation of Autopilot, citing the recall and its review of the fix's effectiveness. The agency said it found evidence “that Tesla's weak driver engagement system was not appropriate for Autopilot's permissive operating capabilities.”

Tesla, the leading electric vehicle manufacturer, reluctantly agreed to the recall last year after NHTSA determined the driver monitoring system was defective.

The Associated Press reported shortly after the recall that experts said the fix relied on technology that may not work.

Research from NHTSA, the National Transportation Safety Board and other investigators shows that simply measuring torque on the steering wheel does not ensure that a driver is paying sufficient attention. Experts say night-vision cameras are needed to monitor drivers' eyes and ensure they are looking at the road.

Michael Brooks, executive director of the nonprofit Center for Auto Safety, said NHTSA is reviewing where Tesla allows Autopilot to be used.

The company does not limit where the system can be used, even though it was designed to operate on limited-access highways. Tesla, he said, appears to rely on the vehicles' computers to decide whether Autopilot can operate, rather than on maps that show a vehicle's location.

“If you get to the point where you're in an area that the autopilot isn't designed for, and the car knows it's in that area, why is it still allowed to engage?” he asked.

Brooks said NHTSA could seek civil penalties and additional repairs from Tesla.

Government documents filed by Tesla as part of the recall in December say the online software change would increase warnings and alerts to drivers to keep their hands on the steering wheel.

NHTSA began investigating Autopilot in 2021 after receiving 11 reports that Teslas using the system had struck parked emergency vehicles. In documents explaining why the investigation was closed, NHTSA said it ultimately found 467 crashes involving Autopilot, resulting in 54 injuries and 14 deaths.

In investigative documents, NHTSA said it found 75 crashes and one death related to “Full Self-Driving.” It's not clear whether the system was to blame.

CEO Elon Musk has said for several years that “Full Self-Driving” will allow a fleet of robotaxis to generate income for the company and for owners by putting the electric vehicles to use when they would otherwise be parked. Musk has touted self-driving vehicles as a growth catalyst for Tesla since “Full Self-Driving” hardware went on sale late in 2015. The system is being tested by thousands of owners on public roads.

In 2019, Musk promised a fleet of autonomous robotaxis by 2020 that would make Teslas appreciate in value. Instead, the vehicles have declined in value amid price cuts, and the autonomous robotaxis have been delayed year after year while the system is tested by owners and the company gathers road data for its computers.

Tesla says neither of the systems can drive itself and that drivers must be ready to take control at all times.

Neither Musk nor other Tesla executives would say on Tuesday's earnings conference call when they expect Tesla vehicles to drive themselves as well as humans do. Instead, Musk praised the latest version of “Full Self-Driving” and said that “it's only a matter of time before we exceed the reliability of humans, and not much time at all.”

Musk added: “If somebody doesn't believe Tesla is going to solve autonomy, I think they should not be an investor in the company.”
