Friday, November 29, 2024

Federal investigators are looking into Tesla’s Autopilot recall after 20 accidents

Federal traffic safety investigators want Tesla to tell them how and why it developed the fix it issued as part of a recall of more than 2 million vehicles equipped with the company's partially automated driving system, Autopilot.

Investigators with the U.S. National Highway Traffic Safety Administration have concerns about whether the recall fix worked, as Tesla has reported 20 crashes since releasing the fix as an online software update in December.

The recall fix was also supposed to address the question of whether Autopilot can be used on roads other than limited-access highways. The answer was increased warnings for the driver on roads with intersections.

But in a letter to Tesla posted on the agency's website on Tuesday, investigators wrote that they could find no difference between the warnings given to drivers before the recall and after the new software was released. The agency said it will evaluate whether driver warnings are adequate, especially when a driver monitoring camera is covered.

The agency requested extensive information about how Tesla developed the fix, focusing on how the company used human behavior science to test its effectiveness.

“Inadequate remedial measures”

Phil Koopman, a professor at Carnegie Mellon University who studies automated driving safety, said the letter shows the recall did little to solve the problems with Autopilot and was an attempt to placate NHTSA, which demanded the recall after more than two years of investigation.

“It’s pretty clear to anyone watching that Tesla has tried to provide as little remedial action as possible to see what they can get away with,” Koopman said. “And NHTSA must respond forcefully or other automakers will begin offering inadequate remedies.”

Safety advocates have long expressed concern that Autopilot, which can keep a vehicle in its lane and maintain a set distance from objects in front of it, was not designed to operate on roads other than limited-access highways.

Missy Cummings, a professor of engineering and computer science at George Mason University who studies automated vehicles, said NHTSA is responding to criticism from lawmakers over a perceived lack of action on automated vehicles.

“As cumbersome as our government is, the feedback loop works,” Cummings said. “I think NHTSA leadership now believes this is a problem.”

The 18-page NHTSA letter asks how Tesla used the science of human behavior in developing Autopilot, and how the company views the importance of evaluating human factors.

Tesla must also identify every job that involves evaluating human behavior, along with the qualifications of the workers who hold them. And the letter asks Tesla to say whether those positions still exist.

The Associated Press left a message early Tuesday seeking comment from Tesla about the letter.

Tesla is in the process of laying off about 10% of its workforce, roughly 14,000 people, to cut costs and cope with falling global sales.

Cummings said she suspects that CEO Elon Musk may have laid off anyone with knowledge of human behavior, a key expertise needed for partially automated systems like Autopilot, which cannot drive themselves and require humans to be ready to intervene at all times.

“If you want to have a technology based on human interaction, you better have someone on your team who knows what they are doing in this area,” she said.

Cummings said her research has shown that once a driving system takes over steering from humans, there is little left for the human brain to do. Many drivers tend to rely too heavily on the system and check out.

“You can fix your head in one position, potentially keep your eyes on the road and be a million miles away in your mind,” she said. “All the driver monitoring technology in the world still won’t force you to pay attention.”

Is Autopilot on or off?

In its letter, NHTSA also asks Tesla for information about how the recall remedy addresses driver confusion over whether Autopilot has been shut off when force is applied to the steering wheel. Previously, if Autopilot was deactivated, drivers might not quickly realize that they have to take over driving.

The recall added a feature that provides "stronger deceleration" to alert drivers when Autopilot has been disengaged. But the recall does not turn the feature on automatically; the driver has to do it. Investigators asked how many drivers have taken that step.

NHTSA is essentially asking Tesla, "What do you mean you have a cure and it doesn't actually turn on?" Koopman said.

The letter, he said, shows that NHTSA is checking whether Tesla ran tests to make sure the fixes actually worked. "When I looked at it, I couldn't believe that there was a lot of analysis showing that it improved safety," Koopman said.

The agency also says Tesla made safety updates after the recall was issued, including an attempt to reduce crashes caused by hydroplaning and collisions in high-speed turn lanes. NHTSA said it will evaluate why Tesla did not include the updates in the original recall.

Safety experts say NHTSA could seek additional recalls, limit where Tesla's Autopilot can be used, or even force the company to disable the system until the problem is fixed.

NHTSA began its Autopilot investigation in 2021 after receiving 11 reports that Teslas using Autopilot had struck parked emergency vehicles. In documents explaining why the investigation was closed because of the recall, NHTSA said it ultimately found 467 crashes involving Autopilot, resulting in 54 injuries and 14 deaths.
