Tesla recall: ‘Full Self-Driving’ software may cause crashes

US safety regulators cited concerns about how Tesla’s “Full Self-Driving” system behaves in four roadway situations, saying it “increases the risk of a crash.”
DETROIT – U.S. safety regulators have pressured Tesla to recall nearly 363,000 vehicles with its “Full Self-Driving” system because it misbehaves around intersections and does not always obey posted speed limits.
The recall, part of the National Highway Traffic Safety Administration’s larger investigation into Tesla’s automated driving systems, is the most serious action yet taken against the electric vehicle maker.
It calls into question CEO Elon Musk’s claim that he can prove to regulators that cars equipped with “Full Self-Driving” are safer than human drivers, and that humans almost never have to touch the controls.
Musk at one point promised that a fleet of autonomous robotic vehicles would be in use by 2020. The latest action appears to push that development further into the future.
The safety agency said in documents posted on its website Thursday that Tesla will address the concerns with an online software update in the coming weeks. The documents say Tesla is conducting the recall but disagrees with the agency’s analysis of the problem.
The system, which is being tested on public roads by about 400,000 Tesla owners, takes unsafe actions such as traveling straight through an intersection from a turn-only lane, failing to come to a complete stop at stop signs, or proceeding through an intersection on a yellow traffic light without proper caution, according to NHTSA.
In addition, the system may not respond adequately to changes in posted speed limits, or may not account for the driver’s adjustments in speed, the documents say.
“The FSD beta software allows a vehicle to exceed the speed limit or pass through intersections in an illegal or unpredictable manner, increasing the risk of a collision,” the agency said in the documents.
Musk complained Thursday on Twitter, which he now owns, that calling an over-the-air software update a recall is “outdated and completely wrong!” A message was left Thursday seeking further comment from Tesla, which has disbanded its media relations department.
Tesla received 18 warranty claims that may have been caused by the software between May 2019 and September 12, 2022, the documents said. But the Austin, Texas-based electric vehicle maker told the agency it is not aware of any deaths or injuries.
In a statement, NHTSA said it discovered the problems during testing conducted as part of an investigation into Tesla’s “Full Self-Driving” and “Autopilot” software, which take on some driving tasks. The agency said the investigation remains open and the recall does not address the full scope of what NHTSA is scrutinizing.
Despite the names “Full Self-Driving” and “Autopilot,” Tesla says on its website that the cars cannot drive themselves and owners must be ready to intervene at all times.
NHTSA’s testing found that Tesla’s FSD beta “results in an unreasonable risk to motor vehicle safety by failing to fully comply with traffic safety laws.”
Raj Rajkumar, a professor of computer engineering at Carnegie Mellon University, doubts that Tesla can fix the problems raised by NHTSA with a software update. The automaker relies solely on cameras and artificial intelligence to make driving decisions, he said, a system that will make mistakes.
“Cameras can miss a lot of things,” Rajkumar said. “These are not simple problems to fix. If they could have fixed them, they would have fixed them a long time ago.”
Most other companies with self-driving vehicles use laser and radar sensors in addition to cameras to make sure the vehicles see everything. “A single sensing method is not perfect by any measure,” Rajkumar said.
He questioned whether NHTSA will require the software update to be tested before it is sent out, to make sure it works. The agency said it works closely with automakers as they develop recall remedies “to ensure adequacy.”
In the documents, NHTSA said that on January 25, as part of regular communication with Tesla, it informed the automaker of its concerns with FSD and asked Tesla to conduct a recall. On February 7, Tesla decided to carry out the recall out of an abundance of caution, “while disagreeing with the agency’s analysis.”
The recall is the latest in a string of problems Tesla has had with the US government. In January, the company disclosed that the US Department of Justice had asked Tesla to provide documents on “Full Self-Driving” and “Autopilot.”
NHTSA has been investigating Tesla’s automated systems since June 2016, when a driver using Autopilot was killed after his Tesla went under a tractor-trailer crossing a road in Florida. A separate investigation into Teslas that were using Autopilot when they crashed into emergency vehicles began in August 2021. At least 14 Teslas have crashed into emergency vehicles while using the Autopilot system.
NHTSA has sent investigators to 35 Tesla crashes in which automated systems were suspected of being in use. Nineteen people died in those crashes, including two motorcyclists.
The agency is also investigating complaints that Tesla vehicles may brake suddenly for no reason.
Since January 2022, Tesla has issued 20 recalls, some of which were required by NHTSA. They include one from January of last year for “Full Self-Driving” vehicles that were programmed to roll through stop signs at slow speeds.
“Full Self-Driving” went on sale in late 2015, and Musk has used the name ever since. It currently costs $15,000 to activate the system.
The recall announced Thursday covers certain 2016 through 2023 Model S and Model X vehicles, as well as 2017 through 2023 Model 3 and 2020 through 2023 Model Y vehicles equipped with the software or awaiting its installation.
Shares of Tesla closed down 5.7% Thursday. The stock is still up about 64% so far this year, recovering some of its heavy losses from 2022.
https://www.king5.com/article/news/nation-world/tesla-recall-full-self-driving-beta-flaws/507-2d117027-141e-4d0c-8c29-b6871ea7a420