Federal authorities are investigating whether a Tesla vehicle involved in last week’s crash in Newport Beach, which killed three people and injured three others, had Autopilot activated at the time of the crash.
A special crash investigation team has been dispatched for the May 12 incident on the Coast Highway, the National Highway Traffic Safety Administration said Wednesday.
In that crash, Newport Beach police were called around 12:45 a.m. to the 3000 block of the Coast Highway, where they found a 2022 Tesla Model S sedan that had run off the road and hit construction equipment.
Three people were found dead in the Tesla; last week they were identified as Crystal McCallum, 34, of Texas; Andrew James Chaves, 32, of Arizona; and Wayne Walter Swanson Jr., 40, of Newport Beach, according to the Orange County Sheriff’s Department.
Three construction workers suffered non-life-threatening injuries, police said, adding that the department’s Major Accident Investigation Team had been dispatched.
Tesla, which has disbanded its media relations division, on Wednesday did not respond to a request for comment from The Times about the NHTSA investigation into the Orange County crash.
The federal investigation is part of the agency’s broader inquiry into crashes involving advanced driver-assistance systems such as Tesla’s Autopilot. Investigators have been dispatched to 34 crashes since 2016 in which such systems were in use or suspected to have been in use; 28 of those involved Teslas, according to an NHTSA document released Wednesday.
In those 34 crashes, 15 people were killed and at least 15 others were injured, and all but one of the fatalities occurred in crashes involving Teslas, according to the document.
NHTSA told The Times on Wednesday night that it does not comment on open investigations.
In addition to those crashes, NHTSA is investigating several incidents in which Teslas on Autopilot crashed into emergency vehicles parked along the roadway despite flashing lights or hazard cones, as well as a series of complaints that Autopilot engaged “phantom braking” at high speed for no apparent reason.
NHTSA is also investigating two crashes involving Volvos, one Navya shuttle crash, two Cadillac crashes, one Lexus crash and one Hyundai crash. One of the Volvo crashes involved an Uber self-driving test vehicle that ran over and killed a pedestrian in Arizona in March 2018.
In Los Angeles County, the district attorney’s office filed in January what experts believe is the first felony prosecution in the United States of a driver accused of causing a death while using a partially automated driver-assistance system.
The charges were filed two years after the crash in Gardena. Kevin George Aziz Riad, 27, was at the wheel of a 2016 Tesla Model S operating on Autopilot on December 29, 2019, when it exited a freeway, ran a red light and crashed into a Honda Civic.
The Civic driver, Gilberto Alcazar Lopez, and his passenger, Maria Guadalupe Nieves-Lopez, were killed instantly.
Riad faces two counts of vehicular manslaughter.
Tesla has warned drivers using Autopilot, as well as its so-called Full Self-Driving system, that the cars cannot drive themselves and that drivers must be ready to intervene at all times.
Last June, NHTSA ordered dozens of auto and technology companies to report crash data on automated vehicles to better monitor their safety.
The agency said there are no motor vehicles on the market that can fully drive themselves. Tesla’s Autopilot is classified as a “Level 2” driver-assistance feature, which means the car can control steering and acceleration but the human driver can take full control at any time.
“Whether or not a [Level 2] system is engaged, every available vehicle requires the human driver to be in control at all times, and all state laws hold the human driver responsible for the operation of their vehicle,” an NHTSA spokesperson said. “Certain advanced driver assistance features can promote safety by helping drivers avoid crashes and mitigate the severity of crashes that occur, but as with all technologies and equipment on motor vehicles, drivers must use them correctly and responsibly.”
It’s clear to many legal experts that responsibility for Level 2 systems like Autopilot lies entirely with the driver – not with the companies whose marketing of the technology might lead consumers to believe the features are more capable than they are.
But the California Department of Motor Vehicles is grappling with confusion over Tesla’s Full Self-Driving feature, an advanced version of Autopilot that aims to ultimately do what the name says: offer complete autonomy, with no need for a human driver.
Although other self-driving car developers, such as Waymo and Argo, use trained test drivers who follow strict safety rules, Tesla is conducting its testing with its own customers, charging car owners $12,000 for the privilege.
Other autonomous technology companies are required to report problems and system failures to the DMV under its testing permit system, but the agency has allowed Tesla to opt out of those requirements.
Following pressure from state legislators, fueled by alarming videos on YouTube and Twitter showing the poor performance of Full Self-Driving, the DMV said in January that it was “revisiting” its stance on the Tesla technology.
The agency is also conducting a review to determine whether Tesla has violated another DMV regulation with Full Self-Driving – one that prohibits companies from marketing their cars as autonomous when they are not.
Times staff writer Russ Mitchell and the Associated Press contributed to this report.
Federal agency to investigate fatal Tesla crash in Newport Beach
https://www.latimes.com/california/story/2022-05-18/federal-agency-investigate-fatal-tesla-crash-newport-beach