Teslas account for 70% of driver-assist crashes, data show

How safe are automated driving systems? Are some safer than others?

Seven years after Tesla started selling cars equipped with what it calls Autopilot, auto safety regulators still can’t answer these fundamental and important questions.

But the agency took a step toward being able to on Wednesday, with the National Highway Traffic Safety Administration's release of its first report on crashes involving advanced driver assistance systems.

The numbers are suggestive: Tesla accounts for 70% of all reported crashes involving "Level 2" driving systems, which combine adaptive cruise control with automatic lane-keeping and may include more advanced features, such as automatic lane changes. That number is sure to fuel criticism from those who argue that Elon Musk's company has taken a reckless approach to deploying unproven technology.

However, more details and context are needed before regulators can say definitively whether such systems outperform human drivers, or whether some systems outperform others.

“The data may raise more questions than they answer,” NHTSA head Steven Cliff told reporters.

In June 2021, the agency began requiring automakers to report serious crashes involving Level 2 systems. The numbers released Wednesday reflect incidents that occurred from that time through May 15 of this year.

Of all the crashes automakers reported during that period, 392 involved advanced driver assistance systems.

Of those, 273 were reported by Tesla, 90 by Honda and 10 by Subaru; other automakers reported crash counts in the single digits.

“These data provide a limited view of hundreds of collisions,” said Bryant Walker Smith, a professor of autonomous vehicle law at the University of South Carolina School of Law. “But in that same time period, there were literally millions of other crashes.”

But no one should conclude from the data that Level 2 systems are safer than human drivers alone, he said. They may be; they may not. The NHTSA data is too limited to support any such conclusion, he said.

The data does not include how many vehicles with automated systems each company has on the road, or the total miles those vehicles have traveled with a Level 2 system engaged. NHTSA would not comment on the thoroughness of each company's reporting procedures. The agency plans to issue reports monthly.

Insight into the causes of reported crashes (the role played by the driver, by the system, by the system's driver monitoring, and by road conditions) would help safety regulators draw firm conclusions, Smith said.

Last year’s crash data reporting order marked NHTSA’s first attempt to fill a profound knowledge gap about the real-life safety impacts of autonomous vehicle technology on public roads.

Any vehicle manufacturer's automated system could be safer than human drivers, or less safe; data rich enough to draw proper conclusions is scarce. Crash data collection systems in the United States are decades old, inconsistent and still paper-based in many police departments, and they say nothing about what role an automated system may have played in preventing or causing a crash.

Alain Kornhauser, head of the driverless car program at Princeton University, said in an email that no one should expect NHTSA to "do the job" of making the numbers it publishes in summary form truly comparable.

In addition to collecting crash data, NHTSA is investigating why Tesla cars have crashed into emergency vehicles parked on the side of the road, often with emergency lights flashing.

The investigation was prompted by 11 crashes that resulted in 17 injuries and one death, including three in Southern California; the number of such crashes has since risen to 16. The technology in about 830,000 cars, all Teslas sold in the U.S. between 2014 and 2022, is under investigation.

As part of that investigation, regulators will look into the performance of Tesla’s automatic emergency braking system. As The Times reported last year, Tesla drivers report emergency braking problems at a much higher rate than drivers of other automakers.

The emergency-vehicle investigation grew more serious earlier this month, when NHTSA upgraded its status to "EA," for engineering analysis. That category means investigators will take a deeper look at Autopilot's technical design and performance. When an investigation reaches the EA stage, the likelihood of a recall rises.

Meanwhile, the California Department of Motor Vehicles continues to investigate whether Tesla is falsely marketing its Full Self-Driving feature, a $12,000 option. Experts in the field widely agree that the system cannot safely drive itself.

However, that review is more than a year old, and the DMV will not say when it might be completed.

State lawmakers are increasingly concerned about the DMV's seemingly lax approach to Tesla. In December, the chair of the California Senate Transportation Committee, Lena Gonzalez, asked the DMV to provide the committee with accident and safety information. The DMV said it would look into the matter, and is still doing so.

The DMV appears to be allowing Tesla to test self-driving cars on public highways without requiring the company to report crashes or system failures, as competitors such as Waymo, Cruise, Argo and Zoox are required to do. DMV head Steve Gordon has declined all media requests to discuss the topic since May 2021.

https://www.latimes.com/business/story/2022-06-15/tesla-autopilot-crash-report-nhtsa

Edmund DeMarche
