Musk backed a boast of zero ‘self-driving’ Tesla crashes, but data show otherwise

Elon Musk has long used his mighty Twitter megaphone to amplify the idea that Tesla’s automated driving software isn’t just safe – it’s safer than anything a human driver could achieve.

That campaign kicked off in earnest last fall when the electric carmaker expanded its Full Self-Driving “beta” program from a few thousand drivers to a fleet of more than 100,000. The $12,000 feature allows a Tesla to drive itself on highways and neighborhood streets, changing lanes, making turns, and obeying traffic signs and signals.

When critics scolded Musk for testing experimental technology on public roads without trained safety drivers as backup, Santa Monica investment director and Tesla booster Ross Gerber was one of the allies who stood up to defend him.

“There have been no accidents or injuries since the launch of the FSD beta,” Gerber tweeted in January. “Not one. Not one at all.”

Musk replied with a single word: “Yes.”

In fact, by that time dozens of drivers had filed safety complaints with the National Highway Traffic Safety Administration about incidents involving Full Self-Driving, and at least eight of them involved crashes. The complaints are public, available in a database on the NHTSA website.

One driver said FSD “jumped towards a pickup truck” before accelerating into a divider, causing a wreck.

Another said that “the car went into the wrong lane” with FSD engaged “and I was hit by another driver in the lane next to my car.”

YouTube and Twitter are rife with videos revealing FSD misbehavior, including a recent clip that appears to show a Tesla driving itself into the path of an oncoming train. The driver jerked the steering wheel to avoid a head-on collision.

Just how many FSD-related crashes, injuries or deaths have occurred, almost no one other than Tesla can say; NHTSA is investigating several recent fatal crashes in which the system may have been involved. The agency recently began requiring automakers to report serious crashes involving automated and semi-automated technology, but it has yet to release individual crash details to the general public.

Car companies such as Cruise, Waymo, Argo and Zoox equip their vehicles with over-the-air software that alerts the company to a crash immediately. Tesla pioneered such software in passenger cars, but the company, which does not maintain a media relations office, did not respond to questions about whether it receives automatic crash reports from cars running FSD. Automakers without such software must rely on public reports and on communication with drivers and service centers to assess whether an NHTSA report is necessary.

Efforts to reach Musk were also unsuccessful.

Gerber said he was not aware of the crash reports in the NHTSA database when he posted his tweet, but believes the company would know about any collisions. “Because Tesla records everything that happens, Tesla is aware of each incident,” he said. He said it was possible the drivers were at fault in the crashes, but added that he had not reviewed the reports himself.

Exact public statistics on automated-vehicle crashes do not currently exist, because police officers who write crash reports can go only on what drivers tell them. “We’re not experts at how to get that kind of data,” said Amber Davis, a spokeswoman for the California Highway Patrol. “At the end of the day, we ask for the best recollections of how [a crash] happened.”

Mahmood Hikmet, head of research and development at autonomous shuttle company Ohmio, said exactly what data Tesla’s automated driving system collects and transmits to headquarters is known only to the company. He said Musk’s definition of a collision or an accident may differ from that of an insurance company or the average person. NHTSA requires crash reports for fully or partially automated vehicles only if someone is injured, an airbag is deployed or a vehicle has to be towed.

The FSD crash reports were first unearthed by FSD critic Taylor Ogan, who runs Snow Bull Capital, a China-oriented hedge fund. The Times separately downloaded and evaluated the data to verify Ogan’s findings.

The data – covering the period from Jan. 1, 2021, to Jan. 16, 2022 – show dozens of safety complaints about FSD, including many reports of phantom braking, in which the car’s automatic emergency braking system slams on the brakes for no apparent reason.
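The complaint data referenced above is publicly downloadable from NHTSA, so the filtering step described here can be reproduced. The sketch below is a minimal illustration of that step, assuming the complaints have already been saved to a local CSV file; the file name (“nhtsa_complaints.csv”) and the “make” and “description” column names are placeholders for however the export is structured, not NHTSA’s official field names.

```python
# Minimal sketch: scan a local export of NHTSA complaint data for Tesla
# complaints whose narrative mentions Full Self-Driving. The file name and
# the "make" / "description" column names are assumptions about how the
# export was saved, not NHTSA's official schema.
import csv

KEYWORDS = ("full self-driving", "full self driving", "fsd")

def tesla_fsd_complaints(path="nhtsa_complaints.csv"):
    """Return complaint rows for Tesla vehicles whose narrative mentions FSD."""
    matches = []
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            if row.get("make", "").strip().lower() != "tesla":
                continue
            narrative = (row.get("description") or "").lower()
            if any(keyword in narrative for keyword in KEYWORDS):
                matches.append(row)
    return matches

if __name__ == "__main__":
    rows = tesla_fsd_complaints()
    crashes = [r for r in rows if "crash" in (r.get("description") or "").lower()]
    print(f"{len(rows)} Tesla complaints mention FSD; {len(crashes)} also mention a crash")
```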

Below are excerpts from eight incident reports in which FSD was involved:

  • Southampton, N.Y.: A Model 3 traveling at 60 mph collided with an SUV parked on the highway. The Tesla was driving itself, “rushing over the side of the SUV, ripping the car’s mirror off.” The driver called Tesla to say “our car has gone crazy.”
  • Houston: A Model 3 was traveling at 35 mph when it suddenly swerved over the curb, damaging the bumper and wheels and flattening a tire. The crash “appears to have been caused by a patch of discolored lines that caused the FSD to misperceive an obstacle it was trying to avoid.” Denying the warranty claim, a Tesla service center charged $2,332.37 and said it would not return the vehicle until the bill was paid.
  • Brea: “During a left turn, the car went into the wrong lane and I was hit by another driver in the lane next to my car.” The vehicle “steered on its own and entered the wrong lane… putting everyone involved at risk. The car was badly damaged on the driver side.”
  • Collettsville, N.C.: “The road curved to the left and when the car took the turn, it turned too wide and went off the road…. The right side of the car went up and over the top of the rock. The right front tire blew out and only the side airbags deployed (both sides). The car traveled about 500 yards along the road and then shut itself off.” Estimated damage was $28,000 to $30,000.
  • Troy, Mo.: A Tesla was rounding a curve when “suddenly, about 40% of the way through the turn, the Model Y straightened the steering wheel and crossed the center line into the oncoming lane. As I tried to get back into my lane, I lost control and slid into a ditch and through the woods, causing significant damage to the vehicle.”
  • Jackson, Mo.: A Model 3 “jerked right toward a pickup, then jerked left toward the median posts while it was accelerating, and FSD wouldn’t turn off.… We had owned this car for only 11 days when the accident happened.”
  • Hercules, Calif.: Phantom braking caused the Tesla to come to a sudden stop, and “the car behind me didn’t react.” The rear-end collision caused “serious damage to the vehicle.”
  • Dallas: “I was driving with full self-driving assistance… a car was in my blind spot, so I tried to overtake it by pulling the wheel. The car sounded an alarm indicating that I was about to hit the left divider. I believe I fought the car to regain control and ended up hitting the left divider, which overturn[ed] the car all the way to the right, crashing into the median.”

Critics say the name Full Self-Driving is a misnomer, and that no vehicle sold to an individual in the U.S. can drive itself. The name is “a complete fantasy,” said New York University professor Meredith Broussard, author of the book “Artificial Unintelligence,” published by MIT Press. “And it’s a safety nightmare.”

California regulations forbid a company from advertising a car as fully self-driving when it is not. The state Department of Motor Vehicles is conducting a review of Tesla’s marketing practices, a review that has entered its second year.

DMV head Steve Gordon has declined to speak publicly on the matter since May 2021. On Wednesday, the department said: “The review is ongoing. Will let you know when we have something to share.”

https://www.latimes.com/business/story/2022-07-14/elon-musk-said-no-self-driving-tesla-had-ever-crashed
