Elon Musk doc tackles Tesla safety, Twitter fans. What to know

If you own a Tesla, or a loved one does, or you’re thinking of buying one, or you share public roads with Tesla cars, you might want to check out the new documentary “Elon Musk’s Crash Course.”

Premiering Friday on FX and Hulu, the harrowing 75-minute program highlights the persistent dangers of Tesla’s automated driving technologies, the company’s lax safety culture, the hype of Musk’s P.T. Barnum-style marketing and the safety regulators who don’t seem to care.

Deeply and accurately reported (I’ve been covering the company since 2016 and can attest to its authenticity), the project, part of “The New York Times Presents” series, will likely stand as a historical artifact of a what-the-hell-were-they-thinking moment.

Central throughout is the story of Joshua Brown, a die-hard Tesla fan and derring-do enthusiast who was killed when his Tesla, traveling at full speed on Autopilot, drove underneath a semi-truck’s trailer on a Florida highway in 2016.

Whatever lessons Tesla learned did not prevent a nearly identical fatal crash, also in Florida, three years later. An unknown number of Autopilot-related accidents have happened since – no one knows how many except Tesla, which can track its cars through their wireless connections – because the government’s decades-old system for collecting accident statistics is ill-suited to the digital age. The company is currently being investigated by federal safety regulators for its cars’ apparent tendency to crash into emergency vehicles parked alongside highways.

Here are four other key takeaways from “Elon Musk’s Crash Course.”

The exterior of Tesla headquarters. “The New York Times Presents: Elon Musk’s Crash Course” premieres Friday. (FX)

1. Tesla’s Autopilot Feature Not Fully Tested, Former Employees Allege

According to several former members of the Autopilot development team featured in the documentary, the pressure to push Autopilot features out to customers quickly, ready or not, was relentless. There was “no deep research phase,” as at other driverless car companies, one engineering program manager says; instead, customers essentially served as expert test drivers.

Testimony from these developers is a standout feature of “Crash Course.” It’s rare to hear from Tesla insiders, because Musk requires employees to sign strict nondisclosure agreements and enforces them with a small army of well-paid attorneys.

When Brown’s vehicle drove under the truck, the company said the system had mistaken the trailer’s white side for bright sky, and it blamed the camera supplier. But inside Tesla, the Autopilot team was still grappling with how to get its software to distinguish a truck crossing the highway from an overhead bridge, software engineer Raven Jiang says: “The rate of success was not high. Personally, it was hard for me to believe that the promise would be fulfilled.”

News coverage of the Theranos scam at the time prompted some soul-searching for Jiang, who left for another job. Akshat Patel, an autonomous-driving engineering program manager, echoes Jiang’s concerns. If anyone sees Tesla as “an example of scientific integrity, public responsibility, and sound and methodical engineering,” Patel says, “it is not.”

2. Fully autonomous Teslas are more science fiction than reality

Tesla currently sells a $12,000 feature called Full Self-Driving, which isn’t fully self-driving at all. There are no fully self-driving cars available to individual buyers.

But that hasn’t stopped Musk from claiming, year after year, that autonomous Teslas are close at hand.

Clip after clip in “Crash Course” showcases his false claims.

2014: “No hands, no feet, nothing,” he says from behind the wheel of a Tesla. “The car can do almost anything.”

2015: Musk tells a crowd he’s “quite confident” full autonomy will arrive within three years, to the point where “you could be asleep the whole time.”

2016: “I think we are basically less than two years away from having full autonomy,” he told journalist Kara Swisher on a conference stage.

2017: “We’re still on track to be able to go cross-country from Los Angeles to New York by the end of the year, fully automated.”

2018: “By the end of next year, self-driving will be at least 100% to 200% safer than a human.”

2019: Buying a car that doesn’t include Autopilot, he says, is like “buying a horse.”

2022: Wearing a black cowboy hat and sunglasses: “The car will take you anywhere you want and will end up being 10 times safer than driving it yourself. It will completely revolutionize the world.”

The documentary also explains how Tesla staged a widely shared video of one of its cars driving itself through the streets of Palo Alto, released months after Brown’s death.

It should be noted that revenue from Autopilot and Full Self-Driving is largely responsible for hitting the compensation targets that have made Musk so wealthy.

Engineer Raven Jiang in “Elon Musk’s Crash Course.” (FX)

3. Musk’s fans don’t hold back. Even on camera

Among his 94 million Twitter followers, Musk has amassed a particularly avid fan base, which “Crash Course” acknowledges in passing. (One tweet reads, “Elon is God.”) But the documentary doesn’t explore why so many people seem smitten with him; that is the realm of speculation, or perhaps psychology.

However, there are a few choice quotes from Tesla fans and Musk supporters who sat for interviews, seemingly oblivious to the irony:

“I think Elon wants to make a mark on the world.”

“Any company would kill to get that level of fandom and devotion.”

“He has the resources that allow him to do things that to any other person would be irresponsible or insane.”

Elon Musk in an image from the documentary “Elon Musk’s Crash Course.” (FX)

4. Regulatory failure is part of the problem

Partway through “Crash Course,” viewers may begin to wonder: Where are the safety regulators?

Great question. The National Highway Traffic Safety Administration investigated the Brown crash and concluded that Autopilot had somehow failed to register the truck crossing in front of the car, but it determined there was no defect, letting Tesla off the hook.

“I was a bit stunned,” New York Times reporter Neal Boudette tells the camera. “The system doesn’t see the tractor-trailer? And that’s not a defect?”

An NHTSA communications official from that time tries to explain: “It’s a little complicated and almost counterintuitive, right? Autopilot didn’t even engage to try to prevent the crash. But the fact of the matter is, Autopilot is not designed to prevent every crash in every circumstance.”

The fact of the matter is that some of the top NHTSA officials in both the Obama and Trump administrations have gone on to take jobs in the driverless car industry.

NHTSA under Biden and Transportation Secretary Pete Buttigieg is getting tougher on Tesla over data access, and its investigations are continuing.

Meanwhile, on May 12, a Tesla traveling at high speed crashed into a construction site in Newport Beach, killing three people. Were Autopilot or Full Self-Driving involved? Police are investigating the incident, and NHTSA has opened its own inquiry.

‘The New York Times Presents: Elon Musk’s Crash Course’

Where: FX

When: 10 p.m. Friday

Streaming: Hulu, any time, starting Friday

Rating: TV-MA (may not be suitable for children under 17)
