OK, we need to talk about this clip.

Let's start with the easy stuff: @Lebeaucarnews got a basic fact wrong. His own outlet (@CNBC) has reported that NHTSA has opened at least 28 investigations into Tesla crashes, and 24 of those are STILL OPEN TODAY. cnbc.com/2021/04/19/tes…
Obviously the fact that most of NHTSA's Tesla crash investigations are still open casts doubt on @Lebeaucarnews' opinion that the agency exonerated Tesla and blamed the drivers.

But the biggest issue I have here is Phil's framing: this is not a choice between blaming the driver or blaming the system.
By far the most in-depth investigations of Autopilot-involved crashes were conducted by the @NTSB, and in every case they found that the design of the system contributed to misuse and to the crash/death. Not one or the other, but both.

Has anyone actually read these?
In Phil's defense, there has been one NHTSA investigation into Autopilot that exonerated Tesla. Following Josh Brown's death, NHTSA produced a report saying Autopilot not only wasn't at fault, but actually reduced crashes by 40%.

It was comically wrong.
At first glance the report had obvious problems. When an auto safety expert filed FOIA requests for more information about how NHTSA had reached that finding, the agency blocked him. He had to sue to get it, and when he did it was clear why: the analysis was a bad joke. thedrive.com/tech/26455/nht…
NHTSA's inaction on the very specific issues that the NTSB's investigations identified in three separate cases (the lack of operational design domain limits and of camera-based driver monitoring) has left the NTSB extremely frustrated. I encourage you to watch their latest hearing (Feb '20):
Why has the @NTSB identified the problems in the complex interaction between human and machine when @NHTSAgov could not? Easy: the NTSB investigates crashes of all kinds and thus has deep human-factors knowledge and experience; NHTSA is largely a defect-investigating body and doesn't.
That's why NHTSA's only published Autopilot investigation turned out to be an embarrassment, and why nobody, from Elon Musk himself on down, has been able to explain why or how the NTSB's findings were wrong... not once, but three times.

It's because they aren't wrong. They got it right.
Sadly, we live in a time when expertise is secondary to tribal loyalty, where simplistic false dichotomies resonate stronger than complex, nuanced problems, and where people see the internet as a tool to bend reality to their will and not as the ultimate learning tool.
But don't assume that the complex, nuanced answers provided by the NTSB will never cross over into NHTSA. The agency has actually said that systems contributing to foreseeable misuse can be found defective. The tools are there. federalregister.gov/documents/2016…
But in the meantime, people in the media and public discourse need to wrap their heads around the fact that partial automation like Autopilot can lead to situations where both the design of the system and the user are to blame for bad outcomes. It's not black and white.

Thread by E.W. Niedermeyer