Mahmood Hikmet
Feb 2 · 9 tweets · 3 min read
So, Tesla's lawyers saw my tweets from a couple weeks ago about the deposition transcripts and they didn't appreciate them very much.

Here is a video about it:
In Huang vs Tesla, Tesla has filed a request to retroactively and proactively mark all depositions confidential due to this thread and the media fallout from the deposition:
One of the things I pointed to was Ashok Elluswamy's apparent lack of knowledge of what an Operational Design Domain is. This is Automated Vehicles 101 stuff, but as Tesla's Head of Autopilot Software, he didn't seem to know it.

Let's be generous and say he just blanked.
Tesla's lawyers, in a legal and publicly accessible filing, claim that the operational design domain is a concept that is ENTIRELY IRRELEVANT for Level 2 systems...

This is Tesla's position.

I'm sorry... WHAT!!!!?!?!
This is factually and provably false.

Here are a couple of screenshots from SAE J3016 (free to download from here: sae.org/standards/cont…)
Also, here is the ODD for Autosteer (which you can find in any Tesla manual). Why would they add something "entirely irrelevant" in there?
This is no longer just an isolated incident where a single employee didn't know specific technical jargon - this negligence is company policy and strategy.

Their lawyers just said the quiet part out loud.

This is unheard of in this industry.
All of this is accessible on traffic.scscourt.org/search

Case Number: 19CV346663

You can see my printed and photocopied tweets in all their monochromatic glory.
As always, my DMs are open to anyone (even Tesla's lawyers) looking to genuinely educate themselves on aspects of this industry.

I promise not to make fun of you for stupid questions.


More from @MoodyHikmet

Jan 19
Tesla's infamous "Paint it Black" demo video opens with the following title screen.

I want to specifically highlight the phrase: "HE IS NOT DOING ANYTHING"

Let's talk about how this is a lie.
Most people in this space have no doubt heard about the different "levels" of autonomy for vehicles going from Level 0 to Level 5.

If you have 4 minutes, please watch this video:
The levels describe what elements of the Dynamic Driving Task (DDT) are under control of the human and what parts are under control of the system. Each step up in the levels corresponds with the "handing over" of one DDT element from human to machine.

Tesla Autopilot is Level 2.
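The "handing over" of DDT elements can be sketched as a small table in code. This is my own illustrative mapping (a simplification of SAE J3016, not taken from the standard's text), using three coarse DDT elements:

```python
# Hypothetical sketch, NOT the SAE J3016 text: which party handles each
# (simplified) element of the Dynamic Driving Task at each level.
# Columns: lateral/longitudinal control, object & event detection and
# response (OEDR), and fallback when something goes wrong.
DDT_RESPONSIBILITY = {
    0: ("human",  "human",  "human"),
    1: ("shared", "human",  "human"),   # system steers OR controls speed, not both
    2: ("system", "human",  "human"),   # driver must supervise at all times
    3: ("system", "system", "human"),   # human is the fallback, on request
    4: ("system", "system", "system"),  # no human fallback needed, within the ODD
    5: ("system", "system", "system"),  # like Level 4, but with no ODD restriction
}

def must_driver_supervise(level: int) -> bool:
    """The driver supervises whenever OEDR is still the human's job (Levels 0-2)."""
    control, oedr, fallback = DDT_RESPONSIBILITY[level]
    return oedr == "human"
```

On this sketch, `must_driver_supervise(2)` is `True`: at Level 2 the system steers and controls speed, but watching the road remains entirely the human's responsibility, which is exactly what "HE IS NOT DOING ANYTHING" misrepresents.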
Jan 18
The main criticism I've received from Tesla defenders has been that Ashok Elluswamy, Head of Autopilot Software at Tesla, not knowing the term "ODD" is a "gotcha".

I'm going to tell you why this is SO MUCH worse than you think.

Let's put aside any expectation that Ashok should have educated himself about basic terminology in the lead-up to becoming the Head of Autopilot Software.

Let's even assume, though it isn't the case, that Operational Design Domain (ODD) is an extremely obscure term that no one uses.
Ashok has been the Head of Autopilot Software since mid-2019. We can assume that, from that point on, he has been in charge of leading the team and setting how they tackle the goals they've been given.

It's why he was at the deposition, after all.
Jan 17
How do engineers show up at work every day and continue to work on a system which has killed multiple people?

Looking into the deposition of Dhaval Shroff, Autopilot Engineer at Tesla, in relation to the Walter Huang fatality in 2018, you can get a sense of how they justify it.
Human factors seem to be completely ignored by a division at Tesla dedicated to developing driver assistance technology.

By definition, this technology assists the human driver, yet the behaviour of the driver is not even thought about by this team - which is WILD.
We saw similar attitudes crop up in the deposition of Ashok Elluswamy, Head of Autopilot Software, where, when questioned about human factors, he responds with "I am a software engineer", as if that gives him an out.

Jan 15
I don't know if I can put into words how terrible this is, but I'll try.

This quote is from a deposition of Ashok Elluswamy, Tesla's Head of Autopilot Software, relating to the 2018 fatal Autopilot crash of Walter Huang.

He doesn't know what an Operational Design Domain (ODD) is.
An ODD describes the conditions under which an automated system has been designed to operate. It's one of the MOST IMPORTANT things to have a grasp of when designing that system.

Here's a quick primer:
It just keeps getting worse as the deposition goes on. But this is the person in charge of this system within Tesla. He should absolutely and totally know what an ODD is and specifically where his software is fit for operation and where it isn't.
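To make the concept concrete, an ODD can be thought of as a structured list of design conditions plus a membership check. The sketch below is purely illustrative: the field names and values are invented, not taken from SAE J3016 or any Tesla manual.

```python
# Illustrative sketch only; fields and values are hypothetical, not from
# any standard or manual. An ODD enumerates the conditions a driving
# automation feature was designed to operate under.
from dataclasses import dataclass

@dataclass
class OperationalDesignDomain:
    road_types: set       # e.g. {"divided_highway"}
    max_speed_kph: float  # upper speed bound the feature was designed for
    weather: set          # e.g. {"clear", "light_rain"}

    def contains(self, road_type: str, speed_kph: float, weather: str) -> bool:
        """True only when every current condition falls inside the design domain."""
        return (road_type in self.road_types
                and speed_kph <= self.max_speed_kph
                and weather in self.weather)

# Even a Level 2 feature has an ODD: the manual restricts where it may be
# used, which is exactly the kind of boundary this structure captures.
highway_odd = OperationalDesignDomain(
    road_types={"divided_highway"},
    max_speed_kph=150.0,
    weather={"clear", "light_rain"},
)
```

With this toy ODD, `highway_odd.contains("city_street", 50.0, "clear")` is `False` - the point being that "where is this system fit to operate?" is a question with a precise, checkable answer, not an irrelevant abstraction.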
Mar 11, 2022
Ridiculous but true fact: Zoom, the teleconferencing app, has done more to reduce emissions than electric vehicles.
Electrification of the vehicle fleet isn't the holy grail that a lot of people think it is.

If you truly want to reduce emissions, then you should be looking at changing the way people use transport (or avoiding it altogether) rather than merely improving the fuel source.
If this intrigues you, the concept is called Avoid-Shift-Improve. The wikipedia article is a really quick intro to this style of thinking: en.wikipedia.org/wiki/Avoid-Shi…
Nov 3, 2021
So the first Tesla OTA Recall was issued for FSD Beta 10.3 and it's brought to light a few things 🧵

Recall report: static.nhtsa.gov/odi/rcl/2021/R…
First, the good: it's good to see Tesla finally co-operating with regulators and doing what they're supposed to after releasing an unquestionably unsafe update.

More of this please.
It's also good to see that only 17 of the 11,704 vehicles have not patched.

But there are 17 testers who aren't testing the current build and are driving around with dangerous software. Tesla could kick them off the program for this, but they don't seem to be doing that 🤷🏼‍♂️