So, Tesla's lawyers saw my tweets from a couple weeks ago about the deposition transcripts and they didn't appreciate them very much.
Here is a video about it:
In Huang vs Tesla, Tesla has filed a request to retroactively and proactively mark all depositions confidential due to this thread and the media fallout from the deposition:
One of the things I pointed to was Ashok Elluswamy's apparent lack of knowledge of what an Operational Design Domain is. This is Automated Vehicles 101 stuff, but as Tesla's Head of Autopilot Software, he didn't seem to know it.
Let's be generous and say he just blanked.
Tesla's lawyers, in a legal, publicly accessible filing, claim that the Operational Design Domain is a concept that is ENTIRELY IRRELEVANT to Level 2 systems...
This is Tesla's position.
I'm sorry... WHAT!!!!?!?!
This is factually and provably false.
Here are a couple of screenshots from SAE J3016 (free to download here: sae.org/standards/cont…)
Also, here is the ODD for Autosteer (which you can find in any Tesla manual). Why would they add something "entirely irrelevant" in there?
This is no longer just an isolated incident where a single employee didn't know specific technical jargon - this negligence is company policy and strategy.
Tesla's infamous "Paint It Black" demo video opens with the following title screen.
I want to specifically highlight the phrase: "HE IS NOT DOING ANYTHING"
Let's talk about how this is a lie.
Most people in this space have no doubt heard about the different "levels" of autonomy for vehicles going from Level 0 to Level 5.
If you have 4 minutes, please watch this video:
The levels describe which elements of the Dynamic Driving Task (DDT) are under the control of the human and which are under the control of the system. Each step up in the levels corresponds to the "handing over" of one DDT element from human to machine.
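To make that handover concrete, here's my own rough sketch of the responsibility split in Python. This is a simplification of J3016, not the standard's wording (for example, J3016 splits lateral and longitudinal control, and Level 3 has a "fallback-ready user"):

```python
# Who handles each DDT element at each SAE level (simplified).
# "OEDR" = Object and Event Detection and Response (monitoring the road).

SAE_LEVELS = {
    0: {"vehicle control": "human", "OEDR": "human", "fallback": "human"},
    1: {"vehicle control": "shared (one axis automated)", "OEDR": "human", "fallback": "human"},
    2: {"vehicle control": "system", "OEDR": "human", "fallback": "human"},
    3: {"vehicle control": "system", "OEDR": "system", "fallback": "human, on request"},
    4: {"vehicle control": "system", "OEDR": "system", "fallback": "system, within its ODD"},
    5: {"vehicle control": "system", "OEDR": "system", "fallback": "system, everywhere"},
}

for level, split in SAE_LEVELS.items():
    print(f"Level {level}: {split}")
```

Note that even in this simplified table, the ODD shows up explicitly at Level 4: the system is only responsible within its designed domain.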
The main criticism I've received from Tesla defenders has been that Ashok Elluswamy, Head of Autopilot Software at Tesla, not knowing the term "ODD" is a "gotcha".
I'm going to tell you why this is SO MUCH worse than you think.
Let's put aside any expectation that Ashok should have educated himself about basic terminology in the lead up to becoming the Head of Autopilot Software.
It isn't, but for the sake of argument let's assume that Operational Design Domain (ODD) is an extremely obscure term that no one uses.
Ashok has been the Head of Autopilot Software since mid-2019. We can assume that from this point on, he has been in charge of leading the team and setting how they tackle the goals they've been given.
How do engineers show up at work every day and continue to work on a system which has killed multiple people?
Looking into the deposition of Dhaval Shroff, Autopilot Engineer at Tesla, in relation to the Walter Huang fatality in 2018, you can get a sense of how they justify it.
Human factors seem to be completely ignored by a division at Tesla dedicated to developing driver assistance technology.
By definition this technology is assisting the human driver, but the behaviour of the driver is not even considered by this team - which is WILD.
We saw similar attitudes crop up in the deposition of Ashok Elluswamy, Head of Autopilot Software, where, when questioned about human factors, he responds with "I am a software engineer", as if that gives him an out.
I don't know if I can put into words how terrible this is, but I'll try.
This quote is from a deposition of Ashok Elluswamy, Tesla's Head of Autopilot Software relating to the 2018 fatal Autopilot crash of Walter Huang.
He doesn't know what an Operational Design Domain (ODD) is.
An ODD describes the conditions under which an automated system has been designed to operate. It's one of the MOST IMPORTANT things to have a grasp of when designing that system.
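If it helps to see it in code: an ODD is essentially a set of conditions the system was designed for, and a check against them. The values below are hypothetical, loosely echoing the kind of limits Tesla's own manuals state for Autosteer (divided highways, visible lane markings), not an actual Tesla spec:

```python
from dataclasses import dataclass

@dataclass
class DrivingConditions:
    road_type: str               # e.g. "divided_highway", "city_street"
    lane_markings_visible: bool  # can the system see the lane lines?
    weather: str                 # e.g. "clear", "heavy_rain", "snow"

def within_odd(c: DrivingConditions) -> bool:
    """True only if every condition falls inside the designed domain."""
    return (
        c.road_type == "divided_highway"
        and c.lane_markings_visible
        and c.weather == "clear"
    )

print(within_odd(DrivingConditions("divided_highway", True, "clear")))  # True
print(within_odd(DrivingConditions("city_street", True, "clear")))      # False
```

The whole point is that the designer decides, up front, where the system is fit to operate - which is exactly the knowledge at issue here.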
Here's a quick primer:
It just keeps getting worse as the deposition goes on. But this is the person in charge of this system within Tesla. He should absolutely and totally know what an ODD is and specifically where his software is fit for operation and where it isn't.
Ridiculous but true fact: Zoom, the teleconferencing app, has done more to reduce emissions than electric vehicles.
Electrification of the vehicle fleet isn't the holy grail that a lot of people think it is.
If you truly want to reduce emissions, then you should be looking at changing the way people use transport (or avoiding it altogether) rather than merely improving the fuel source.
If this intrigues you, the concept is called Avoid-Shift-Improve. The wikipedia article is a really quick intro to this style of thinking: en.wikipedia.org/wiki/Avoid-Shi…
First the good: It's good to see Tesla finally co-operating with regulators and doing what they're supposed to in the case of releasing unquestionably unsafe updates.
More of this please.
It's also good to see that only 17 of the 11,704 recalled vehicles have not been patched.
But there are 17 testers who aren't testing the current build and are driving around with dangerous software. Tesla could kick them off the program for this, but they don't seem to be doing that 🤷🏼♂️
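For scale, a quick sanity check on those numbers:

```python
# Patch rate from the figures above: 17 unpatched out of 11,704 vehicles.
total = 11_704
unpatched = 17
patched_rate = (total - unpatched) / total
print(f"{patched_rate:.2%} of recalled vehicles patched")  # 99.85%
```

A 99.85% patch rate is genuinely good - but the remaining 17 are still on public roads.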