"Autonomy Day" 2019, Tesla influencers claim to have had "Full Self-Driving" rides but when asked by regulators why autonomous miles weren't logged Tesla clarifies that "Full Self-Driving" is actually a Level 2 driver assistance system
Last night it happened again: the influencers claimed "Full Self-Driving is here," only for NHTSA to release a statement saying it had been briefed by Tesla, which apparently told the agency that "Full Self-Driving" is in fact a driver assistance system reuters.com/article/us-tes…
For what it's worth, the SAE guidelines for public road testing (J3018) say that prototype automated driving systems should be classified based on their intended automation level when complete. Needless to say, it's unlikely SAE foresaw consumers doing the testing!
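Not from the thread itself, but here's a minimal Python sketch of that J3018 classification rule as I read it: a prototype is binned by the level it is *intended* to reach when complete, not by what it does today. The function and level summaries are my own illustrative shorthand.

```python
from enum import IntEnum

class SAELevel(IntEnum):
    """SAE J3016 driving automation levels, loosely summarized."""
    L0 = 0  # no driving automation
    L1 = 1  # driver assistance
    L2 = 2  # partial automation -- human must supervise at all times
    L3 = 3  # conditional automation -- system asks the human to take over
    L4 = 4  # high automation within a defined design domain
    L5 = 5  # full automation

def testing_classification(intended_level: SAELevel) -> SAELevel:
    """Per the J3018 rule described above: a prototype under
    public-road testing is classified by the level it is intended
    to reach when complete, not by its current capability."""
    return intended_level

# A system marketed toward eventual full autonomy would be treated as an
# L5 prototype for testing purposes -- triggering the trained test-driver
# expectations -- even while it behaves like L2 on the road.
assert testing_classification(SAELevel.L5) is SAELevel.L5
```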
Creating norms and regulations around driving automation technology is one of the biggest governance challenges we face as a society, and it gets even harder when a company takes an approach nobody foresaw anyone attempting, given the massive reputational risks involved.
That's why we have this mess: the entire AV sector agrees on certain basic assumptions because of the risk that a single crash could profoundly set back this technology. The norms and terminology we have are built on those assumptions, and they break down when a company ignores those risks.

More from @Tweetermeyer

22 Oct
Still trying to wrap my brain around the fact that Elon Musk said he wouldn't put lidar in Teslas even if it were free.
Seriously, the only real problem with lidar is that it's expensive. Otherwise it's an extremely powerful sensor modality that complements vision and radar really well, and more modalities basically always means more safety. If there's some other downside, nobody's mentioned it.
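To make the "more modalities means more safety" point concrete, here's a toy sketch (not any real AV stack): three roughly independent modalities voting on whether an obstacle exists. The function name and failure probability are assumptions for illustration.

```python
# Toy illustration: if each modality independently misses an obstacle
# with probability p = 0.01, a 2-of-3 vote misses only when two fail
# at once -- roughly 3 * p**2 = 0.0003, about 30x fewer misses.
def fused_obstacle(camera: bool, radar: bool, lidar: bool) -> bool:
    """Majority vote across modalities; a single fooled or failed
    sensor can neither invent nor suppress an obstacle on its own."""
    return sum([camera, radar, lidar]) >= 2

assert fused_obstacle(camera=False, radar=True, lidar=True)       # camera fooled, obstacle still seen
assert not fused_obstacle(camera=True, radar=False, lidar=False)  # camera alone can't trigger a phantom stop
```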
What it comes down to is: the question was hypothetical and Tesla won't have lidar in its vehicles, so Musk has to talk it down. If that hurts his credibility with people who actually understand AV technology it's no huge loss... they didn't buy his FSD narrative anyway.
20 Oct
Q: How do you roll out a plan to have untrained customers test unvalidated beta "Full Self-Driving" software on public roads, thereby flouting every industry safety norm around public road testing?

A: Extremely slowly and cautiously, as it should be.
It's really too bad that you can't read @SAEIntl's J3018 "Safety-Relevant Guidance for On-Road Testing of SAE Level 3, 4, and 5 Prototype Automated Driving System (ADS) Operated Vehicles" without paying $83, as it shows how far from industry norms Tesla is sae.org/standards/cont…
For example: J3018 says that "fallback test drivers" engaged in public road testing of L3-5 automated driving systems should be "trained and qualified in test driving," including classroom time, test track training and on-road training... even for late-stage prototype testing.
20 Oct
Fantastic piece here by @HalfonJesse on the complex legal issues around the 2018 crash of an in-development autonomous vehicle. How fitting that this is coming out on the day that Tesla starts to make its customers untrained "safety drivers" for its "Full Self-Driving" system.
Testing unvalidated software on public roads with untrained consumers as the only safety net is the wildest divergence from mainstream practices for public road testing of autonomous vehicles we have ever seen. It's hard not to see it producing a whole crop of Rafaela Vasquezes.
If these fears are warranted, @m_c_elish's concept of a "moral crumple zone" is going to hit the mainstream like a ton of bricks. Stay ahead of the curve by reading her article here: qz.com/461905/when-yo…

and academic paper: papers.ssrn.com/sol3/papers.cf…
13 Oct
Imagine this: in the midst of the space race, with the US and USSR pumping billions into moon landing missions, Boeing suddenly claims it can land people on the moon in a modified 707.

That's roughly how Tesla's Full Self-Driving claims come across to AV developers.
AVs are taking longer to reach maturity than a lot of people expected, and that's using a ton of super high-end cameras (>20 in the new Waymo stack) plus 360-degree coverage from short- and long-range lidar and high-performance radar. Typical sensor suites cost six figures per car!
To see Tesla's claims as credible, we're forced to believe that the companies who have been leading in this technology could be using hardware that costs at least an order of magnitude less... but either don't realize it, or aren't smart enough to make it work.
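The arithmetic behind that claim, with placeholder numbers I've assumed purely to match the thread's wording ("six figures" vs. "an order of magnitude less") — these are not reported prices:

```python
# Illustrative only: both figures are assumptions, not actual costs.
leading_suite_cost = 100_000  # low end of "six figures per car"
camera_first_cost = 10_000    # "at least an order of magnitude less"

print(f"implied hardware cost gap: {leading_suite_cost / camera_first_cost:.0f}x")
# -> implied hardware cost gap: 10x
```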
13 Oct
Tesla has been promising a truly autonomous driving system for four years (as of next Monday) and I for one am looking forward to moving past the "just imagine" phase and seeing an actual system in action. Enough table talk, it's time to actually show your cards... let's see 'em.
Remember all the hype about "Smart Summon"? The buildup was insane, with Musk claiming that it would "really illustrate the value of having a massive fleet"... and then it came out, the flood of videos ranged from hilarious to terrifying and nobody spoke of it again.
Of course, there is a difference this time: people will be using "Full Self-Driving" at higher speeds and in complex situations to see what it's capable of, and the risk of an injury/death and/or an owner being used as a "moral crumple zone" is correspondingly higher.
13 Oct
Adversarial attacks are not likely to be common, but the vulnerability shows how important it is to have diverse and redundant sensor modalities in autonomous vehicles. If you can create a safety risk by fooling one camera, you were in trouble long before the attack.
Tesla's approach to full autonomy is seen as plausible by people who know a little bit about machine learning and absolutely nothing about safety critical systems architecture and functional safety... which is a lot of people these days!
Try doing some basic functional safety thinking yourself: imagine a totally driverless Tesla driving down the road when something goes wrong. For example, a camera goes out due to humidity or mud. What might happen? Play it out.
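A minimal sketch of that "play it out" exercise, with invented states, names, and thresholds — this is one hypothetical fault response, not Tesla's or anyone's actual architecture:

```python
from enum import Enum, auto

class Mode(Enum):
    NOMINAL = auto()
    DEGRADED = auto()      # e.g. lower speed cap, wider following gap
    MINIMAL_RISK = auto()  # controlled stop / pull to the shoulder

def on_camera_fault(healthy_cameras: int, cameras_required: int,
                    backup_modalities: int) -> Mode:
    """Crude fault response. With a redundant modality (lidar/radar)
    still covering the blinded sector, the vehicle can limp along in
    a degraded mode; with cameras alone, it must attempt a
    minimal-risk maneuver -- possibly a stop in live traffic."""
    if healthy_cameras >= cameras_required:
        return Mode.NOMINAL
    if backup_modalities >= 1:
        return Mode.DEGRADED
    return Mode.MINIMAL_RISK

# Camera-only stack, one of eight cameras blinded by mud:
print(on_camera_fault(healthy_cameras=7, cameras_required=8,
                      backup_modalities=0))  # Mode.MINIMAL_RISK
```

Even in this toy version, the camera-only stack has exactly one answer to a single dirty lens: stop the car wherever it happens to be.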