"Autopilot failed (did not brake) and I nearly ran into someone at 65 km/h."
Indeed, in accidents, the question of blame is coming up more and more often: Did the driver make a mistake? Or did the Autopilot let him down? In the US, the first jury trials are about to begin, over fatal accidents that the driver assistance system may have caused.
In a particularly high-profile case, a 27-year-old driver is facing manslaughter charges because his Model S crashed into a Honda Civic in a suburb of Los Angeles on December 29, 2019.
Gilberto Lopez and his passenger were killed instantly. They were turning at a green light when a Tesla shot across the intersection at 119 kilometers per hour and crashed into their vehicle.
The Tesla was driving on Autopilot. It did not brake at the red light, nor did it swerve to avoid the Honda. This is documented in the records of the Los Angeles Police Department and in court documents available to Handelsblatt. Both cars skidded several meters across the intersection.
Donald Slavik represents Lopez's family. The survivors are suing not only the Tesla driver, but also the company.
Tesla misled its customers with exaggerated promises about Autopilot, lawyer Slavik argues. The system did not make driving safer, as Musk had promised for years; instead, it "did not slow down the Tesla or steer it away from the other vehicle".
One of the plaintiffs is Lorena Ochoa, the former wife of the Honda driver. Tesla has not contacted her even once, she says. There was no word of regret, no information about why the Model S ran the red light.
Tesla did not answer Handelsblatt's questions about the case either. In letters from its lawyers to the court, the company stated that the accident was "in no way caused by actions or omissions of Tesla."
More lawsuits are looming. Courts in Alameda, San Francisco, and Santa Clara, California are dealing with alleged cases of Tesla Autopilot failures. Handelsblatt has evaluated documents and spoken to those involved. Two of them are the sisters Tammy Neuhaus and Becky Edwards.
Their parents, David and Sheila Brown, died on August 12, 2020, in a Tesla Model 3. The vehicle sped at 183 kilometers per hour in Saratoga, California, ran a red light, and crashed into a Toyota. Handelsblatt was able to view the documents from the subsequent investigation.
According to these documents, David Brown was driving on State Route 17 in good visibility and moderate traffic, with the driver assistance systems activated. At 11:12 AM, eight minutes before the crash, he pulled onto the shoulder "for an unknown reason".
Brown unbuckled his seatbelt and spent four minutes and 14 seconds on the vehicle's onboard computer. Among other things, he called up the Autopilot settings.
Back on the road, according to the data, the Tesla accelerated to 115 kilometers per hour. The automatic collision warning system briefly reduced the speed. Then, the Autopilot allegedly turned off.
The Tesla continued to accelerate. According to the data, the gas pedal was depressed 95 percent when the car clipped the rear of a Toyota Sienna. Brown then drove off the highway, allegedly without braking.
As the Tesla raced toward the intersection and crashed into a Toyota Tundra, the gas pedal registered 100 percent.
David Brown died at the scene of the accident. His wife Sheila was taken to the hospital with multiple broken ribs and a brain hemorrhage. Two days later, she was also dead.
The police informed the National Transportation Safety Board (NTSB) in Washington. The investigators ruled out drugs and health problems as the cause of the accident. The agency's report says the driver must have confused the gas pedal with the brake pedal.
Despite this, Brown's daughters are suing Tesla. Along with their lawyer, Andrew McDevitt, they are looking for answers to open questions.
Why was the gas pedal allegedly 100 percent depressed, but a brief braking attempt was still recorded in the data? Why did Brown drive onto the shoulder minutes before the accident to call up the Autopilot settings?
Tesla argues in court documents that the accident had nothing to do with the Autopilot because it was turned off. The company did not answer Handelsblatt's questions about the case.
It's no secret that Tesla evaluates every accident. The safety report on the Tesla website states that the company can now use "more than nine billion kilometers driven with Autopilot turned on" to understand "the different types of accidents."
Every collision makes the Autopilot a little smarter, every crash brings the company closer to Elon Musk's vision of a Tesla without a steering wheel and gas pedal.
Nothing is as safe as fully autonomous driving, Musk claims. In this regard, accidents are a means to an end. Milestones on the way to a better future.
• • •
Tesla Files (part 2): How the Huge Data Leak Occurred
"The state commissioner has serious indications of possible data protection violations by the automotive group Tesla," confirmed a spokesman for Dagmar Hartge, the state data protection commissioner in Brandenburg.
Tesla's German factory is located in this federal state. The data protection authority in the Netherlands has also been informed about the case. Tesla's European headquarters is located there.
The background is the "Tesla Files": The informant who alerted the authorities also contacted Handelsblatt. Our reporters have spent considerable effort over the past few months reviewing more than 100 gigabytes of data allegedly originating from within Tesla.
Ok, that escalated quickly. Yesterday, when I tweeted about the 'Tesla Files', I had fewer than 10 followers - and yes, they were all bots. Seems like the chief twit hasn't solved the bot issue yet. I've been a quiet observer, but things shifted when I read @handelsblatt's piece.
Now, just to be clear: I have no affiliations with Handelsblatt. I'm just an individual who can read German, has subscriptions to both Handelsblatt and GPT-4 (which handled the translations), and thought it'd be fun to share this in the same format as the Twitter Files.
A massive shout-out to the Handelsblatt team, particularly Michael Verfürden (@mv6) and his squad of 11. Kudos to you for this stellar investigative journalism. Your work merits worldwide attention, and I'm excited to have potentially helped it reach that scope.
Chapter 1: The data, leaked by an unknown source, reveals thousands of complaints about unexpected acceleration and phantom braking in Tesla cars.
"Phantom braking, leaving traces on the road. Need help as soon as possible because I don't feel comfortable driving again."
To this day, Karl has received no explanation. The tables from the Tesla Files as of March 2022 list, among other things, the model, vehicle number, mileage, and the software installed in the car, as well as the status of the respective incident.
In the corresponding column, Karl's incident is marked as "closed". Next to the accidents of Manfred Schon and the doctor from California, it says "unresolved".
"Our car just stopped on the highway. That was pretty scary."
How did the company handle complaints? The Tesla Files shed light on this as well. The files show that employees have precise instructions for communicating with customers. The prime directive, apparently: offer as few openings for attack as possible.
For each incident, there are bullet points for "technical review". The employees who enter this review into the system regularly make clear that the report is "for internal use only".
"Frequent phantom braking on two-lane highways. Makes autopilot almost useless."
How big is the risk for Tesla drivers? The search for an answer to this question leads to a converted cow barn in the Bavarian district of Landsberg am Lech.
This is where Jürgen Zimmermann has his workshop. Up to 700 Teslas roll onto his lift each year, he says. Zimmermann films himself as he inspects the cars, removes wheels, and curses at axle shafts. Hundreds of thousands watch his clips on YouTube.