"Vehicle collided with a deer on the highway while the autopilot was activated."
In August 2022, the new book by Scottish philosopher William MacAskill was published: "What We Owe the Future: A Million-Year View." Elon Musk endorsed it, saying, "Worth reading. This aligns quite closely with my philosophy."
MacAskill is a proponent of so-called "longtermism," a school of thought popular in Silicon Valley. It holds that each person should primarily weigh their decisions by how they will impact the world in the distant future.
Short-term problems, such as accumulating car accidents, can thus lose their terror. At the heart of longtermism is the assumption that many more people will live in the future than do today. This legitimizes decisions that may seem questionable in the present.
Critics consider this dangerous. Longtermists are seen as willing to accept extreme sacrifices under the guise of theoretical benefits.
A theoretical example: More than one million people die in road traffic every year worldwide. A car manufacturer claims that its autopilot could prevent 100 billion fatal accidents over the next 100,000 years.
In an extreme interpretation of longtermism, software malfunctions costing a million human lives would be a negligible problem. After all, the number of victims would only correspond to 0.001 percent of those saved.
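The percentage in the example above checks out. A quick sanity check of the article's hypothetical figures (one million malfunction victims against 100 billion claimed prevented fatalities; both numbers come from the text, not from real data):

```python
# Sanity check of the longtermist arithmetic in the hypothetical example.
# Both figures are the article's illustrative numbers, not real data.
lives_lost = 1_000_000             # assumed software-malfunction victims
lives_saved = 100_000_000_000      # claimed fatal accidents prevented over 100,000 years

share = lives_lost / lives_saved * 100
print(f"{share:.3f}%")             # prints "0.001%"
```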
MacAskill applies this logic to reduce famines, natural disasters, and wars to minor dents in human history. Even if the Earth's population were to fall by 90 percent today, the philosopher writes, an estimated 99.5 percent of humanity would still have their lives ahead of them.
Handelsblatt asked Elon Musk whether he finds such calculations appropriate. The Tesla CEO did not respond.
Software engineer Vivek Wadhwa founded his first company in 1997, and in 1999, Forbes magazine named him a "Leader of Tomorrow." In 2013, Time magazine named him one of the 40 most influential minds in tech. That same year, Wadhwa met Elon Musk, who won him over to his cars.
Wadhwa bought a Model S. In 2016, he upgraded to the Model X to enjoy the latest version of Autopilot. Wadhwa was thrilled—until the car drove into his garage without his input. Shortly thereafter, he wanted to demonstrate Autopilot to a TV station.
But on live camera, the Tesla nearly caused a rear-end collision; only Wadhwa hitting the brakes at the last moment prevented it.
Musk is fueling a lie with his prophecies, Wadhwa tells Handelsblatt. The company keeps promising that the problems with the Autopilot will be solved with the next update. But then nothing happens.
Wadhwa: "People are dying because of Tesla's faulty technology. And Elon is trying to get away with it."
• • •
Tesla Files (part 2): How the Huge Data Leak Occurred
"The state commissioner has serious indications of possible data protection violations by the automotive group Tesla," confirmed a spokesman for Dagmar Hartge, the state data protection commissioner in Brandenburg.
Tesla's German factory is located in this federal state. The data protection authority in the Netherlands has also been informed about the case. Tesla's European headquarters is located there.
The background is the "Tesla Files": The informant who alerted the authorities also contacted Handelsblatt. Our reporters have spent considerable effort over the past few months reviewing more than 100 gigabytes of data allegedly originating from within Tesla.
Ok, that escalated quickly. Yesterday, when I tweeted about the 'Tesla Files', I had fewer than 10 followers - and yes, they were all bots. Seems like the chief twit hasn't solved the bot issue yet. I've been a quiet observer, but things shifted when I read @handelsblatt's piece.
Now, just to be clear: I have no affiliations with Handelsblatt. I'm just an individual who can read German, has subscriptions to both Handelsblatt and GPT-4 (which handled the translations), and thought it'd be fun to share this in the same format as the Twitter Files.
A massive shout-out to the Handelsblatt team, particularly Michael Verfürden (@mv6) and his squad of 11. Kudos to you for this stellar investigative journalism. Your work merits worldwide attention, and I'm excited to have potentially helped it reach that scope.
Chapter 1: The data, leaked by an unknown source, reveals thousands of complaints about unexpected acceleration and phantom braking in Tesla cars.
"Phantom braking, leaving traces on the road. Need help as soon as possible because I don't feel comfortable driving again."
To this day, Karl has received no explanation. The tables from the Tesla Files as of March 2022 list, among other things, the model, vehicle number, mileage, and the software installed in the car, as well as the status of the respective incident.
In the corresponding column, Karl's incident is marked as "closed". Next to the accidents of Manfred Schon and the doctor from California, it says "unresolved".
"Our car just stopped on the highway. That was pretty scary."
How did the company handle complaints? The Tesla Files shed light on this as well. The files show that employees have precise instructions for communicating with customers. The prime directive apparently is: leave as few openings for attack as possible.
For each incident, there are bullet points for "technical review". The employees who enter this review into the system regularly make clear that the report is "for internal use only".
"Frequent phantom braking on two-lane highways. Makes autopilot almost useless."
How big is the risk for Tesla drivers? The search for an answer to this question leads to a converted cow barn in the Bavarian district of Landsberg am Lech.
This is where Jürgen Zimmermann has his workshop. Up to 700 Teslas roll onto his lift each year, he says. Zimmermann films as he inspects the cars, removes wheels, and curses axle shafts. Hundreds of thousands watch his clips on YouTube.