Dr. Émile P. Torres
Philosopher & historian of all things related to human extinction. PhD. Co-host of "Dystopia Now" w/ Kate Willett. Musician w/ WonderLost. 🇺🇸
Jul 13 9 tweets 4 min read
I've been screaming about this for years: what many powerful people in Silicon Valley want is to replace the human species with a new population of digital posthumans in the form of AGI. This is the goal, and it's seen by many as the natural next step in cosmic evolution. Short🧵 Here are all the articles I've written about this so far. Pro-extinctionism is not a fringe ideology, but the mainstream within certain tech circles. But don't take my word for it -- listen to what these people themselves have said: techpolicy.press/digital-eugeni…
Jun 28 4 tweets 1 min read
God damnit, @nytimes. I have been screaming about this for *years*. The TESCREAL ideologies are *pro-extinctionist*--they do not want our species to exist in the future. Here are just a few recent articles I wrote about this (below). Please take this seriously! truthdig.com/articles/a-tal…
Jun 20 14 tweets 5 min read
What is it about our present moment that has made pro-extinctionism so damn appealing? From the "Efilists" to the TESCREALists, VHEMT to e/acc, more and more people agree that Homo sapiens should soon expire. I wrote an article about this. A short 🧵: First, here's a link to the article: truthdig.com/articles/a-tal…
Jun 5 15 tweets 4 min read
A short thread on the messy divorce drama going on with Trump and Musk. I'll be adding more as the saga continues ... feel free to contribute below.
May 29 14 tweets 3 min read
I'll write a Truthdig article about this soon, but for now it's worth (I think) introducing a distinction that will help make sense of the pro-extinctionism at the heart of the TESCREAL movement. This concerns what I call "terminal" and "final" human extinction. These refer to two distinct extinction scenarios. The first--terminal extinction--would happen if our species were to disappear entirely and forever. The second--final extinction--adds an additional condition. It would happen if our species were to disappear entirely and forever *without* us
May 26 25 tweets 8 min read
I can't stress enough that the whole push to build AGI & the TESCREAL worldview behind it is fundamentally pro-extinctionist. If there is one thing people need to understand, it's this. Here's a short 🧵explaining the idea, starting with this clip of Daniel Kokotajlo: Here's Eliezer Yudkowsky, whose views on "AI safety" have greatly influenced people like Kokotajlo, saying that he's not worried about humanity being replaced with AI posthumans--he's worried that our replacements won't be "better."
May 23 22 tweets 8 min read
“If God manifested himself as a human in the form of Jesus Christ, then why not as a posthuman in the form of superintelligent AI?”

It's happening: Christianity and TESCREALism are merging in the MAGA movement -- or so I argue in my new article for @Truthdig. A 🧵 on this👇 I note that there are two factions within the MAGA movement: traditionalists like Steve Bannon and transhumanists like Elon Musk. The first are Christians, and the second embrace a new religion called "transhumanism" (the "T" in "TESCREAL").

truthdig.com/articles/machi…
Apr 25 18 tweets 6 min read
Silicon Valley is run by people who genuinely think the world as we know it is going to end in the next few decades. Many also WANT this to happen: they WANT the biological world to be replaced by a new digital world. They WANT "posthumans" to take the place of humans. A 🧵: As I write in my newest article (linked below), some scholars refer to evangelical Christian Zionists as the "Armageddon Lobby." But there's a new Armageddon Lobby that's taken hold in Silicon Valley. These people embrace what Rushkoff calls The Mindset.

truthdig.com/articles/the-e…
Apr 5 4 tweets 2 min read
I've now watched most of this. One thing that's striking is the sheer number of assumptions that the authors make--assumptions about extremely complex issues. They also use phrases like "I feel that X" to express some of their conclusions. Reminds me of "The Age of Em," which an influential AI safety researcher once described to me as having the highest bullshit-to-word ratio of any book he'd ever read. In many ways, this report (discussed in the podcast) is worse than theology, although it manages to give one the prima facie impression of rigor.
Feb 7 7 tweets 2 min read
This memorandum is GREAT. It is very, very important--so I highly recommend it. However, the analysis is problematic--NOT because it's incorrect but because it's incomplete. Neoreaction could be seen as a roadmap for how to get to a certain destination. But what is that (short🧵) destination? The answer comes from the TESCREAL worldview, which Dave Troy has written about before (I recommend his article!). The end-goal is a techno-utopian civilization of posthumans spread throughout our entire lightcone. No, I am not kidding--I know this because I used to
Oct 24, 2024 10 tweets 2 min read
I sent a paper of mine to an Oxford philosophy prof earlier this year, and he *loved it*. Told me I should submit it without edits, and that he'd be citing it in future papers of his. So, I submitted to an ethics journal -- desk rejection. I submitted it to another journal, and this time it got reviewed: one reviewer liked it, but opted to reject(??), while the other reviewer said, basically, that the paper is complete trash. Since then, I've sent it out to 5 other journals -- all desk rejects. I'm about ready to post it on SSRN so that this Oxford prof
Apr 6, 2024 25 tweets 7 min read
What Musk and Beff Jezos aren't saying is that Silicon Valley is OVERRUN by human-extinctionist ideologies! The dominant visions of our future among the tech elite, espoused by both Musk and Beff, ARE EXTINCTIONIST. A 🧵 on my newest article for @Truthdig: truthdig.com/articles/team-…
Quoting @elonmusk (Apr 5, 2024): "Too much of the environmentalist movement has morphed into a human extinctionist movement" Reply from Beff Jezos — e/acc (@BasedBeffJezos): "Many such movements... initially noble goal, progressively co-opted by the extinctionist mind virus" This is absolutely crucial for journalists, policymakers, academics, and the general public to understand. Many people in the tech world, especially those working on "AGI," are motivated by a futurological vision in which our species--humanity--has no place. We will either be
Feb 25, 2024 7 tweets 2 min read
Something happened recently that has brought me to tears on several occasions. Basically, person A is struggling with serious health issues, and person B, who is close to person A, has expressed unconditional support for A, no matter how bad things get. This is not normal (!!)—I don’t mean that normatively (a claim about what ought to be), but statistically (a claim about what is the case). Many, many, MANY people--friends, family, partners, etc.--leave and abandon others in times of need. When I was very young, an older relative of mine told me that I
Jan 5, 2024 6 tweets 2 min read
Fascinating. Ghosting in a long-term relationship is, I think, one of the most traumatic experiences one can have. It will never not be the case that I moved to a foreign country for someone who ghosted me when I got sick, after *years* of living together. It's changed my entire worldview, tbh. I wasn't a philosophical pessimist or nihilist when I entered Germany, but--ironically--I left Germany as one. Hard to express how much ghosting has impacted me. Studies, though, suggest that ghosting can harm ghosters, too. More here: truthdig.com/articles/what-…
Dec 16, 2023 26 tweets 8 min read
Last month, Sam Altman was fired from OpenAI. Then he was rehired. Some media reports described the snafu in terms of a power struggle between Effective Altruists and "accelerationists"--in particular, "e/acc." But what is e/acc? How does it relate to EA?

truthdig.com/articles/effec… And what connections, if any, does e/acc have with the TESCREAL bundle of ideologies?

There are two main differences between e/acc and EA longtermism. The first concerns their respective estimates of the probability of extinction if AGI is built in the near future.
Nov 10, 2023 31 tweets 9 min read
The Effective Altruist movement has had a terrible year. Its most prominent member, Sam Bankman-Fried, was just found guilty of all 7 charges against him. But the FTX fiasco wasn't the only scandal that rocked EA. A short 🧵about my newest article:

truthdig.com/articles/effec… One might say that "a single scandal is a tragedy; a million scandals are a statistic." From a PR perspective, it's sometimes *better* to have a whole bunch of scandals than just one major transgression, because people start to lose track of the what and when. Hence, I thought

The Bankman-Fried debacle was just one of many controversies to have embroiled the EA movement over the past year and a half. In fact, there were so many that it’s difficult to keep track. And so we run the risk of the scandals being buried in the collective memory and forgotten. This is a strategy employed — whether consciously or not — by people like Elon Musk and Donald Trump. To modify a line often attributed to Joseph Stalin, one scandal is a tragedy; a million scandals are a statistic — and statistics don’t have the same psychological force or impact that tragedies do.
Jun 11, 2023 21 tweets 6 min read
Curious about all these "warnings" and "statements" signed by "AI experts" that AGI threatens humanity with "extinction"? In this rather philosophical article, I explain what they're talking about -- and it's almost certainly not what you think. A short 🧵

salon.com/2023/06/11/ai-… Could AGI actually cause "human extinction"? That's a "how" question that I'll discuss in another article. What I focus on here is what "human extinction" means in the first place. When TESCREALists say "We must avoid human extinction," what exactly are they talking about?
Jun 10, 2023 7 tweets 4 min read
A short thread of articles that mention the "TESCREAL" acronym. First up, one from @davetroy in the Washington Spectator (@WashSpec):

washingtonspectator.org/understanding-… POTIC (People of Color in Tech, @pocintech) has a really nice discussion of the TESCREAL bundle of ideologies here:

peopleofcolorintech.com/articles/timni…
May 31, 2023 12 tweets 4 min read
From LessWrong. A short 🧵 on the significance of this post and its unsettling question: Why is violence against AI ... After saying he doesn't advocate for violence, Yudkowsky has endorsed nanobots causing property damage to AI labs. "Instant death" refers to the idea that a rogue AGI could create self-replicating atmospheric diamondoid bacteria (what??) that blot out the sun, or something. 🤖🦠 So it is - again - explicit...
May 30, 2023 7 tweets 2 min read
"Extinction" would directly affect the elite, which is why they care about it mitigating "extinction" risks. "Sub-extinction" risks from AI that harm marginalized peoples don't get signatures like this, from Sam Harris, Grimes, and Toby Ord.

Here's a crucial point: if AGI kills everyone, then marginalized groups lose (along with everyone else). Alternatively, if resources are poured into preventing hypothetical sci-fi scenarios involving AGI, marginalized groups ALSO lose, because they'll continue to be ignored. Either way, certain groups will lose.
May 8, 2023 5 tweets 1 min read
nonsense. One doesn't need an authoritarian *state* for eugenics to be profoundly illiberal. If "enhancement" is even possible (Susan Levin provides *many reasons* to think it isn't), those who choose not to be "enhanced" (so-called "legacy humans," lol) will be left behind, thrown into the gutter, abused, exploited, and decimated (perhaps through murder -- just read what Bryan Caplan wrote about "IQ realists"). No one who knows anything about our world today or about human (esp. Western) history could doubt this. People with power do not treat the