A longish thread about why fears of the AI apocalypse are way overblown, why these scenarios appeal in particular to a certain kind of narcissistic man, and what to do about it. 1/15
The near-panic that many people—from Stephen Hawking to Elon Musk to Nick Bostrom—have expressed about the possibility of the machines "taking over" or killing off humanity is actually a *displaced anxiety* about the collapse of the distinction between humans and machines... 2/15
...underpinned by three further assumptions: a Darwinian view of nature (including human nature) as an inherently rapacious force; the belief that "the most intelligent wins"; & the idea that humans are the most intelligent creatures, which is why humans have been "winning" 3/15
If you believe (a) that 'machines' are becoming indistinguishable from the natural order, (b) that the natural order is inherently voracious, and (c) that superior intelligence leads nolens volens to victory, it follows that the voracious super-intelligent machines will win. 4/15
Aside: this is an old concern. In his 1863 essay, "Darwin among the Machines," Samuel Butler wrote: "That the time will come when the machines will hold the real supremacy over the world & its inhabitants is what no person of a truly philosophic mind can for a moment question." 5/15
Once we name these assumptions, however, we can see how shaky this whole edifice is. For example: Is human intelligence really "superior"? Even if it is, does superior intelligence always lead to "victory" in the Darwinian struggle? These are both extremely dubious claims. 6/15
Naming the assumptions also helps explain why the Chinese react with bemusement to AGI fear-factor scenarios. Once you knock out the "red in tooth and claw" view of nature (a phrase from Tennyson's In Memoriam, written before Darwin's Origin but often read as anticipating it), the AI apocalypse scenario lacks plausibility. 7/15
The Chinese do not subscribe to the idea that the natural order conforms to the Darwinian drive for absolute domination, mastery, control, etc. but rather believe that much of the natural order is about the search for harmonious balance between different elements. 8/15
Finally, if we look at all this from a "sociology of knowledge" perspective, we can see why the Superintelligence Apocalypse scenario seems to appeal in particular (indeed, by my estimation, almost exclusively) to extremely smart, professionally hyper-aggressive Western men. 9/15
This is a category of people who are personally invested in the idea that aggression & domination are "natural" (rather than ethically questionable personal traits they happen to be extremely long on), and in the idea that high intelligence = superiority = inevitable domination. 10/15
In other words, the whole discourse of AI apocalypse appeals to and reinforces the narcissistic sense of self of smart, aggressive, successful men. 11/15
Now, all of the foregoing isn't just random philosophical musing. It has immediate and practical implications for #AI organizations. 12/15
#AI shops should make sure that the people in charge of building the AIs are not exclusively people who conform to this particular psychological profile (super-smart hyper-aggressive western males), many of whom will unthinkingly (ha!) seek to create AIs in their own image. 13/15
This is not because we need to worry that these people will build superintelligent AIs that will eat the planet and destroy humanity. It is because these people are sociopaths who will build sociopathic AIs. 14/15
In other words, what we need to worry about is not AI apocalypse but AI sociopathy, and the first step in preventing the latter is to ensure that it's not just high-functioning sociopaths who are in charge of designing these applications. 15/15
Thread by Nils Gilman