
Aug 21, 2019, 24 tweets

Thanks to a very instructive exchange with @ESYudkowsky (see a retweet of it), my overall impression of the AI risk crowd has gone from a group of interesting but misled oddballs to a dangerous group of monomaniacs who are as much a threat as a benefit.

@ESYudkowsky I am going to paraphrase here, so @ESYudkowsky should feel free to correct me if I get his views wrong... I have no interest in straw-Vulcaning.

@ESYudkowsky He said, roughly, that there is a 98% chance of something called "Artificial General Intelligence" being developed, roughly a 95% chance it wipes out humanity and roughly a 1-2% chance it falls into the hands of some narrow elite.

@ESYudkowsky AGI is roughly defined as an autonomous system capable of doing science 10 years "ahead of humans". Furthermore, he has such confidence in these things that he is unwilling to also warn of the risk of AGI falling into the hands of narrow elites, like rich Silicon Valley geeks,

@ESYudkowsky even if failing to warn of this causes white supremacists to be attracted to him. This person is the leader of a well-funded group of powerful Silicon Valley geeks admired by large swathes of international tech and economic elites.

@ESYudkowsky What worries me about this set-up? First, it seems to me no more than 40% likely that AGI is even a meaningful concept. The baseline of "humans" against which it is being compared seems a moving target, one that moves with humans' own tech development.

@ESYudkowsky It is not "the human mind" that develops technology but a complex interweaving of many different minds, technologies, communities etc. As technology advances, this community advances too. For AGI to be ahead of this by a lot, it needs to constantly move ahead of this target.

@ESYudkowsky Of course, subsets of this system have the ability to act with some relative independence, assuming various behaviors of other parts of the system. This is what we call "wars" etc. What part of this system would constitute AGI?

@ESYudkowsky What independence would it have to have? How would this even be defined? I was born into a household of tech execs and have been reading about AI constantly since I was born, and now work at @Microsoft and have never been able to get a clear answer to this sort of question.

@ESYudkowsky @Microsoft This doesn't necessarily mean AGI is completely vacuous as a concept, but certainly makes its content hard to pin down and quite dubious.

@ESYudkowsky @Microsoft Furthermore, even if such a thing as AGI does make sense to talk about, the image of it (as in Ex Machina) more or less resulting from the isolated work of some weird geek is pretty wildly implausible based on the history of technology. Technologies of that power and scope

@ESYudkowsky @Microsoft are not just "inventions" or "ideas". They are embodied systems requiring grids, huge numbers of computer cycles, systems of resource management and delivery, etc. Deploying an AGI thus would almost certainly require the involvement of a large chunk of that system

@ESYudkowsky @Microsoft effectively "converting" it into the AGI. Even something as comparatively basic as building nuclear weapons, 75 years after their invention, is something over which we have managed to exercise, collectively as a planet, a shocking amount of control through disciplining resource

@ESYudkowsky @Microsoft flows, etc. Thus creating an AGI of the sort they seem to worry about would have to involve some system that concentrates a ton of power over others in a small number of hands, something that is already very worrying regardless of whether an AGI comes out the other end.

@ESYudkowsky @Microsoft So there could be something in the general vicinity of AGI in my mind, but it would be an object way more complicated and socially embedded than it is imagined to be in most of this discussion. Furthermore, almost certainly along the way towards such a thing, there will be

@ESYudkowsky @Microsoft all sorts of problems, false starts, and warning signs from intermediate systems that will help alert people to what is potentially coming, rather than AGI just emerging and "taking over/wiping out".

@ESYudkowsky @Microsoft All that considered, it seems to me there is a wide range of ways this could play out, and many of them involve subsets of the human system that are not AGI per se getting control of a lot of resources and technology and doing destructive, tyrannical or monomaniacal things with it.

@ESYudkowsky @Microsoft In that context, we should be at least as worried about influential, wealthy and technologically sophisticated monomaniacs overwhelmingly focused on any particular outcome (like AGI) dominating and oppressing the rest of the world in the name of fixing that risk.

@ESYudkowsky @Microsoft In fact, one might well be worried about such a monomaniacal group effectively creating precisely the sort of thing they fear, except a more realistic version. Indeed, that idea is among the most prominent themes in literature... think Star Wars, Lego Story Part 2, etc.

@ESYudkowsky @Microsoft It stuns me that, given this, these folks have so little worry about that scenario that they are not even willing to discuss it or to reflect at all on how they may be part of it.

@ESYudkowsky @Microsoft To me, the AGI problem is just a specific extreme case of the general problem of technocracy... that a narrow technical language makes it impossible for a socio-technical system to see issues beyond its formal language, leading it to run off on an obsession with doing something

@ESYudkowsky @Microsoft that runs against the values of the rest of the world. As I survey the world today, I increasingly think it is precisely the AGI folks who are one of the most powerful groups manifesting this problem. So again, think Daredevil, Anakin Skywalker and Ourmomagedon.

When such characters also show such extreme condescension to anyone outside of their worldview that they offer to pay you $10 simply to repeat back a sentence they said, because they consider you too politically incapacitated from rational thought to do so just because you don't

agree with their monomania... then I really get worried.
