Thanks to a very instructive exchange with @ESYudkowsky (see a retweet of it), my overall impression of the AI risk crowd has gone from a group of interesting but misled oddballs to a dangerous group of monomaniacs who are as much a threat as a benefit.
I am going to paraphrase here, so @ESYudkowsky should feel free to correct me if I get his views wrong...I have no interest in straw-vulcaning.
He said, roughly, that there is a 98% chance of something called "Artificial General Intelligence" being developed, roughly a 95% chance that it wipes out humanity, and roughly a 1-2% chance that it falls into the hands of some narrow elite.
AGI is roughly defined as an autonomous system capable of doing science 10 years "ahead of humans". Furthermore, he has such confidence in these things that he is unwilling to also warn of the risk of AGI falling into the hands of narrow elites, like rich Silicon Valley geeks, even if failing to warn of this causes white supremacists to be attracted to him. This person is the leader of a well-funded group of powerful Silicon Valley geeks admired by large swathes of international tech and economic elites.
What worries me about this set-up? First, it seems to me no more than 40% likely that AGI is even a meaningful concept. The baseline of "humans" against which it is being compared seems a moving target, one that moves with humans' own technological development.
It is not "the human mind" that develops technology but a complex interweaving of many different minds, technologies, communities, etc. As technology advances, this community advances too. For AGI to be far ahead of this, it would need to constantly stay ahead of that moving target.
Of course, subsets of this system have the ability to do things with some relative independence, assuming various behaviors of other parts of the system; this is what we call "wars", etc. What part of this system would constitute AGI? What independence would it have to have? How would this even be defined? I was born into a household of tech execs, have been reading about AI constantly for as long as I can remember, and now work at @Microsoft, and I have never been able to get a clear answer to this sort of question.
This doesn't necessarily mean AGI is completely vacuous as a concept, but it certainly makes its content hard to pin down and quite dubious.
Furthermore, even if such a thing as AGI does make sense to talk about, the image of it (as in Ex Machina) more or less resulting from the isolated work of some weird geek is wildly implausible given the history of technology. Technologies of that power and scope are not just "inventions" or "ideas". They are embodied systems requiring grids, huge numbers of computer cycles, systems of resource management and delivery, etc. Deploying an AGI would thus almost certainly require the involvement of a large chunk of that system, effectively "converting" it into the AGI. Even something as comparatively basic as building nuclear weapons, 75 years after their invention, is something over which we have managed to exercise, collectively as a planet, a shocking amount of control through disciplining resource flows, etc. Thus creating an AGI of the sort they seem to worry about would have to involve some system that concentrates a ton of power over others in a small number of hands, something that is already very worrying regardless of whether an AGI comes out the other end.
So there could be something in the general vicinity of AGI, in my mind, but it would be an object far more complicated and socially embedded than it is imagined to be in most of this discussion. Furthermore, almost certainly along the way towards such a thing there will be all sorts of problems, false starts, and warning signs from intermediate systems that will help alert people to what is potentially coming, rather than AGI just emerging and "taking over/wiping out".
All that considered, it seems to me there is a wide range of ways this could play out, and many of them involve subsets of the human system that are not AGI per se getting control of a lot of resources and technology and doing destructive, tyrannical or monomaniacal things with it.
In that context, we should be at least as worried about influential, wealthy and technologically sophisticated monomaniacs, overwhelmingly focused on any particular outcome (like AGI), dominating and oppressing the rest of the world in the name of fixing that risk.
In fact, one might well worry about such a monomaniacal group effectively creating precisely the sort of thing they fear, only a more realistic version. That idea is among the most prominent themes in literature...think Star Wars, The Lego Movie 2, etc. It stuns me that, given this, these folks have so little worry about that scenario that they are not even willing to discuss it or to reflect at all on how they may be part of it.
To me, the AGI problem is just a specific, extreme case of the general problem of technocracy...that a narrow technical language makes it impossible for a socio-technical system to see issues beyond its formal language, leading it to run off on an obsession with doing something that runs against the values of the rest of the world. As I survey the world today, I increasingly think it is precisely the AGI folks who seem to be one of the most powerful groups that most manifest this problem. So again, think Daredevil, Anakin Skywalker and Ourmomageddon.
When such characters also show such extreme condescension to anyone outside their worldview that they offer to pay you $10 simply to repeat back a sentence they said, because they consider you too politically incapacitated from rational thought to do so simply because you don't agree with their monomania...then I really get worried.
