Connor Leahy
Hacker - CEO @ConjectureAI - Ex-Head of @AiEleuther - I don't know how to save the world, but dammit I'm gonna try
Dec 2, 2024 25 tweets 8 min read
Conjecture's Roadmap: Cognitive Software and a Humanist Future of AI

Good Cognitive Software should transform the future for the better, but instead all we are getting is AI slop and extinction risk.

What are we doing wrong with AI, and how can we do better?

@Gabe_cc and I have written up our roadmap for @ConjectureAI to tackle this huge issue.

conjecture.dev/research/conje…
Oct 31, 2024 17 tweets 6 min read
RELEASE: THE COMPENDIUM

Several reckless groups are racing towards AGI, and risking the extinction of humanity.

What is AGI? Who are these people? Why are they doing this? And what can we do about it?

We answer these questions in The Compendium.

1/17

Read it here: thecompendium.ai

2/17
Mar 2, 2024 6 tweets 4 min read
The gods only have power because they trick people like this into doing their bidding.

It's so much easier to just submit instead of mastering divinity engineering and applying it yourself.

It's so scary to admit that we do have agency, if we take it.

In other words: "cope"

It took me a long time to understand what people like Nietzsche were yapping on about: people practically begging to have their agency taken away from them.

It always struck me as authoritarian cope, justification for wannabe dictators to feel like they're doing a favor to the people they oppress (and yes, I do think there is a serious amount of that in many philosophers of this ilk).

But there is also another, deeper, weirder, more psychoanalytic phenomenon at play. For a long time I did not understand what it was, how it worked, or why it existed, but over the last couple of years of watching my fellow smart, goodhearted tech-nerds fall into these deranged submission/cuckold traps, I've really started to understand.

e/acc is the most cartoonish example of this: an ideology that appropriates the faux, surface-level aesthetics of power while fundamentally preaching submission to a higher force, a stronger man (or something even more psychoanalytically flavored, if one were to ask ol' Sigmund), rather than actually striving to acquire and wield power. And it is fully, hilariously, embarrassingly irreflexive about this.

San Francisco is a very strange place, with a very strange culture. If I had to characterize it in one way, it is a culture of extremes, where everything on the surface looks like the opposite of what it is (or maybe its "inversion"). It's California's California, and California is the USA's USA. The most powerful distillation of a certain strain of memetic outgrowth.

And on the surface, it is libertarian, Nietzschean even, a heroic founding mythos of lone iconoclasts striking out against all to find and wield legendary power. But if we take the psychoanalytic perspective, anyone (or anything) that insists too hard on being one thing is likely deep down the opposite of that, and knows it.

There is a strange undercurrent to SF that I have not seen people put good words to: it in fact hyperoptimizes for conformity and selling your soul, debasing and sacrificing everything that makes you human in pursuit of some god or higher power, whether spiritual, corporate, or technological.

SF is where you go if you want to sell every last scrap of your mind, body and soul. You will be compensated, of course; the devil always pays his dues.

The innovative trick the devil has learned is that people tend to not like eternal, legible torment, so it is much better if you sell them an anxiety free, docile life. Free love, free sex, free drugs, freedom! You want freedom, don't you? The freedom to not have to worry about what all the big boys are doing, don't you worry your pretty little head about any of that...

I recall a story of how a group of AI researchers at a leading org (consider this rumor completely fictional and illustrative, but if you wanted to find its source it's not that hard to find in Berkeley) became extremely depressed about AGI and alignment, thinking that they were doomed if their company kept building AGI like this.

So what did they do? Quit? Organize a protest? Petition the government?

They drove out, deep into the desert, and did a shit ton of acid...and when they came back, they all just didn't feel quite so stressed out about this whole AGI doom thing anymore, and there was no need for them to have a stressful confrontation with their big, scary CEO.

The SF bargain. Freedom, freedom at last...
Sep 4, 2020 14 tweets 5 min read
arxiv.org/abs/2009.01719
This paper (by @FelixHill84 et al.) is really an "It's all coming together" moment for @DeepMind, I feel.

Let me try to describe my takeaways from my first readthrough.

1/14

The paper tests several variants of a 3D environment that contains a number (usually three) of objects. When the agent looks at an object, it also receives the object's name as natural-language input.

2/14
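To make the setup concrete, here is a minimal sketch of what such an observation interface might look like. This is my own illustration, not DeepMind's code; all names (Observation, make_observation, the frame shape) are hypothetical assumptions.

```python
# Hypothetical sketch of the setup described above: the agent receives
# pixels plus the name of whatever object it is currently looking at.
# Names and shapes here are illustrative, not from the paper's actual API.
from dataclasses import dataclass
from typing import Optional

import numpy as np


@dataclass
class Observation:
    pixels: np.ndarray          # first-person RGB frame, e.g. shape (96, 96, 3)
    object_name: Optional[str]  # natural-language label of the object in view,
                                # or None if the agent isn't looking at any object


def make_observation(frame: np.ndarray, looked_at: Optional[str]) -> Observation:
    """Bundle the visual frame with the name of the object in view (if any)."""
    return Observation(pixels=frame, object_name=looked_at)


# Example: the agent gazes at one of the (usually three) objects in the room.
obs = make_observation(np.zeros((96, 96, 3), dtype=np.uint8), "robot")
print(obs.object_name)  # -> "robot"
```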