Benjamin Todd
Apr 23 · 2 tweets · 1 min read
Breaking: Nobel laureates, law professors and former OpenAI employees release a letter to CA & DE Attorneys General saying OpenAI's for-profit conversion is illegal and betrays its charter.

The letter details how the founders of OpenAI chose nonprofit control to ensure AGI would serve humanity, not shareholders.

Now, even as Altman says "AGI will probably get developed during this president’s term", control (and uncapped upside) is to be handed to investors, scrapping safeguards Altman told Congress were necessary in 2023.

The nonprofit would surrender its most powerful mission-achieving tool—control of the leading AGI lab—in exchange for an equity stake it already holds.

"Imagine a nonprofit with the mission of ensuring nuclear technology is developed safely and for the benefit of humanity selling its control over the Manhattan Project in 1943 to a for-profit entity so the nonprofit could pursue other charitable initiatives."Image


More from @ben_j_todd

Apr 4
AGI by 2027?

I spent weeks writing this new in-depth primer on the best arguments for and against.

Starting with the case for...🧵
1. Company leaders think AGI is 2-5 years away.

They’re probably too optimistic, but shouldn't be totally ignored – they have the most visibility into the next generation of models.
2. The four recent drivers of progress don't run into bottlenecks until at least 2028.

And with investment in compute and algorithms continuing to increase, new drivers are likely to be discovered.
Feb 3
1/ Most AI risk discussion focuses on sudden takeover by super capable systems.

But when I imagine the future, I see a gradual erosion of human influence in an economy of trillions of AIs.

So I'm glad to see a new paper about those risks🧵
2/ We could soon be in a world with millions of AI agents, growing 10x per year. After 10 years, there could be 1,000 AIs per person, each thinking 100x faster (rough arithmetic sketched below).

In that world, competitive pressure means firms are run more & more by AI.
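For a sense of how fast that compounding goes, here is a minimal back-of-envelope sketch in Python. The starting agent populations, the 10x growth rate and the ~8 billion world population are illustrative assumptions, not figures from the thread or the paper it discusses.

```python
import math

# Rough compounding arithmetic behind "1,000 AIs per person within ~10 years".
# All inputs are illustrative assumptions, not figures from the thread.

WORLD_POPULATION = 8e9      # roughly 8 billion people
GROWTH_PER_YEAR = 10        # assumed 10x growth in the AI-agent population per year
TARGET_PER_PERSON = 1_000   # the per-person figure quoted above

def years_to_target(starting_agents: float) -> float:
    """Years of 10x/year growth needed to reach TARGET_PER_PERSON agents per person."""
    target_total = TARGET_PER_PERSON * WORLD_POPULATION
    return math.log(target_total / starting_agents, GROWTH_PER_YEAR)

if __name__ == "__main__":
    for start in (1e5, 1e6, 1e7):  # illustrative starting populations of agents
        print(f"from {start:,.0f} agents: ~{years_to_target(start):.1f} years "
              f"to reach {TARGET_PER_PERSON:,} AIs per person")
```

Under these assumptions, even starting from a few hundred thousand agents, 10x annual growth crosses 1,000 AIs per person in well under a decade.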
3/ A military without AI defences could be disabled almost immediately by cyberattacks and the like.

So humans are gradually taken out of the loop on more and more decisions.

What happens?
Jan 21
People are saying you shouldn't use ChatGPT due to statistics like:

* A ChatGPT search emits 10x as much as a Google search
* ChatGPT uses 200 Olympic swimming pools' worth of water per day
* Training an AI model emits as much as 200 plane flights from NY to SF

These are bad reasons not to use GPT...🧵
1/ First, we need to compare ChatGPT to other online activities.

It turns out its energy & water consumption is tiny compared to things like streaming video.

Rather than quit GPT, you should quit Netflix & Zoom.
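To make that comparison concrete, here is a minimal back-of-envelope sketch. The per-activity energy figures are rough, commonly cited estimates I'm assuming for illustration; they are not numbers from this thread, and real values vary a lot by model, device and network.

```python
# Back-of-envelope energy comparison for common online activities.
# All per-activity figures are rough, commonly cited estimates used here as
# assumptions for illustration, not measurements from this thread:
#   ~0.3 Wh per Google search, ~3 Wh per ChatGPT query,
#   ~80 Wh per hour of streamed video (device + network + data centre).

GOOGLE_SEARCH_WH = 0.3
CHATGPT_QUERY_WH = 3.0
STREAMING_WH_PER_HOUR = 80.0

def queries_per_streaming_hour() -> float:
    """ChatGPT queries that use roughly the same energy as one hour of streaming."""
    return STREAMING_WH_PER_HOUR / CHATGPT_QUERY_WH

if __name__ == "__main__":
    print(f"ChatGPT vs Google: ~{CHATGPT_QUERY_WH / GOOGLE_SEARCH_WH:.0f}x energy per query")
    print(f"1 hour of streaming ≈ {queries_per_streaming_hour():.0f} ChatGPT queries")
    # Even heavy use (say 100 queries a day) is only ~0.3 kWh per day,
    # a small fraction of typical household electricity use.
    print(f"100 queries/day ≈ {100 * CHATGPT_QUERY_WH / 1000:.1f} kWh/day")
```

On these assumptions, the "10x a Google search" line is true but misleading: an hour of video streaming uses as much energy as a couple of dozen ChatGPT queries.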
2/ Second, our online activities use a relatively tiny amount of energy – the virtual world is far more energy efficient than the real one.

If you want to cut your individual emissions, focusing on flights, insulation, electric cars, buying fewer things etc. will achieve 100x more.
Dec 22, 2024
The AI safety community has grown rapidly since the ChatGPT wake-up, but available funding doesn’t seem to have kept pace.

What's more, there’s a more recent dynamic that’s created even better funding opportunities, which I witnessed in a recent grantmaking round...
1/ Most philanthropic AI safety funding (over 50%, as opposed to government or industry funding) comes from one source: Good Ventures.

But they’ve recently stopped funding several categories of work:

a. Republican think tanks
b. Post-alignment work like digital sentience
c. The rationality community
d. High school outreach
2/ They're also not fully funding:

e. Technical safety non-profits
f. Many non-US think tanks
g. Political campaigns (foundations can't donate to these)
h. Nuclear security
i. Other organisations they've decided are below their funding bar
Dec 22, 2024
How can you personally prepare for AGI?

Well maybe we all die. Then all you can do is try to enjoy your remaining years.

But let’s suppose we don’t. How can you maximise your chances of surviving and flourishing in whatever happens after?

The best ideas I've heard so far: 🧵
1/ Seek out people who have some clue what's going on.

Imagine we're about to enter a period like COVID – life is upended, and every week there are confusing new developments. Except it lasts a decade. And things never return to normal.

In COVID, it was really helpful to follow people who were ahead of the curve and could reason under uncertainty. Find the same but for AI.
2/ Save as much money as you can.

AGI probably causes wages to increase initially, but eventually they collapse. Once AI models can deploy energy and other capital more efficiently to do useful things, there’s no reason to employ most humans any more.

You'll then need to live off whatever you've saved for the rest of your life.

The good news is you have one last chance to make bank in the upcoming boom.
Dec 1, 2024
Just returned to China after 8 years away (after visiting a lot from 2008 to 2016). Here are some changes I saw in tier 1/2 cities 🇨🇳
1/ Much more politeness: people actually queue, there's less spitting, and I was only barged once or twice.

But Beijing still has doorless public bathrooms without soap.
2/ Many street vendors have been cleared out. Of the 30 clubs that used to exist in a tower block in Chengdu, only 1 survives. It's more similar to other rich countries.
