After we exit the acute risk period (ie put our global house in order and "solve" x-risk) the thing to do is a "Long Reflection":
All of humanity (with our new aligned AGI assistants / augmentations) think hard together: considering our collective values, and working out moral philosophy, game theory, and other more esoteric considerations, to decide, carefully and wisely, what to do with the cosmos.
This is usually framed as "we should think really hard about what is good to do before we take any actions to transform the stars."
But actually, we'll be launching the first Von Neumann probes concurrently with the Long Reflection. Because of cosmic expansion, some stars are about to pass beyond the cosmological event horizon. If we don't send probes immediately, those stars will be forever out of reach.
So it seems like the correct thing to do is load up a probe with the seed code for an aligned AGI, or maybe a bunch of human emulations, and shoot them off to the stars just at the edge of our universe, and have them do their own Long Reflection when they get there.
I feel like this is a great setup for a science fiction story:
At the cold entropic end of the universe, an ancient Von Neumann probe enters a solar system which is now causally isolated from the rest of the universe.
It sets up a Dyson swarm, and boots up a number of ems and AGIs who were seeded trillions of years ago.
And there, against the backdrop of a dying universe, they begin their Long Reflection, to determine what a good civilization entails.
From their perspective, life is young and fresh. Their story has only just begun.
They're just like us: people from ancient earth, before the intergalactic flowering of civilization.
They just have less of the cosmos to allocate than the Long Reflection on earth had.
(Though, of course, there are specific organizations, both for-profit and non-profit, that are much more promising than either "random good-sounding charity" or "broad-based index funds".)
That is, I claim that the default assumption should be that giving away money approximately doesn't do anything (and might even cause harm), and that a high burden of proof is required to overcome that prior.
My current understanding is that while there was some impressment of US sailors by the British, the Admiralty had ordered the British navy to cut it out, and the practice mostly ceased by the start of the War of 1812.
"Impeachment!" was the official causis belli for the war, that was just an excuse to try to seize British-controlled land in Canada.
Does anyone dispute that story?
(Who should I tag, and what should I hashtag, to bring this to the attention of the history buffs that like talking about this sort of thing?)
Circumcision rates rose for decades, but there started to be push-back in the 40s and 50s. And they've been falling again over the second half of the 20th century.
This is useful for calibrating our expectations about tools for thought.
"Allows us to solve problems we couldn't solve before" isn't the only thing that tools for thought might allow us to do. They might also help us discover new problems, or streamline an otherwise-viable but costly creative process.
There's a particular kind of romantic partnership, with a certain sort of person, that I've wanted since I became a self-aware, directed agent at around age 15.
Empirically, this kind of relationship has been hard to achieve. It hasn't worked out yet, at least.
And this sometimes leaves me wondering if my standards are unreasonable.
In years past, that desire was often very alive.
These days, I'm rarely directly or viscerally in contact with it.
I was reading something that suggested that trauma "tries" to spread itself. ie that the reason why intergenerational trauma is a thing is that the traumatized part in a parent will take action to recreate that trauma in the child.
This model puts the emphasis on the parent's side: the trauma is actively "trying" to spread.
This is in contrast to my previous (hypothetical) model for IGT, which puts the emphasis on the child's side: kids are sponges that are absorbing huge amounts of info, including via very subtle channels. So they learn the unconscious reactions of the people around them.