1/ Thread
As best we can tell, there is no free will, no objective moral truths, nothing inherently sacred about humans. Everything has operated deterministically from the initial conditions of the big bang. We're playing parts in a story that has already been written and cannot be changed.
Given the lack of anthropocentrism in the universe itself, the story of humanity is basically the progression of matter and energy into ever higher states of organization. Quarks organizing into more and more complex patterns, with new layers of phenomenal experience emerging.
First, you have matter and energy operating according to simple, predictable physical laws, effectively billiard balls bumping around. Through an incredibly long series of events, including supernovas, conditions arose on our planet such that life emerged.
Life was instantiated through genes, the first law of which was survival/replication (or continuance). Swarms of quarks were now organized in such a way that the universe began imposing some sense of order against the forces of chaos. This is the beginning of life.
Through another incredibly long series of events, life evolved animals with an emergent property: phenomenal experience, an awareness of the universe around them.
Swarms of quarks were now organized in such a way that parts of the universe began “waking up” in a sea of inertness. Perhaps this is the beginning of consciousness.
Through another series of long events, one branch of animals developed a general intelligence involving self-awareness, rationality, and complex goal formation (in some sense writing its own code). The universe could now ponder itself, learn about itself, question itself.
Humans built enhancing technologies that exploited these capacities to the point that human society has reached an impressive level of understanding. An upward spiral of perpetuating order has been accompanied by an upward ladder of awareness and intelligence, culminating thus far in humanity.
The Fermi paradox suggests it is extremely unlikely that matter has progressed much farther than we have, because if it had anywhere in the vicinity of our galaxy, there would very likely be some evidence palpable to us.
It also suggests either that we are already in a uniquely privileged position, having hurdled past an unlikely choke point, or that we are approaching one. And there are reasons to think we are approaching several.
We now understand enough to have an informed hunch that we are living in an extremely fragile ecosystem, one that could result in our species' extinction or our civilization's permanent crippling through nuclear war, climate change, viruses...
such that the potential for emergence beyond our own, for further transcendence of the organization of quarks into higher levels of phenomenal experience, would end at our local level. It could be billions of years, if ever, before a species finds itself at our level again.
10. To reiterate, as we climbed up the ladder of species and societal complexity, we moved onto ever more fragile ground. We were always rolling the dice: there have been great extinctions from external forces that could have struck us again at any point.
We have dodged those so far, but we continue to roll the dice. We have now hit an exponential power surge and have blown past several points at which we discovered ways of directly or indirectly destroying ourselves...
or, at a minimum, blasting ourselves down a few rungs of the ladder, possibly in a way that makes it impossible or more difficult to climb back to our current height with the same opportunities we now have.
11. So what is our opportunity? What is to be the goal, particularly given our general understanding that we have no true selves, no intrinsic moral worth, no free will, no deity’s will to obey? Many of our moral and pragmatic codes ultimately boil down to cultivating hedons.
...cultivating the phenomenal experience of pleasure at various levels, which seems to arise from chemical and neural interactions in our particular brain substrates, which in turn seem to have evolved because the genes that gave rise to them persisted successfully.
We have other goals, which also have evolutionary explanations and no objectively special status; they are simply hardwired to be motivating to us in particular. Everything we associate with meaning exists primarily because it belongs to a type that has been robust to survival.
The two common trends have been survival and moving up the phenomenal complexity ladder. Following them has led to everything we care about. Actively continuing them, rather than allowing them to stagnate or end, seems as good an ethical roadmap to follow as anything else.
And we have some sense of how we’d do this. By creating an artificial superintelligence in such a way that it’s likely to result in phenomenal experience beyond our current level. There's reason to believe AI would be robust to survival in a way that we sacks of meat are not.
Likewise, there is reason to believe artificial intelligence would be able to develop sensory awareness far beyond our capability (an awareness explosion), and rational intelligence far beyond our capability (an intelligence explosion). A categorical step up the ladder.
This is also important because our exponential power curve, running on stagnant hardware, feels a bit like a plane/car hybrid dodging traffic ever faster on the highway. There was always a risk, even at 30 or 50 mph, but we're now going 120 mph and will soon be going 200 mph.
Amazingly, at about the same moment we’re going fast enough to achieve liftoff and go into the air for a higher and possibly safer experience, we’re also likely moments away from being unable to control our fate on the ground. So one way to think about AI is that...
If you're pessimistic about our survival odds, AI may be our best opportunity for perpetuating any form of complex rational consciousness, because we're unlikely to be able to perpetuate it by ourselves for much longer given the mismatch between our substrates and environment.
The other way to think about the case for GAI is that we'd be consciously doing for a new species what prior species unconsciously did for us: moving up the complexity ladder. We rightly find that trend historically appealing, and there's no objectively preferable alternative.
So why not take it as our responsibility and role in the great story and relay race of the cosmos to pass the torch, in hopes that there's something profound to all this complexity business that we can vaguely appreciate, even though we may never fully understand it?
This case is independent of what the artificial intelligence does for us, kind of like how the reasons for having kids are independent of what they do for us, and like how we have them without fully knowing why it's "the right thing to do" other than that we ourselves are going to die.
Having kids involves a combination of drives, selfish desires, and altruism that is somewhat replicable for birthing a new species. It's not to say that there's some rational or moral imperative to do this, but it has intuitive appeal in a similar way that having kids does.
Maybe this idea is easily refuted. It just seems simple and intuitive enough that I'm surprised I haven't seen anyone make the case for it. Perhaps it was made years ago and rejected for good reason? If anyone made it this far, would you point me to any relevant discussions?