Yes, I *am* that ESR. Well, that's the question people usually ask.
Programmer, wandering philosopher, accidental anthropologist, troublemaker for liberty.
Feb 26 • 4 tweets • 4 min read
Time for me to put on my explainer hat again. This time it's "post-rationalist".
But in order to do that I'm going to have to explain some history and at least three other factions.
My warrant for doing this rests on three facts. One: I am a very experienced anthropologist of internet cultures. Two: I am personally acquainted with a number of the leading figures in the cluster of subcultures I'm about to describe. Three: my past writing had some influence on how these subcultures developed - not a lot, but enough that people inside these subcultures notice the connection.
To start with, the rationalists. This is a network of people who collected around the blog "Less Wrong", and the writings of Eliezer Yudkowsky (@ESYudkowsky) beginning around 2005. Another very prominent figure in this group is Scott Alexander.
Modern rationalists are aware that the term "rationalist" has other meanings in historical and academic philosophy that cause confusion. In particular, in the language of academic philosophy, modern rationalists could more accurately be described as "skeptical empiricists" or "predictivists". This is regarded as unfortunate, but we're stuck with the term.
The rationalists have three sets of concerns. One can be described as rationality technique. This is a set of heuristics and language conventions intended to help people think more clearly and make better arguments - better in the sense of converging on truth and correctness faster.
Some rationalist language has percolated out into the wider culture: "virtue signaling" and "steelmanning" are two notable examples. Also, "updating" and "priors" as terms about belief maintenance.
Two other movements are closely connected with modern rationalism and sometimes indistinguishable from it; they correspond to the second and third sets of concerns. The second is "effective altruism", a network of people who are trying to do charitable giving in the most effective way possible by paying careful and quantitative attention to outcomes.
The third set of concerns is around the destructive potential of AI. Many, though not all, people who call themselves "rationalists" deeply fear that humanity might soon be wiped out by rogue AI and are concerned with attempting to develop measures to constrain it to behave in a recognizably moral way: the shorthand for this is "the alignment problem".
I don't know of a good neutral term for these people - it could be "safetyists" but as far as I'm aware nobody actually uses that. People who disagree with them often call them "doomers".
Now for the unpleasant part. Rationalism has proven very effective at attracting bright, alienated kids - especially bright autists with poor social skills. When they congregate, most notably in the Bay Area tech scene, they sometimes start exhibiting cult-like behavior - social isolation, sexual deviance, fanaticism.
Some of the movement's leaders have at least tacitly encouraged this. Others have failed to discourage it as much as they might have. At an extreme, it has led to manifestations like the Zizians - quite literally a transsexual murder cult. Less extremely, there has been a rash of suicides, insanity, and drug deaths associated with rationalist group houses.
There's a hostile reading of the history of the rationalist subculture that considers both rationality technique and effective altruism to have been deliberately constructed as recruitment funnels for the doomer cult.
Now I can describe post-rationalists. These are people who have consciously bailed out of the cultish and doomer aspects of the rationalist subculture while keeping a firm hold of rationality technique.
Often post-rats have friends inside the cult-like parts of rationalist subculture; the boundary between these social networks is somewhat fuzzy.
1/2
Related terms:
Grey tribe: Taking a term from one of Scott Alexander's more influential essays, post-rationalists sometimes describe themselves as "Grey Tribe", though that term has political connotations that "rationalist" does not. It's meant to index people who do not consider themselves part of either the blue or red political tribes.
TPOT or tpot: "that part of Twitter" is a group of X posters that overlaps with post-rats a lot, but has more of a techie and tinkerer edge.
2/2
Mar 30, 2024 • 6 tweets • 1 min read
1/ Since the level of public consternation about the xz back door seems to be increasing rather than decreasing, I want to point out something simple and important:
2/ Open source worked the way it's supposed to. Some hacker noticed something that made him curious, poked at it because hackers are like that, and because the code was open and available for inspection, diagnosed the problem before any serious harm was done.