I was talking with someone about why most EAs are not a good romantic match for me, and I said "well, for one thing, most of them are not ambitious enough."
I had to clarify that when I said "ambitious", I didn't mean "aiming to get a prestigious, high-paying, conventional job."
It was disheartening to realize that what most people mean by the word "ambitious" is something I consider somewhere between boring and pathetic, because I had previously been thinking that it was a keyword that filters for part of what I care about.
The thing I care about is something more like "in whatever you're trying to do, refusing to be satisfied with the level of success that is typical, or that others expect of you."
E.g. aiming to actually have a correct world model, despite how "everyone's biased".
Striving to make something truly great, worthy of admission to the canon of your craft: a truly great novel, truly great software, a truly great company culture. Work that goes above and beyond.
Or after every success, asking how you could save an order of magnitude more lives / do an order of magnitude more good / create an order of magnitude more value, instead of getting comfortable doing the same thing over and over again.
Or working tirelessly to bring to fruition some vision that you can see, despite the fact that when you talk to most people about it, they tell you that that reference class has already been explored.
(h/t @Conaw, as an example now legible to most people.)
Or striving to literally, not as a buzzword, but for real, change the world.
Like, human history flows in a different direction than the counterfactual, because you happened to be born.
I want to add some clarification here, because the way I phrased this makes it sound like I'm insisting on ambition in a partner, which is not quite right.
The main thing is that a casual outsider might think "there must be lots of EA women who are like, and who might want to date, Eli."
And there's a mistake here, which is reading most EAs as doing pretty much the same thing as me.
Which is not how I conceptualize it at all.
There is a foundational thing that I share with EA culture, which is something like a desire to help + basic quantitative reasoning + the idea that we should check that our "helping" actually helps.
If you’re calling it “overthinking”, then you’re doing it wrong.
I have some annoyance at people who assume that thinking A LOT, about something simple, means that you’re overthinking it.
It seems to me that this could equally mean that THEY'RE bad at thinking, and so can't imagine how doing more of it would help.
OK. So this tweet was coming from a place of annoyance, but phrasing it like that, I feel compassion for people who don't know how to think well enough for it to be a useful thing to do.
The Powers That Be actually DO regularly lie to us “for our own good”. I’m very sympathetic to not trusting them, because I don’t, and I think one mostly shouldn’t.
The vaccines appear to be genuinely super great (+1 humanity!).
But unless you can read the stats (which apparently most people can’t), all you have to go on is whether or not you trust the Powers That Be and what your friends are doing.
People in my circles sometimes talk about "civilizational collapse" or "civilizational decay" or "decay of societal fabric".
It sure seems like there's a real thing here, but when people use those words, I usually don't know what they mean.
As a starting point for discussion, what are some concrete indicators of more or less severe decay?
Some that occur to me [in no particular order]:
- No “sophisticated” [operationalize] international supply chains.
- The US government/society can’t respond effectively to COVID.
- The US government can’t keep law and order. To the extent that people are safe from crime, it’s because they pay tribute to gangs.
- Some largish percentage of the population considers the US government to be illegitimate.