EAs are my favorite type of people.

Like, if I list my top 5 people in the world, all of them have a very strong EA streak (stronger than that of most EAs).

But I'm also flipping through attendees of EAG reconnect, and feeling boredom / despair.
I'm starting to put my finger on the signs that make me feel pessimistic about talking with someone.

I'm still feeling this out, and none of these are perfect indicators, but...
One bad sign is if a person seems interested in talking about the EA movement, instead of the problems to be solved.

This suggests to me that their intuitive/natural impact model is something like "coalition building", or if I'm being uncharitable, a "pyramid scheme."
Like, it seems to me that the majority of people in EA don't really have a grasp on how to, personally, make progress on the hard problems that the EA community has identified.

But, they're covering this up from themselves by trying to get more people on board.
Movement building and outreach feels like a useful thing to do. But if you don't have a machine that can actually solve the hard problems, adding more people to the mix doesn't actually help.
It feels like a pyramid scheme to me because the impact-story implicitly bottoms out at people doing things that make progress on the problems. But for most EAs, their mental map of the world doesn't actually have any detail in that step.
Which is to say, they're just guessing, or trusting, that it DOES bottom out.

But I don't think that's well founded. I think that...
1) it isn't usually the case that adding more people helps you solve hard, barely-specified problems and

2) it is really easy for people to do useless work, and if you can't identify good work yourself, you can't know whether you're funneling people-hours into a useless machine.
I think this difference between me and my projection of a stereotypical EA probably comes down to a crux of how doomed we are and how tractable things seem.
Like, I imagine people having an attitude of "EA is great. [It is so much better than other kinds of non-profit work]. Obviously we should spread this good thing far and wide."
Whereas, I'm like "The world is not on track to survival, and EA is not on track to solving that. The default trajectory for EA has some people doing some plausibly good things, most people doing things that don't matter at all, and then everyone dying shortly afterwards."
Another related crux is how good donations to effective charities are. I think that most EAs think that donations to effective charities are great, and we should get more people excited about giving to them.
In contrast, I am pretty uncertain how good our best near-term interventions are, but I broadly agree with @TheZvi that most of the good done in the world is not done by non-profits or by donations.
"The most impactful and successful charity in the world is Amazon.com."

thezvi.wordpress.com/2017/08/20/alt…
Therefore, if you really want to do the most good (near term), you have to step outside the frame of non-profits, and just ask where "good" comes from, not how to do good via charity.

And thinking about the long term: it currently seems pretty hard to turn money into x-risk reduction.
(Another possible crux is just how sensible it is to trust the leadership of the EA movement about what things are good to do.

In brief: I think it is decently sensible, if you gotta trust someone, but really what is most needed is people who can contribute to the figuring out.)
Relatedly, I was sort of amused at myself, when I noticed, looking at EAG reconnect profiles, that I felt exasperated because (to quote my verbal loop) "these people all have jobs."

Which, apparently, implies that they "don't get it."
(Notable, because usually being unemployed is a bad sign.)
[Also, I am arguably unemployed right now.]
Like, obviously, it often makes sense to have a job in this world, because it gives you money to live and a "face" for engaging with the world. And it often makes sense to do one's work in a job. Some work can ONLY be done in a job in an institution.
But I imagine that these people are asking themselves a question like "what job should I get to have impact in my career?" instead of stepping back and asking "What is happening in the world and what needs to be done?"
This is related to what @paulg says in "How to Make Wealth".

"Having a job" is an artificial level of abstraction. The natural level of abstraction, the one that matters is "creating value."

paulgraham.com/wealth.html
Or another tack is the advice that I give aspiring x-risk mitigators about grad school:
"I'm not saying that you shouldn't do grad school. Maybe you do actually need the credentials.

But don't confuse credentialing with your _real_ Job: attempting to solve the core problem / figuring out how to become an excellent scientist."
Sometimes you can use grad school to do that. If you can get actual mentorship from an excellent scientist, that sounds amazing.

But, usually, you have to work to make that happen.
Getting a PhD does not, by default, make you an excellent scientist, or institutional science would look pretty different.
[Full disclosure: I dropped out of undergrad, and you might want to take that into account when I give you advice about grad school.]
In both these examples, the point is that you need to zero in on the _work_ that is actually being done, not the social categorization of the people that typically (at least according to the prevailing narrative) do that work.
(Because, [spoiler alert], the actual work often looks very different from the societal stereotype of the work, and the people actually doing the work often don't look anything like the people who the prevailing narrative says do the work.)

I've often said that I feel like the main problem with EA is that EAs think that their Job is to get a job at an EA org. But actually, their Job should be to figure out what the frick is happening in the world.
