If you aspire to do or embody something rare, and you're really hardcore about it, it's hard not to fall into a sense that most lives aren't worth living
There are at least two things here, maybe
One is that if you think about lives in a way that's like 'a life is good to the degree that it's a thread that makes the tapestry of human lives more good,' then you can reconcile a lot of contradictions in that neighborhood
Another is that maybe hardcore existential ambitions are an instance of the double consciousness @add_hawk associates with competitive games, where desire to win both serves and undermines love for the game as a collective project
A dialectics of Caesar-or-nothing and agape love
• • •
I love how characters in Frankenstein (1818) blow each other's minds with 'Western science is historically a spinoff of alchemy and Cabala' takes, just like we do today
A totally correct genre of takes that somehow kept its galaxy-brain oomph for centuries
I'd actually love to know if Shelley adapted that stuff from a known source or was an early contributor to that discourse or what
I think that people like Scott Alexander come to scientific racism through faith that the rise of the nerds untwists a twisted world. Faced with the observation that the rise of the nerds isn't lessening racial disparities, they conclude that racial disparities aren't twisted
If there were data that Black people do much better in tech startups than in trad elite careers, people like Alexander would instead be all 'nerd culture overcomes the oppressive racial hierarchies of the normies that kept Black people down for hundreds of years'
Slate Star Codex is a single-issue ideology, and that issue is 'nerds good'
Some literary theory and AI person who's not fixated on autoencoders like I am should really write about how transformers (BERT, GPT and so on) never 'extract' meaning from a text but rather 'charge' the text's words with relational significance and then 'reread' the text
A little bit of Stein, a little bit of Derrida, a little bit of 'hermeneutic circle' -- could be a good time
I think there's legit interesting stuff here about the persistence of a text as text in our cognition of it, if transformer-like architectures really do prove indispensable in the long term
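A minimal numpy sketch of that claim (illustrative names, not BERT's or GPT's actual code): a self-attention layer takes one vector per token and returns one vector per token, each recomputed from its relations to all the others -- so the output is still the text as a sequence, not an extracted summary.

```python
# Minimal single-head self-attention sketch. The point: output shape equals
# input shape -- one vector per token -- so the text persists as a sequence
# that later layers "reread", rather than collapsing into one meaning vector.
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    """Single-head self-attention over token embeddings X of shape (n_tokens, d)."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[-1])  # pairwise token-token relations
    weights = softmax(scores, axis=-1)       # how much each token attends to the others
    return weights @ V                       # each token re-expressed via its relations

rng = np.random.default_rng(0)
d = 8
X = rng.normal(size=(5, d))                  # 5 token embeddings
Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))
out = self_attention(X, Wq, Wk, Wv)
print(X.shape, "->", out.shape)              # (5, 8) -> (5, 8): still one vector per token
```

Stacking such layers just repeats the rereading; nothing resembling a single extracted 'meaning' appears unless you deliberately pool the sequence afterward.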
The thing people call emotional labor that isn't actually emotional labor -- the labor of managing others' emotions -- is a good concept, and it does have a name: affective labor
If my boss has me put up Christmas decorations in the office to improve morale, that's affective labor but not (uniquely) emotional labor
If working alone in the office at night makes me want to start a fire but I grit my teeth through it, that's emotional labor but not (uniquely) affective labor