Venkatesh Rao ☀️ · Jul 31, 2021
I've been noodling on an idea for a while that I've been reluctant to do a thread on for... reasons that will become obvious, but let's yolo it. I call the idea "charismatic epistemologies." Aka... how successful people explain the world, and how those explanations fail.
I've been reluctant to do this thread because it runs the risk of specific successful people I know thinking I'm subtweeting them, which is ironic, because a big feature of charismatic epistemology is believing things are about you when they are not.
My n for this theory is probably several dozen. I've been around people who are far more talented and successful than me for like 30 years now, and have sort of figured out how to free-ride in their slipstreams. Sometimes parasitically, sometimes symbiotically.
For a while, I was confused about this phenomenon because I was thinking of it as a special case of the fundamental attribution error with a few minor additions. No, it's a distinct and richer/larger phenomenon. en.wikipedia.org/wiki/Fundament…
The FAE is the tendency to explain outcomes by dispositional factors rather than situational ones, with a self-serving skew. As in: "you just got lucky, I won by genius... you asked for your misfortunes, I got unlucky."

But the FAE applies to all of us, not just outsize successes.
Charismatic epistemology is what you get when the fundamental attribution error meets an outlier success with a non-trivial evidentiary basis for their own uniqueness/outlier-ness.
A familiar but peripheral example is simple, lazy self-mythologizing, as in a poorly written self-congratulatory autobiography. This is peripheral because it is something of a conscious construction; the person knows they're doing it, at least dimly.
Charismatic epistemology isn't about the story you tell about your *own* success story. It's how you explain the *rest* of the world by extrapolating from your own experiences. This is an egoist bias rather than an egoTist bias. differencebetween.net/miscellaneous/…
EgoTists are boring. Trump is an example. EgoTists are not actually interested in the rest of the world, only in themselves. The charismatic manifestation of egoTism in a successful outlier person is narcissism. I'm not talking about narcissism in this thread.
Charismatic epistemology is something you get even with self-effacing, non-egoTistic egoists. And here I mean egoists in a broader sense than usual. People who believe they're special in some way, not necessarily "superior," but may or may not want to perform it egoTistically.
The sources of charismatic epistemology may or may not even be fodder for egoTism. Any innate trait or life experience will serve -- being much taller or shorter than others, race, gender, early childhood trauma or privilege, extreme poverty or wealth, living through wars...
2 special cases of interest:

One is people who've had extreme objective validation of their sense of specialness (scoring genius IQs when young, winning olympic gold at 16, becoming a big movie star at 21)
The other is people who've had experiences of specialness that are extremely NOT objectively validatable but they know from the inside were huge. Enlightenment type altered mind states, periods of deep suicidal behavior, life-altering but not very visible illnesses.
So that's the eligible set. Note that this specialness is not the "everybody has some special shit if you look hard enough" kind. These are actual outlier attributes relative to closer-to-the-mean humans. This is necessary but not sufficient for charismatic epistemology to emerge.
Necessary+sufficient conditions are:

1. Subjectively compelling (even dispositive) evidentiary support for your specialness

2. Outlier success where your specialness is a non-trivial, salient explanatory factor.

3. Tested predictive power of "I am special"

4. Broad curiosity
If ALL 4 factors hold, you might develop a case of "charismatic epistemology."

What are the features of this epistemology?
The biggest feature is: believing the universe works in a way where your account of your own success is Exhibit A. I.e., the same laws that you think determined your successes also determine how the world itself works.
For example, if a core belief in your explanation of your own success is "optimism and daring shape events, and I won because I was optimistic and daring," you'll tend to explain everything else in terms of the effects of optimism and daring.
Several secondary features:

1. You "do your own research" a lot

2. You are acutely aware of, and sensitive to, the flaws in how other people think

3. When people disagree with you, you look for hidden agendas/backstories first, examine the merits of their arguments second
If this sounds similar to crackpot reasoning patterns, it is not an accident. It IS the same pattern, except that it is accompanied by high success and comes from your own life. For most crackpots, it is sort of "borrowed" from charismatic leaders they become true believers of.
Ie, charismatic epistemologies that are born of successful and "different" people attempting to explain their own lives to themselves "seed" the crackpot epistemologies of... much less impressive people. But let's set that aside. This is not about the crackpot echoes.
One interesting sign of this: the Peter Thiel "contrarian/heretic" question ("what important truth do very few people agree with you on?", i.e. what's your "secret") is designed to DETECT a charismatic epistemology at work. But it can't actually tell crackpots apart from the future outsize-success types.
So everybody who uses the "secret" question for diagnostic purposes tends to pair it with OTHER evidence to determine whether you're a genuine potential future success or a crackpot. Some people call it an "earned secret" rather than just a "secret."
Cf. the necessary and sufficient conditions above... it's bidirectional. If you satisfy conditions 1, 3, and 4, there's a stronger-than-random chance you'll develop condition 2 (outsize success)... and if you have a charismatic epistemology, chances are you got there via satisfying the 4 conditions.
The mechanics of this are mostly obvious, so I won't belabor them, but one mechanic is worth calling out: the "tested predictive power." People who develop charismatic epistemologies don't just make up just-so self-congratulatory theories about their past. They TEST them.
If you believe, for example, that having a green-dyed beard is a big part of your success, you'll actually try to prove this, by (for example) giving important speeches with and without a green-dyed beard. And these experiments will prove you right! Green beards lead to success!
Now this is where charismatic epistemologies start to turn cancerous, and any genuine explanatory power starts turning into your personal pseudoscience. Now why might your "green beard" theory prove out?
The thing is, success is non-ergodic! Once you start succeeding, you lose the ability to systematically experiment with your own success factors because it has *already* altered your decision environment!
If you

a) have a reputation for success, however strong,

b) have been telling a story about it with confidence and self-assurance, and

c) have had people believe in, adapt to, and respond to that story...

Well... it's a self-validating theory.
This is the "reality distortion field." It takes a while for it to grow to Steve Jobs size and power, but it is visible fairly early on a success trajectory of any sort. I've met enough people "before" and "after" that I've now seen it multiple times.
The more you succeed, the more people around you will adopt complementary patterns of effective behavior to ride your coat-tails, and the more they'll reinforce the theory of success. This will make the next time you tell your story even MORE confident and self-assured.
It's a bit like a stock market effect, where a stock goes up and up and up because more money pours in and, up to a point, the rise itself actually makes future success more likely.
Notice the positive feedback effect here: you are confidently espousing a theory of success, backed by a track record... the confidence and success attract ever-more talented people hoping to leverage your "formula" to succeed themselves. Proven winners attract likely winners.
The problem here is that your chances of being *right* about the world don't actually increase at the same rate as your chances of *succeeding* in the world. You're going viral like a meme, not converging on an ever-truer theory of the world.
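A toy sketch to make that divergence concrete (entirely illustrative; the model and all the numbers are made up, not anything from above): suppose the pet "green beard" factor has zero real effect, but each round of visible winning converts a few more observers into adapters who validate you by default. The measured win rate climbs; accuracy about the wider world doesn't move.

```python
# Toy model of the RDF feedback loop (illustrative only; all numbers invented).
# The "green beard" theory has zero real effect, but observers who have adapted
# to the winner's reputation confirm it anyway, so the measured win rate climbs
# while accuracy about the wider world stays flat.
import random

random.seed(0)

BASE_ACCURACY = 0.5   # real hit rate on questions the RDF can't reach
OBSERVERS = 100       # people in the local neighborhood

def simulate(rounds=10):
    adapted = 0  # observers who have bought into the success story
    for r in range(1, rounds + 1):
        # Adapted observers validate by default; the rest are a coin flip.
        wins = adapted + sum(random.random() < BASE_ACCURACY
                             for _ in range(OBSERVERS - adapted))
        win_rate = wins / OBSERVERS
        # Visible winning recruits more adapters next round (positive feedback).
        adapted = min(OBSERVERS, adapted + int(win_rate * 10))
        print(f"round {r:2d}: measured win rate {win_rate:.2f}, "
              f"true accuracy {BASE_ACCURACY:.2f}")

simulate()
```

The exact recruitment rule doesn't matter; as long as winning recruits validators faster than it recruits genuine information, the two curves separate.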
l'il break here for dinner (stuffed arepas 🥳) to be continued...
Ok, to continue. Here's where the crackpot connection comes in. A charismatic epistemology is something like a flat-earth theory that's not just approximately true locally; it becomes more and more exactly true over time.
This is because success has a "flattening" effect in the immediate neighborhood. The world around you adapts to you through the progressive effect of a strengthening reality-distortion field. It grows to include a core group, a larger group, a company, a market...
Before we examine the implications of this, a couple more pieces of evidence/illustration.

First, the Amazon Leadership principle #4: "Good leaders are right, a lot."

amazon.jobs/en/principles
This is, in my experience... basically true. Good leaders are weirdly, presciently right. In a series of uncertain, ambiguous decisions, where rational analysis might expect, say, a 30% hit rate, they're hitting 80-90%, even 100% for brief periods.
This is more than Taleb's "fooled by randomness" effect because it's not like betting on the market. This is "being right a lot" due to the world having arranged itself in your local neighborhood to consistently prove your theory right.
Second piece of... not evidence, but... illuminating narrative false-color from a perceptive observer: Douglas Adams, in the Hitchhiker's Guide books.
Zaphod Beeblebrox goes into the Total Perspective Vortex and, much to the amazement of the others, does NOT come out stark raving mad from having gazed into the void and realized his own utter insignificance (which is what the machine is supposed to do)
Instead he comes out feeling validated: the TPV validates what he already believes: he is the most important guy in the universe.

The reason, in the book, is simple: they are in a universe that was created for Zaphod so he's obviously the most important person in it.
So successful people develop a genuine local Midas touch that makes them BOTH winners and "right a lot" ... in their sphere of influence/within their reality distortion field. Which can be as big as a large corporation or a third of a nation, and last as long as 10-15 years.
These are crazy conditions, right? No normal human mind could get through the stress of that much rightness, that much winning, without developing an epistemology based on the locally simplest Occam's razor: your theory of your own success is right, as far as the eye can see.
Worse, you NEED this epistemology to function. It has to become an efficient System 1, thin-slicing decision system that's second nature, because your success has now created oversubscription.
You have a queue winding around the block, with people hoping your oracular Midas touch can do its presciently "true and winning" thing with whatever you're bringing to that success party. You have to apply your success theory in fast mode, like a trained deep neural net.
To be clear, this is not something ordinary people experience. It's not the normal fundamental attribution error that average mediocre types might develop. It's that on steroids, plus several other dynamics creating a perfect storm of high confidence belief in yourself.
Regular people don't experience this. We are always aware of the zone and strength of our "rightness" and "winning." If we wander outside, we instantly doubt ourselves and incline towards trusting others over ourselves when it sounds like they know what they're talking about.
A person in the grip of a CE, otoh, has to work extraordinarily hard to even find the boundary of their rightness/winningness, because it is so big and comprehensive.
This is the reason, for example, for that folktale trope of the king who disguises himself as a commoner to wander the city at night: he's trying to see the world without the reality distortion field in effect, since his presence alters whatever he's trying to observe accurately.
Now here's where things start to go wrong, due to factor 4 -- broad curiosity. Charismatic epistemology wouldn't lead to problems if it sort of stayed contained within its zone of strong truth. But generative and talented people are rarely that narrow in outlook.
So when they slow down with age or tire of their formulaic success pattern, and let their mind wander to things far afield... they find it really hard to suspend their RDF and walk the streets of reality like commoners in rags.
At this point, it's hard to illustrate what happens next without either making up fake examples, or subtweeting real people which is why I was reluctant to do this thread 🤣
Suffice it to say, their "win rate" starts dropping sharply, and their ability to "be right a lot" requires more and more... epicycles. They never quite get to entertaining the possibility that they're outside their zone.
Prognosis:

-- win rate drops
-- takes more and more work to be less and less right
-- theories get more convoluted, with more epicycles, less of that sweet Occam's-razor-sharp elegance
Here a funny thing happens, involving others in their RDF. Haters and critics are as irrelevant to the epistemology as ever, because they've already been proven wrong long ago during the apogee of the CE's success.

But true believers... hmm there's trouble with them.
The thing is, the ability of true believers to do their various validating things for the charismatic epistemology depends on the theories arising from that epistemology being elegant, simple, powerful, and effective.
As the CE starts to get overextended and develop epicyclic clutter, it becomes hard for true believers to even pretend in the Winning and Rightness theater, because it's too complicated now.
A dark cartoon version of this played out, for example, in QAnon circles leading up to, and beyond, election night. Something very like this happens around every late-stage charismatic epistemology that is wandering abroad, far from where the world is flat.
The central charismatic figure at this point can react in many ways.

Some react by progressively cutting off people who are starting to stumble. The circle of "good people" starts to shrink and tighten. More and more people get cut out. It's a LIFO effect: last in, first out.
The healthiest reaction is to recognize what's going on, and *realize that there is no easy way out.* The only way out is to take a looong break where you stop running the success script entirely, and wait for the RDF to wane and dissipate and ultimately collapse.
This will happen because if you just stop using the charismatic epistemology, it will stop feeding large groups of people and stop growing in strength. My guess is that a field that takes 10 years to build up takes about 1-2 years to dissipate.
The only way to accelerate the field decay is to make a radical leap into a new endeavor where your CE is not just slightly fraying at the edges, but wildly wrong in ways that make you a fumbling beginner from day 1.
Many level-headed big successes seem to do this naturally. They reboot in ways that make them a beginner again. A good heuristic for doing this is to find another, equally successful person whose CE is dramatically different from yours.
There's a lot more to say... including about dark descents into hell, phenomena that arise in response like "hater" patterns, Big Man Straussian theorizing cottage industries... it's a whole extended universe.
But I don't want to write a whole grand unified thesis here. So I'll close with a personal angle. One reason I've developed these theories is that I've spent a lot of time around people with charismatic epistemologies, and built something of a career out of pen-testing them.
The reason I have this career is that many people in the early stages of success, with a modest RDF, dimly recognize the risks and want it challenged: enough to prevent cancerous over-extension, but not so much that it gets undermined even where/when it works.
Haters and actually hostile critics are like actual hackers in this picture. They’ll actually cause damage if you let them into the RDF out of naive belief in “critical skepticism.” Their anger and resentment will seek to replace the RDF with a self-destructive alternative.
You want someone who is not hostile but also has relative natural immunity to the RDF by virtue of lacking the ability to profit from it. If you were not a talented engineer/designer you’d have been less susceptible to Jobs’ RDF: you can’t work for it, it can’t work for you.
But there are 2 other reasons I ended up accidentally being a connoisseur and wrangler of charismatic epistemologies: mediocrity and social media.
First, I spent ~20 years, age 15 to 35, having my sheer ordinariness and mediocrity drilled into me. I wasn't right a lot, I wasn't winning a lot. I wasn't wrong a lot, I wasn't losing a lot. I had an average amount of good and bad luck.
For about 5 minutes after making it into IIT (which all of India believes is a Special Thing), I believed I was special. That self-congratulation party ended rapidly when I found myself strictly in the middle of the distribution there, with legit special geniuses all around.
Second, at age 35, I became "internet famous" via a viral blog post. And the sheer mind-boggling vacuity and inconsequentiality of that "arrival" served as a vaccine against ever developing a charismatic epistemology myself.
People think I’m humble-bragging or being self-deprecating when I insist on my own mediocrity. It’s not. Do you know what it means to have 45k followers or a blog with a 14y history of multiple viral hits, and many famous friends?

It’s about the same as being a middle manager 🤣
But because people believe internet reputational currency is worth a lot more than it is, you get what I think of as a “fake reality distortion field” or fRDF that can’t actually generate the winning and rightness of the real thing.
I often get mentioned in the same breath as people with comparable social media reputations but a lot more behind them.

One sign: startup rubes sometimes assume I’m rich and approach me for funding. Not intros, actual funding.

Also various requests for magic I can’t perform.
This is like a vaccine. Having an fRDF with middle-manager mediocrity behind it means you get inoculated against developing a charismatic epistemology yourself.

It’s not always effective. Curiously it’s in the online minor leagues that people develop the most CE from fRDFs.
But once you make it to D-list, the sharp disconnect between the optics and the reality inoculates you.
But to bring it back to macro… the world is now awash in Big Man charismatic epistemologies and colliding reality distortions. It is a world of weird epistemic feudalism. There is no banal public epistemology that mundanely pursues rightness without charismatic epiphenomena.
The larger emergent charismatic epistemology that was emanating from Mount Davos has collapsed after Gollum Trump jumped into it with the One Ring — the White House charismatic epistemology.

And with it, 40 years of neoliberal harmony has shattered into a thousand warring satrapies.
Haha you thought “science” was the uncharismatic truth-seeking epistemology, but that got TEDified and sucked into its own charisma vortex.

There’s nothing left. No commons-based public domain epistemology to navigate by. Just a Straussian War of Titans bumping RDFs.
I’ll stop here. Tldr: this is happening and shaping the world. It’s neither good nor bad. It just is, like homelessness and nimbyism and religion and droughts and floods and wildfires. Part of the phenomenology of wild earth. Hope you enjoyed the faux-Attenborough documentary.
