Forgive me, for I am about to Bayes. Lesson: Don't trust intuition, for even simple prior+likelihood scenarios defy it. Four examples below, each producing radically different posteriors. Can you guess what each does? Revealed in next tweet >>
Huzzah! Posterior distributions in red. The shape of the tails, which isn't so obvious to the eye, can do weird but logical things.
Gotta go to a meeting, but I will return to explain each of the four above later!
These are combinations of normal (Gaussian) & student-t (df=2) distributions. The Gaussian has very thin tails; the Student-t has thicker tails (quick tail-mass comparison sketched after the panel list below).
Top-left: normal prior, normal likelihood
Top-right: student, student
Bottom-left: student, normal
Bottom-right: normal, student
>>
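To see just how different those tails are, here's a quick sketch (my own, not from the thread) comparing the tail probabilities of a Normal(0,1) and a Student-t with df=2:

```python
# Quick tail-mass comparison: Normal(0, 1) vs Student-t(df = 2)
from scipy import stats

for x in (2, 4, 6):
    p_norm = 2 * stats.norm.sf(x)      # P(|X| > x) under the Gaussian
    p_t    = 2 * stats.t.sf(x, df=2)   # P(|X| > x) under Student-t(2)
    print(f"P(|X| > {x}): normal {p_norm:.1e}, student-t(2) {p_t:.1e}")
```

By 6 standard units the Gaussian has essentially no mass left, while the Student-t still has a couple percent. That difference drives everything below.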
Normal prior, normal likelihood
y ~ Normal(mu,1)
mu ~ Normal(10,1)
The classic flavor of Bayesian updating - the posterior is a compromise between the prior and likelihood
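A minimal sketch of this conjugate update (my code, not the gist), assuming a single hypothetical observation y = 0, since the actual data value behind the plots isn't stated:

```python
# Conjugate normal-normal update for y ~ Normal(mu, 1), mu ~ Normal(10, 1),
# with one assumed observation y = 0
import math

prior_mean, prior_sd = 10.0, 1.0
sigma = 1.0        # known likelihood sd
y = 0.0            # hypothetical observation

prior_prec = 1 / prior_sd**2
like_prec  = 1 / sigma**2

post_prec = prior_prec + like_prec
post_mean = (prior_prec * prior_mean + like_prec * y) / post_prec
post_sd   = math.sqrt(1 / post_prec)

print(post_mean, post_sd)   # 5.0, 0.707... -- halfway between prior and data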
Student prior, student likelihood (df=2)
y ~ Student(2,mu,1)
mu ~ Student(2,10,1)
The two modes persist - the extra mass in the tails means each distribution finds the other's mode plausible enough that the average isn't the best "compromise"
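A grid-approximation sketch of this student-student case (again my own code with an assumed y = 0, not the thread's figure):

```python
import numpy as np
from scipy import stats
from scipy.signal import find_peaks

mu = np.linspace(-5, 15, 2001)   # grid over the parameter
y = 0.0                          # hypothetical single observation

# Unnormalized posterior = likelihood * prior, evaluated on the grid
post = stats.t.pdf(y, df=2, loc=mu, scale=1) * stats.t.pdf(mu, df=2, loc=10, scale=1)
post /= post.sum() * (mu[1] - mu[0])   # normalize on the grid

peaks, _ = find_peaks(post)
print(mu[peaks])   # two modes: one near the data, one near the prior's 10
```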
Student prior, normal likelihood
y ~ Normal(mu,1)
mu ~ Student(2,10,1)
Now the likelihood dominates - its thin tails are very skeptical of the prior, but the prior's thick tails are not so surprised by the likelihood (a grid sketch covering both mixed cases follows the next example)
Normal prior, student likelihood
y ~ Student(2,mu,1)
mu ~ Normal(10,1)
Now the prior dominates - the reasoning is the same as in the previous example, just in reverse
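The same grid sketch for the two mixed cases (my own code, still assuming a single observation y = 0):

```python
import numpy as np
from scipy import stats

mu = np.linspace(-5, 15, 2001)
y = 0.0   # hypothetical observation

def grid_posterior(like, prior):
    post = like * prior
    return post / (post.sum() * (mu[1] - mu[0]))   # normalize on the grid

# Student prior, normal likelihood: the thin-tailed likelihood wins
post_sn = grid_posterior(stats.norm.pdf(y, loc=mu, scale=1),
                         stats.t.pdf(mu, df=2, loc=10, scale=1))

# Normal prior, student likelihood: the thin-tailed prior wins
post_ns = grid_posterior(stats.t.pdf(y, df=2, loc=mu, scale=1),
                         stats.norm.pdf(mu, loc=10, scale=1))

print(mu[np.argmax(post_sn)], mu[np.argmax(post_ns)])   # near 0 and near 10
```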
Here's the code to reproduce:
The tail differences are easier to see on log scale. If I get some time later today, will make a version showing that.
gist.github.com/rmcelreath/39d…
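In the meantime, a bare-bones version of that log-scale view (mine, not the promised follow-up):

```python
# Densities on a log scale: the Gaussian falls off quadratically,
# the Student-t(2) only polynomially
import numpy as np
import matplotlib.pyplot as plt
from scipy import stats

x = np.linspace(-10, 10, 1001)
plt.plot(x, stats.norm.pdf(x), label="Normal(0, 1)")
plt.plot(x, stats.t.pdf(x, df=2), label="Student-t(df=2)")
plt.yscale("log")
plt.legend()
plt.show()
```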
Week 1 of Statistical Rethinking 2023 is done. Here are the memes from this week's lectures
The Spidermen: Causal inference, descriptive studies, and research design are alike in that they all depend upon some generative/scientific model of how the sample was produced
"Correlation implies causation" is obviously (?) wrong. "Correlation does not imply causation" is true, but not helpful. Causation does not imply correlation either. *sad trombone* "Reality is a simulation" is a joke, BUT when we simulate causation, what are we simulating?
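A tiny simulation of that last point (my illustration, not from the lecture): y is entirely caused by x, yet the correlation is essentially zero.

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(size=100_000)
y = x**2 + rng.normal(scale=0.1, size=100_000)   # y is caused by x, symmetrically

print(np.corrcoef(x, y)[0, 1])   # close to zero despite the causal link
```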
Okay so I made a transparent gif of Brandenburger Hasselhoff, in case anyone wants to add him to other historical events. Here he is e.g. at Zeppelinfeld in 1945
Working with a colleague on some household income data, where work is irregular. As usual, I start by writing a synthetic data simulation to talk through with the colleague. Helps to ensure I understand the problem right. Also brings up fun (for me) issues like sources of measurement error.
In this case, the income data are reported by respondents and almost certainly suffer from rounding and heaping. It's the little things like this that make even simple exercises not so simple.
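A hedged sketch of what such a synthetic simulation might look like (my own illustration - the distributions and rates are made up, not the colleague's data or the actual script):

```python
import numpy as np

rng = np.random.default_rng(7)
n = 500

# Irregular work: number of months worked varies by household
months_worked = rng.binomial(12, 0.7, size=n)
true_income = months_worked * rng.lognormal(mean=6.0, sigma=0.5, size=n)

# Reporting: respondents round, and some heap on coarser "nice" values
reported = np.round(true_income, -2)                    # nearest 100
heapers = rng.random(n) < 0.3                           # ~30% heap coarsely
reported[heapers] = np.round(true_income[heapers], -3)  # nearest 1000

print(true_income[:5].round(1))
print(reported[:5])
```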
Also beginning to worry I am a weird sort of economist now, since half of my recent projects are household income data and I've started using the word "elasticity" in casual conversation.
Many performers of music cannot read it. Okay. There are other, often more intuitive, ways to learn music.
Scientists perform stat models. Most scientists cannot read them. This is less OK, but there are other ways to learn models.
Short thread in which I strain this comparison
If you don't read music, the Rzewski excerpt above (left) is meaningless. If you do, it is perfectly clear. You'd read it not by each individual note, but through higher structure like chords & arpeggio patterns & progression.
It's not the notes so much as their relationships.
If you don't read math stats, the social network model above (right) is mostly meaningless. But again, once you can read these models, you read them in chunks, through their grammar and phrasing.
It's not the variables so much as their relationships.
In my dept today, I gave a bare-minimum proof of why natural selection can favor strategies that do not maximize reproductive rate, provided variance is also reduced. Some papers to start with, if this literature is unfamiliar: >
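A toy numerical version of the argument (mine, not from the papers): with multiplicative growth, a strategy with a lower arithmetic mean but lower variance can have the higher long-run (geometric mean) fitness.

```python
import numpy as np

rng = np.random.default_rng(0)
T = 10_000   # generations of multiplicative growth

risky = rng.choice([1.6, 0.6], size=T)   # arithmetic mean 1.10, high variance
safe  = rng.choice([1.1, 1.0], size=T)   # arithmetic mean 1.05, low variance

geo = lambda w: np.exp(np.mean(np.log(w)))   # geometric mean growth factor
print(geo(risky), geo(safe))   # risky ~0.98 (shrinks!), safe ~1.05 (grows)
```

The risky strategy has the higher arithmetic mean yet shrinks in the long run, because geometric mean fitness is roughly the arithmetic mean minus half the variance.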