Statistical things to worry about *less*
1) significance of univariable associations
2) significant model goodness-of-fit tests
3) imbalance in randomized trials
4) non-normality of observations
5) multicollinearity
1) As a precursor to multivariable analyses, the association between each individual covariate and the outcome is often "screened" for significance. This often does more harm than good, so don't bother doing it or worrying about it…
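A small simulation can illustrate why screening misleads (hypothetical numbers; plain NumPy least squares). A variable can have a weak univariable association with the outcome yet carry a strong coefficient in the multivariable model, so significance screening would wrongly discard it:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500
# two strongly correlated covariates (assumed rho = 0.95, chosen for illustration)
x1 = rng.normal(size=n)
x2 = 0.95 * x1 + np.sqrt(1 - 0.95**2) * rng.normal(size=n)
# outcome depends on BOTH covariates, with opposite signs
y = x1 - x2 + 0.1 * rng.normal(size=n)

# univariable "screen": the marginal correlation of x2 with y is weak...
r_marginal = np.corrcoef(y, x2)[0, 1]

# ...but in the multivariable model x2 has a coefficient near -1
X = np.column_stack([np.ones(n), x1, x2])
beta = np.linalg.lstsq(X, y, rcond=None)[0]
print(round(r_marginal, 2), np.round(beta[1:], 2))
```

A univariable screen keyed on the size of `r_marginal` would drop x2, even though the multivariable model needs it.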
2) Every model is a simplification of reality: perfect model fit isn't a real thing. The question is not *if* but *how much* imperfection.

E.g., by evaluating a hierarchy of calibration measures, one doesn't have to worry about significant Hosmer-Lemeshow tests…
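One step in such a hierarchy, the calibration slope, can be sketched as follows (a toy simulation with made-up numbers; the Newton-Raphson logistic fit is hand-rolled only to keep the example self-contained). Overconfident predictions show up directly as a slope well below 1, which says far more than a Hosmer-Lemeshow p-value:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 20000
lp = rng.normal(0, 1, n)                       # true linear predictor
y = rng.binomial(1, 1 / (1 + np.exp(-lp)))     # binary outcome

# an overfitted model gives too-extreme predictions: logits stretched by 1.5
lp_hat = 1.5 * lp

def logistic_fit(x, y, iters=25):
    """Newton-Raphson for a logistic model y ~ a + b*x."""
    X = np.column_stack([np.ones_like(x), x])
    beta = np.zeros(2)
    for _ in range(iters):
        p = 1 / (1 + np.exp(-X @ beta))
        grad = X.T @ (y - p)
        hess = X.T @ (X * (p * (1 - p))[:, None])
        beta += np.linalg.solve(hess, grad)
    return beta

# calibration: regress the outcome on the model's predicted logits
intercept, slope = logistic_fit(lp_hat, y)
print(round(slope, 2))   # well below 1 -> predictions too extreme
```

A slope near 1 (and intercept near 0) indicates good calibration; here the stretched logits yield a slope of roughly 1/1.5.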
3) Randomization might be the closest thing to magic. Thanks to randomization (and inferential statistics), we don't have to worry about random imbalances in baseline characteristics. Hooray!…
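A quick simulation makes the point concrete (hypothetical trial sizes and effect sizes). Every simulated trial has some chance imbalance in a strongly prognostic baseline covariate, yet across trials the unadjusted test keeps its nominal 5% type-I error rate:

```python
import numpy as np

rng = np.random.default_rng(2)
reps, n = 4000, 50                      # 4000 null trials, 50 patients per arm
rejections = 0
for _ in range(reps):
    x = rng.normal(size=2 * n)          # prognostic baseline covariate
    y = x + rng.normal(size=2 * n)      # outcome driven by x; NO treatment effect
    arm = rng.permutation(2 * n) < n    # 1:1 randomization
    diff = y[arm].mean() - y[~arm].mean()
    se = np.sqrt(y[arm].var(ddof=1) / n + y[~arm].var(ddof=1) / n)
    rejections += abs(diff / se) > 1.96
print(rejections / reps)                # close to the nominal 0.05
```

Each individual trial is somewhat imbalanced on x, but the standard error already accounts for that randomness, so no extra worry (or baseline significance testing) is needed.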
4) Normality is a rare shape for data. Fortunately, we don't need data to be shaped like that for most of our analyses. No worries about skewed distributions; if anything, modeling assumptions are usually about the shape of the residuals rather than of the observations…
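A sketch of that distinction (simulated data with assumed effect sizes): a skewed covariate makes the outcome skewed, yet the residuals, which are what the linear-model assumption is actually about, remain normal:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 5000
x = rng.exponential(size=n)            # heavily skewed covariate
y = 2 * x + rng.normal(size=n)         # residuals ARE normal by construction

def skew(v):
    """Sample skewness: zero for symmetric (e.g. normal) data."""
    v = v - v.mean()
    return (v**3).mean() / (v**2).mean() ** 1.5

# the outcome is visibly skewed; the residuals are not
X = np.column_stack([np.ones(n), x])
resid = y - X @ np.linalg.lstsq(X, y, rcond=None)[0]
print(round(skew(y), 2), round(skew(resid), 2))
```

Inspecting a histogram of y here would suggest a "normality problem" where none exists.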
5) Multicollinearity might be the most overrated problem in statistics. While it can cause imprecision in the regression coefficients of the collinear variables (which shows clearly in the results as large standard errors), the remaining coefficients and the model's predictions are typically unaffected…
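This can be made concrete with a small simulation (hypothetical correlation of 0.99, chosen to make the collinearity severe): across repeated samples, the individual coefficients of the collinear pair swing wildly, while a prediction from the model barely moves:

```python
import numpy as np

rng = np.random.default_rng(4)
n = 200
betas, preds = [], []
for _ in range(500):
    x1 = rng.normal(size=n)
    x2 = 0.99 * x1 + np.sqrt(1 - 0.99**2) * rng.normal(size=n)  # near-collinear
    y = x1 + x2 + rng.normal(size=n)
    X = np.column_stack([np.ones(n), x1, x2])
    b = np.linalg.lstsq(X, y, rcond=None)[0]
    betas.append(b[1])                       # coefficient of x1
    preds.append(b[0] + b[1] + b[2])         # prediction at x1 = x2 = 1

# individual coefficients are unstable; predictions are stable
print(round(np.std(betas), 2), round(np.std(preds), 2))
```

So if the goal is prediction (or inference on the non-collinear covariates), multicollinearity among the others is usually not worth losing sleep over.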
Now that we have 5 statistical things to worry about less, maybe we can worry a bit more about formulating researchable questions, what we measured, what and who we didn't measure (missing data), sample sizes, unnecessary dichotomizations, overfitting, et cetera