Thread by David Chapman, 10 tweets, 6 min read
“Normalization of deviance,” a conceptual framework developed by organizational sociologist Diane Vaughan, might help explain the current shaky state of science.

sma.nasa.gov/docs/default-s…
Deviance from systematic error-prevention rules gets normalized when a community of practice repeatedly recovers from near-disasters. That gives the impression that the rules were unreasonably strict, and that you can walk closer to the line next time… gnssn.iaea.org/NSNI/SC/TM_ITO…
Management pressures front-line workers to do more. Feeling overworked, they want to expend less effort where feasible. Safety checks are easy to jettison because, nearly always, nothing really bad happens.

gnssn.iaea.org/NSNI/SC/TM_ITO…
Management (in the science analogy, the senior figures in the field who edit the journals and make the funding decisions) justifies what would previously have been considered deviance using groupthink strategies, “for the benefit of the field overall.” sma.nasa.gov/docs/default-s…
Once deviance is normalized, most participants don’t even notice it. They learned the notional rules, but those are obviously and routinely ignored without any negative consequences, so the rules live in a separate mental compartment.
Defending against system failures (including those revealed in the science replication crisis) is feasible only when it’s part of everyone’s job.

how.complexsystems.fail
Individual failures are visible only to the people actually doing the work, not to higher-ups. Scientists in the trenches know most journal articles they see are nonsense (and must find informal ways of weeding them out). Editors measure citations instead. gnssn.iaea.org/NSNI/SC/TM_ITO…
The work of reform, to make a system resilient to sources of error, is necessarily meta-rational: working on the system from outside.

It necessarily depends on circumrational understanding: how the system interacts with its non-systematic environment.

how.complexsystems.fail
Understanding the details of individual cases of deviance is the first step toward a general understanding of failures in systematic processes. A culture of openness is key both to gaining that understanding and to ongoing error prevention.

ncbi.nlm.nih.gov/pubmed/25742063
“Safety clutter”: bureaucratic procedures adopted in the name of reducing risk that fail to reduce it, or may even increase it. h/t @Dioptre

The scientific grant process aims to avoid funding bad work, but doesn’t, and it significantly decreases the amount of good work.

tandfonline.com/doi/abs/10.108…