Carl Sagan’s Baloney Detection Kit, a set of cognitive tools and techniques that fortify the mind against penetration by falsehoods:
1. Wherever possible there must be independent confirmation of the “facts.”
2. Encourage substantive debate on the evidence by knowledgeable proponents of all points of view.
3. Arguments from authority carry little weight — “authorities” have made mistakes in the past. They will do so again in the future. Perhaps a better way to say it is that in science there are no authorities; at most, there are experts.
4. Spin more than one hypothesis. If there’s something to be explained, think of all the different ways in which it could be explained. Then think of tests by which you might systematically disprove each of the alternatives.
4a. What survives, the hypothesis that resists disproof in this Darwinian selection among “multiple working hypotheses,” has a much better chance of being the right answer than if you had simply run with the first idea that caught your fancy.
5. Try not to get overly attached to a hypothesis just because it’s yours. It’s only a way station in the pursuit of knowledge. Ask yourself why you like the idea. Compare it fairly with the alternatives. See if you can find reasons for rejecting it. If you don’t, others will.
6. Quantify. If whatever it is you’re explaining has some measure, some numerical quantity attached to it, you’ll be much better able to discriminate among competing hypotheses. What is vague and qualitative is open to many explanations.
7. If there’s a chain of argument, every link in the chain must work (including the premise) — not just most of them.
8. Occam’s Razor. This convenient rule-of-thumb urges us when faced with two hypotheses that explain the data equally well to choose the simpler.
9. Always ask whether the hypothesis can be, at least in principle, falsified. Propositions that are untestable, unfalsifiable are not worth much.
Just as important as these tools of bullshit detection is learning to spot the common fallacies that are the modus operandi of every bullshitter...
1. ad hominem — Latin for “to the man,” attacking the arguer and not the argument (e.g., The Reverend Dr. Smith is a known Biblical fundamentalist, so her objections to evolution need not be taken seriously)
2. argument from authority (e.g., President Richard Nixon should be re-elected because he has a secret plan to end the war in Southeast Asia — but because it was secret, there was no way for the electorate to evaluate it on its merits;
2a. the argument amounted to trusting him because he was President: a mistake, as it turned out)
3. argument from adverse consequences (e.g., A God meting out punishment and reward must exist, because if He didn’t, society would be much more lawless and dangerous — perhaps even ungovernable.)
4. appeal to ignorance — the claim that whatever has not been proved false must be true, and vice versa (e.g., There is no compelling evidence that UFOs are not visiting the Earth; therefore UFOs exist — and there is intelligent life elsewhere in the Universe.
4a. Or: There may be seventy kazillion other worlds, but not one is known to have the moral advancement of the Earth, so we’re still central to the Universe.) This impatience with ambiguity can be criticized in the phrase: absence of evidence is not evidence of absence.
5. special pleading, often to rescue a proposition in deep rhetorical trouble (e.g., How can a merciful God condemn future generations to torment because, against orders, one woman induced one man to eat an apple? Special plead: you don’t understand the Doctrine of Free Will.)
6. begging the question, also called assuming the answer (e.g., We must institute the death penalty to discourage violent crime. But does the violent crime rate in fact fall when the death penalty is imposed?)
7. observational selection, also called the enumeration of favorable circumstances, or as the philosopher Francis Bacon described it, counting the hits and forgetting the misses (e.g., A state boasts of the Presidents it has produced, but is silent on its serial killers)
8. statistics of small numbers — a close relative of observational selection (e.g., “They say 1 out of every 5 people is Chinese. How is this possible? I know hundreds of people, and none of them is Chinese. Yours truly.”; see the quick calculation after this list)
9. misunderstanding of the nature of statistics (e.g., President Dwight Eisenhower expressing astonishment and alarm on discovering that fully half of all Americans have below average intelligence; see the small worked example after this list);
10. inconsistency (e.g., Prudently plan for the worst of which a potential military adversary is capable, but thriftily ignore scientific projections on environmental dangers because they’re not “proved.”
10a. Or: Attribute the declining life expectancy in the former Soviet Union to the failures of communism many years ago, but never attribute the high infant mortality rate in the United States (now highest of the major industrial nations) to the failures of capitalism.
10b. Or: Consider it reasonable for the Universe to continue to exist forever into the future, but judge absurd the possibility that it has infinite duration into the past);
11. non sequitur — Latin for “It doesn’t follow” (e.g., Our nation will prevail because God is great. But nearly every nation pretends this to be true; the German formulation was “Gott mit uns”).
12. post hoc, ergo propter hoc — Latin for “It happened after, so it was caused by” (e.g., Jaime Cardinal Sin, Archbishop of Manila: “I know of … a 26-year-old who looks 60 because she takes [contraceptive] pills.” Or: Before women got the vote, there were no nuclear weapons)
13. meaningless question (e.g., What happens when an irresistible force meets an immovable object? But if there is such a thing as an irresistible force there can be no immovable objects, and vice versa)
14. excluded middle, or false dichotomy — considering only the two extremes in a continuum of intermediate possibilities (e.g., “Sure, take his side; my husband’s perfect; I’m always wrong.” Or: “If you’re not part of the solution, you’re part of the problem”)
15. short-term vs. long-term — a subset of the excluded middle, but so important I’ve pulled it out for special attention (e.g., We can’t afford programs to feed malnourished children and educate pre-school kids. We need to urgently deal with crime on the streets.
15a. Or: Why explore space or pursue fundamental science when we have so huge a budget deficit?)
16. slippery slope, related to excluded middle (e.g., If we allow abortion in the first weeks of pregnancy, it will be impossible to prevent the killing of a full-term infant.
16a. Or, conversely: If the state prohibits abortion even in the ninth month, it will soon be telling us what to do with our bodies around the time of conception);
17. confusion of correlation and causation (e.g., A survey shows that more college graduates are homosexual than those with lesser education; therefore education makes people gay.
17a. Or: Andean earthquakes are correlated with closest approaches of the planet Uranus; therefore — despite the absence of any such correlation for the nearer, more massive planet Jupiter — the latter causes the former)
18. straw man—caricaturing a position to make it easier to attack (Scientists suppose that living things simply fell together by chance — a formulation that willfully ignores the central Darwinian insight, that Nature ratchets up by saving what works and discarding what doesn’t.
18a. Or — this is also a short-term/long-term fallacy — environmentalists care more for snail darters and spotted owls than they do for people)
19. suppressed evidence, or half-truths (e.g., An amazingly accurate and widely quoted “prophecy” of the assassination attempt on President Reagan is shown on television; but — an important detail — was it recorded before or after the event?)
20. weasel words (e.g., The separation of powers of the U.S. Constitution specifies that the United States may not conduct a war without a declaration by Congress. On the other hand, Presidents are given control of foreign policy and the conduct of wars,
20a. which are potentially powerful tools for getting themselves re-elected. Presidents of either political party may therefore be tempted to arrange wars while waving the flag and calling the wars something else — “police actions,” “armed incursions,”
20b. “protective reaction strikes,” “pacification,” “safeguarding American interests,” and a wide variety of “operations,” such as “Operation Just Cause.” Euphemisms for war are one of a broad class of reinventions of language for political purposes.
20c. Talleyrand said, “An important art of politicians is to find new names for institutions which under old names have become odious to the public”)
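On the statistics of small numbers (item 8 above), a quick back-of-the-envelope calculation shows why the letter writer’s “hundreds of people” prove nothing. This is only an illustrative sketch in Python; the 1-in-5 figure, the sample size of 200, and the (false) assumption that one’s acquaintances are a random draw from all of humanity are mine, not Sagan’s:

```python
# Illustrative sketch only: assumed base rate, assumed sample size, and the
# counterfactual assumption that acquaintances are a random sample of humanity.
p = 0.2   # claimed share of the world's people who are Chinese
n = 200   # "hundreds of people" the letter writer knows

prob_none = (1 - p) ** n
print(f"P(zero Chinese among {n} randomly chosen people) = {prob_none:.1e}")
# ~4e-20: essentially impossible for a random sample. So the writer's circle
# of acquaintances is a small, biased sample, and it says nothing about the
# worldwide rate.
```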
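On item 9, the Eisenhower quip, the point is definitional: about half of any population sits below the median by construction, and for skewed quantities even more than half can sit below the mean. A minimal Python sketch with made-up numbers (not real test data):

```python
# Made-up scores, deliberately skewed by one outlier.
scores = [85, 90, 95, 98, 100, 105, 110, 115, 200]

mean = sum(scores) / len(scores)            # pulled upward by the outlier
median = sorted(scores)[len(scores) // 2]   # middle value

below_mean = sum(s < mean for s in scores)
below_median = sum(s < median for s in scores)

print(f"mean = {mean:.1f}, median = {median}")
print(f"{below_median} of {len(scores)} are below the median (about half, by definition)")
print(f"{below_mean} of {len(scores)} are below the mean (more than half, because of the skew)")
# For a roughly symmetric quantity like IQ, mean and median nearly coincide,
# so "half of all Americans are below average" is a tautology, not news.
```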