@engexplain@nntaleb 1/10
Science is assumed to be “evidence-based,” but that term alone doesn’t mean much. What constitutes good evidence? How is evidence being used? Is it supporting or refuting a hypothesis? Were the hypothesis and experimental design predetermined, or were they found ex post facto?
@engexplain@nntaleb 2/10
The reality is you can find “evidence” for almost any narrative: limit the sample size, cherry-pick studies, etc. Systematic reviews, meta-analyses, and randomized controlled trials are all susceptible to selective interpretation and the narrative fallacy.
@engexplain@nntaleb 3/10
At the heart of the problem is the over-reliance on simplistic statistical techniques that do little more than quantify 2 things moving together.
@engexplain@nntaleb 4/10
Take Pearson’s correlation, based on covariance. Variation can increase simultaneously across 2 variables for countless reasons, most of which are spurious. Yet this simple notion of “causality” undergirds much of scientific literature.
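A minimal sketch (not from the thread, using numpy) of the limitation being described: a variable that is *perfectly determined* by another can still show a Pearson correlation near zero, because covariance only registers linear co-movement.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, 10_000)
y = x ** 2  # y is completely determined by x, but the relationship is nonlinear

# Covariance of x and x**2 is E[x^3] - E[x]E[x^2] = 0 for a symmetric x,
# so Pearson's r comes out near zero despite total dependence.
r = np.corrcoef(x, y)[0, 1]
print(f"Pearson r = {r:.3f}")
```

The point is not that correlation is useless, but that a near-zero (or high) r says little by itself about dependence.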
@engexplain@nntaleb 5/10
Information-theoretic (entropy-based) approaches, on the other hand, can assess *general* measures of dependence. Rather than some specialized (linear) view based on concurrent variation, entropy captures the amount of information contained in and between variables.
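For concreteness (a minimal illustration, not from the thread): Shannon entropy of a discrete distribution is H(X) = −Σ p·log₂(p), the expected “surprise” per observation in bits.

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a discrete probability distribution p."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]  # by convention, 0 * log(0) contributes 0
    return -np.sum(p * np.log2(p))

print(entropy([0.5, 0.5]))  # fair coin: maximally uncertain, 1 bit
print(entropy([0.9, 0.1]))  # biased coin: more predictable, lower entropy
```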
@engexplain@nntaleb 6/10
If we were genuinely interested in giving the term “evidence” an authentic and reliable meaning, we would demand rigorous methods to underpin an assertion.
@engexplain@nntaleb 7/10
We wouldn’t look to conveniently simplistic methods to denote something as evidential; rather, we would look for a measure capable of assessing the expected amount of information held in a random variable. There is nothing more fundamental than information.
@engexplain@nntaleb 8/10
Consider Mutual Information (MI), which quantifies the amount of information obtained about one random variable by observing another. This observing of the relationship between variables is what measurement and evidence are all about.
@engexplain@nntaleb 9/10
MI measures how different the joint entropy is from the marginal entropies. If there is a genuine dependence between variables, we would expect the information gathered from all variables at once (the joint) to be less than the sum of the information from each variable taken separately (the marginals).
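That identity, MI = H(X) + H(Y) − H(X,Y), can be estimated directly from histogram counts. A minimal sketch (my example, not from the thread), reusing the nonlinear pair where Pearson’s r is near zero: MI comes out clearly positive.

```python
import numpy as np

def entropy_bits(counts):
    """Plug-in Shannon entropy in bits from an array of histogram counts."""
    counts = np.asarray(counts, dtype=float)
    p = counts[counts > 0] / counts.sum()
    return -np.sum(p * np.log2(p))

rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, 10_000)
y = x ** 2  # nonlinear dependence that covariance-based measures score near zero

# Bin the data to estimate the joint and marginal distributions.
joint, _, _ = np.histogram2d(x, y, bins=16)
h_x = entropy_bits(joint.sum(axis=1))   # marginal entropy of X
h_y = entropy_bits(joint.sum(axis=0))   # marginal entropy of Y
h_xy = entropy_bits(joint.ravel())      # joint entropy of (X, Y)

mi = h_x + h_y - h_xy  # > 0 signals dependence; 0 means independence
print(f"H(X)={h_x:.2f}  H(Y)={h_y:.2f}  H(X,Y)={h_xy:.2f}  MI={mi:.2f}")
```

Note this plug-in estimator is sensitive to the bin count and sample size; it is meant to show the structure of the calculation, not a production-grade estimator.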
@engexplain@nntaleb 10/10
If “evidence-based” science were genuinely invested in authentic measurement, it would leverage *general* measures of dependence, and that demands an approach rooted in information theory. Without entropy you’re just picking data, choosing a narrative, and calling it “evidence.”