Ian David Moss
Aug 20, 2019 · 22 tweets
You can always rely on @nonprofitssay to tell the hard truths, and it turns out that this particular one is backed up by research. Thread:
Earlier this year I highlighted a study from @CEPData and @Eval_Innovation that revealed the futility of many foundations' evaluation efforts. The numbers from that report are eye-popping, but it's admittedly only one study. How much stock to put in it? 1/
Fortunately, there are other studies out there. And on the whole, they strongly reinforce the point that people with influence over social policy simply don't read or use the vast majority of the knowledge we produce, no matter how relevant it is. 2/
One of my favorite factoids of all time comes from a study the @WorldBank conducted on its own policy papers several years ago. The methodology was simple: researchers just counted the number of times each paper had been downloaded between 2008 and 2012. documents.worldbank.org/curated/en/387… 3/
They found that three-quarters of these papers, each one likely representing hundreds of hours and thousands of dollars of investment on the institution's part, were downloaded fewer than 100 times. Nearly a third had never been downloaded EVEN ONCE! Not even by their authors! 4/
This was all chronicled in a Washington Post piece by @_cingraham with the unforgettable title, "The Solutions to All Our Problems May Be Buried in PDFs Nobody Reads." Touché! washingtonpost.com/news/wonk/wp/2… 5/
Research that asks policymakers and philanthropists about their reading habits tells a similar story. I wrote last year about the @Hewlett_Found's "Peer to Peer" study, which surveyed more than 700 foundation professionals in the United States. 6/
Funders responding to that survey report being completely overwhelmed with information, to the point where some of them just delete emails announcing new reports and studies without even skimming them first to see if they’re relevant. 7/
In a study of over 1,600 civil servants in Pakistan and India by @EPoDHarvard, policymakers "agreed that evidence should play a greater role in decision-making" but acknowledged that it doesn't. washingtonpost.com/news/monkey-ca… 8/
According to the study, the issues are structural. "Few [respondents] mentioned that they had trouble getting data, research, or relevant studies. Rather, they said...that they had to make decisions too quickly to consult evidence and that they weren’t rewarded when they did." 9/
And the topline finding of an @EHPSAprog/@INASPinfo study looking at HIV policymakers in eastern and southern Africa is that "policymakers value evidence but they may not have time to use it." ehpsa.org/critical-revie… 10/
What about front-line practitioners? A US survey found doctors generally don't follow research relevant to their practice area, and when research comes out that challenges the way they do their work, they expect their medical associations to attack it. washingtonpost.com/news/monkey-ca… 11/
The UK's @EducEndowFoundn conducted an RCT to test strategies for getting evidence in front of schoolteachers. They tried online research summaries, magazines, webinars, and conferences. None of these methods had any measurable effect on student outcomes. educationendowmentfoundation.org.uk/news/light-tou… 12/
It's important to note that none of this is news to the people whose job it is to generate and advocate for the use of evidence. In my experience, the vast majority know this is a huge problem and have their own stories to tell. 13/
For example, this report from the 2017 Latin American Evidence Week @evidencialatam decried the "operational disconnect [that] makes it impossible for evidence generated at the implementation level to feed into policy (and programme) design." onthinktanks.org/articles/latin… 14/
Or consider the 125 social sector leaders interviewed for @Deloitte @MntrInstitute's Reimagining Measurement initiative, who identified “more effectively putting decision-making at the center” as the sector's top priority for the next decade. www2.deloitte.com/us/en/pages/mo… 15/
The consistent theme in all these readings: it's really tough to get policymakers and other people in power to use evidence, especially when it challenges their beliefs. Very, very rarely will evidence on its own actually influence the views or actions of influential people. 16/
What's really astounding about this is how much time, money, and attention we spend on evidence-building activities for apparently so little concrete gain. I mean, we could just throw up our hands and say, "well, no one's going to read this stuff anyway, so why bother?" 17/
But we don't. Instead, it's probably not an exaggeration to say that our society invests millions of hours and billions of precious dollars toward generating knowledge about the social sector...most of which will have literally zero impact. 18/
So we are either vastly overvaluing or vastly undervaluing evidence. We need to get it right. Because those are real resources that could be spent elsewhere, and the world is falling apart around us while we get lost in our spreadsheets and community engagement methodologies. 19/
Don't get me wrong. I believe in science. We can learn so much just from better understanding and connecting the work we've already done! At the same time, there's so much more we could be doing to ensure we get bang for our evidence buck. E.g.: cep.org/why-your-hard-… 20/
Thanks for listening. Sometime soon I'll post about what people have done to try to increase evidence use, and what seems to work. /fin

