We synthesize research on belief in, sharing of, & interventions against misinformation, with a focus on false/misleading news: cell.com/trends/cogniti…
And now a thread that synthesizes our synthesis! 1/
We make three major points in the paper, which I will summarize here. However, there are several other elements to the paper that may be of interest. E.g. a short review on the prevalence of fake news, a discussion of the heuristics that people use, such as familiarity, and more!
The first major point is that, contrary to narratives that focus on political partisanship/motivated reasoning, we find that a lot can be explained by mere lazy thinking (overreliance on intuition).
E.g., individual differences in reflective thinking (via the Cognitive Reflection Test, + others) are consistently associated with an increased ability to distinguish b/w true/false news *regardless* of whether the news is consistent *or* inconsistent with political ideology.
In a combined analysis of 14 studies (N>15k), the effect of cognitive reflection on discernment is 2x larger than the effect of political consistency (which, anyway, shows that people are *better* at distinguishing between true/false news if it's ideologically *consistent*).
There's also experimental evidence that supports the conclusion that analytic thinking is associated with increased truth discernment and not increased polarization (as assumed by motivated reasoning accounts). See:
Of course, if one looks at *overall* belief, people find politically consistent news more plausible regardless of if it’s true or false. So, people believe things that are consistent with their ideology and (separately) being reflective is associated with more *accurate* beliefs.
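Since the thread leans on the distinction between *overall belief* and *discernment*, here's a minimal sketch of how the two are typically computed from headline ratings (the numbers and variable names are made up for illustration, not taken from the paper):

```python
import statistics

# Hypothetical belief ratings from one participant: each entry pairs a
# headline's actual veracity with how true the participant rated it
# (0 = definitely false, 1 = definitely true).
ratings = [
    {"veracity": "true",  "belief": 0.8},
    {"veracity": "true",  "belief": 0.6},
    {"veracity": "false", "belief": 0.3},
    {"veracity": "false", "belief": 0.1},
]

true_beliefs = [r["belief"] for r in ratings if r["veracity"] == "true"]
false_beliefs = [r["belief"] for r in ratings if r["veracity"] == "false"]

# Overall belief: average rating across all headlines, true and false alike.
overall_belief = statistics.mean(true_beliefs + false_beliefs)

# Truth discernment: how much MORE the participant believes true headlines
# than false ones. Ideological consistency shifts overall belief up or down;
# reflective thinking is what predicts higher discernment.
discernment = statistics.mean(true_beliefs) - statistics.mean(false_beliefs)
```

On these made-up numbers, overall belief comes out to about 0.45 and discernment to about 0.5; a partisan bias would raise overall belief for congenial headlines without necessarily changing discernment.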
Importantly, the general bias toward believing things that are consistent with one’s ideology is not evidence of a causal role of political identities per se. Can’t go into detail with so few characters, but the short story is that there are many confounds
The 2nd major point is that social media sharing does not necessarily imply belief. People are often quite good at distinguishing between true/false news when asked to do so directly. However, when deciding what to *share*, they discern far less between true and false.
I.e., "RT != endorsement" is actually true.
This indicates that, again, the spread of fake news may be driven (to some extent) by mere inattention to accuracy. People may be getting distracted from thinking about accuracy when deciding what to share (Note: People *say* that accuracy is important to them)
This brings me to the third major point: Points 1&2 indicate that maybe people can make better choices if they slow down and consider accuracy before sharing. And, in fact, there is good evidence that this is the case. See:
The broader conclusion is that interventions against misinformation should be informed by an understanding of the underlying psychological mechanisms. Things that intuitively seem like they should work may not be effective (and vice versa). nytimes.com/2020/03/24/opi…
E.g., people are surprisingly good at distinguishing between high/low quality news sources (when asked to do so – this doesn't mean they do it in practice, see misinforeview.hks.harvard.edu/wp-content/upl…) & crowdsourced judgments of news headlines could be used to inform algorithms.
Another example is that fact-checks that directly follow a news headline actually work better than ones that come directly before the headline (many would assume that mentally preparing people for a falsehood works better, but it doesn't).
There is still so much to learn about this topic, though. For the psychologists in the crowd, misinformation reveals a lot about how our minds work. And, at the same time, it’s a nascent area where we need research to inform policy. Hopefully, this review is outdated in 5 years!
This paper is a capstone of the amazingly fun collaboration between me and @DG_Rand (+ many others) that started back in 2016. We're still going and hopefully will be for a long time! A frequently updated list of our misinfo (and related) projects is here: docs.google.com/document/d/1k2…
This is a really interesting look into the margins: How effective are misinfo interventions for content that is selected to be highly consistent with values they consider sacred?
Accuracy prompts are unlikely to work when false beliefs are really strong.
Accuracy prompts are short interventions that subtly remind people to think about accuracy. This works, in general, b/c there's a disconnect between belief & sharing: People share content that they would be able to recognize as false if accuracy were the focus of their attention
What this implies is that the effect is greatest for content that is most readily discernible as false. I.e., the effect size is strongly tied to how subjectively *implausible* the content in question is: the less plausible the content, the bigger the effect.
Conspiracy believers know that their beliefs are on the fringe… right?
Wrong!
We (@JabinBinnendyk, @DG_Rand) find that conspiracy believers massively overestimate how much others agree with them. Why? They are more likely to be overconfident people. 🧵
In 8 studies with >4k U.S. adults (all online), we find consistent evidence that conspiracy believers are more overconfident (irrespective of reasoning skills, need for uniqueness, and narcissism).
I'm not someone who publishes papers in Nature. I'm just not.
And it's not just this paper, of course. This is just the thing that caused me to reflect on my life and how absolutely bonkers this all still is for me.
First, I need to give props to @DG_Rand. He is an absolute hero. You know how some PI's just slap their name on work that has been done by junior collaborators? Not Dave. If anything, he takes LESS credit than he deserves. He's also just the best person.
But, anyway, back to me
I grew up on a farm in northern Saskatchewan. It failed & forced my parents to work several jobs. As a kid, I (and my 4 siblings) helped my parents do janitorial work from Grade 1 to Grade 9. This was not at all abnormal to me: On a farm you do chores, so that became our chores.
There has been a surge of behavioral research on misinformation & "fake news". To synthesize things, @DG_Rand & I wrote a systematic review: psyarxiv.com/ar96c
We take a cognitive/social psych perspective, but we tried to cast a wide net for the review. Feedback welcome!
Sorry to those who retweeted an earlier version of this tweet that I deleted because the image preview was too zoomed in
There's too much in the review to cover in a tweet thread, but here are some of the take-aways that we thought to be particularly important...
We're likely to face an unprecedented situation where the incumbent refuses to concede. Although it may not be necessary, things would certainly be easier if Republicans viewed the election as legitimate.
How uphill of a battle will this be? Well, I ran a study with @DG_Rand...
Study was run on Prolific & Lucid on Friday. In total, we have 509 Biden voters & 218 Trump voters. The samples are *not* nationally representative and a bit small. But, some fairly clear results came out.
A key initial Q is about people's priors. Do Trump voters believe it is *unlikely* that Biden won?
The answer is yes.
Reminder: This study was run on Friday when Biden was already well ahead & very likely to win. That he would win the popular vote was *never* in question.