Gordon Pennycook
Apr 6, 2021 · 16 tweets
New review in @TrendsCognSci “The Psychology of Fake News” w @DG_Rand

We synthesize research on belief in, sharing of, & interventions against misinformation, with a focus on false/misleading news cell.com/trends/cogniti…

And now a thread that synthesizes our synthesis!
1/
We make three major points in the paper, which I will summarize here. There are also several other elements that may be of interest: e.g., a short review of the prevalence of fake news, a discussion of the heuristics people rely on (such as familiarity), and more!
The first major point is that, contrary to narratives that focus on political partisanship/motivated reasoning, we find that a lot can be explained by mere lazy thinking (overreliance on intuition).
For example, individual differences in reflective thinking (measured via the Cognitive Reflection Test, among others) are consistently associated with a greater ability to distinguish between true and false news, *regardless* of whether the news is consistent *or* inconsistent with one's political ideology.
In a combined analysis of 14 studies (N > 15,000), the effect of cognitive reflection on discernment is twice as large as the effect of political consistency (which, incidentally, shows that people are *better* at distinguishing between true and false news when it is ideologically *consistent*).
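To make the measure concrete: "truth discernment" here is scored as belief in true headlines minus belief in false headlines, and the comparison above amounts to comparing coefficients on that score. A minimal illustrative sketch in Python, with simulated data standing in for the paper's actual data and models:

import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 1000
crt = rng.normal(size=n)                  # standardized Cognitive Reflection Test score
consistent = rng.integers(0, 2, size=n)   # 1 = headlines match the participant's ideology

# Hypothetical belief ratings (0-1 scale) for true and for false headlines
belief_true = 0.55 + 0.10 * crt + 0.05 * consistent + rng.normal(0, 0.10, n)
belief_false = 0.35 - 0.10 * crt + 0.05 * consistent + rng.normal(0, 0.10, n)

# Truth discernment = belief in true news minus belief in false news
discernment = belief_true - belief_false

X = sm.add_constant(np.column_stack([crt, consistent]))
fit = sm.OLS(discernment, X).fit()
print(fit.params)  # compare the reflection coefficient with the consistency coefficient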
There's also experimental evidence that supports the conclusion that analytic thinking is associated with increased truth discernment and not increased polarization (as assumed by motivated reasoning accounts). See:
Of course, if one looks at *overall* belief, people find politically consistent news more plausible regardless of whether it's true or false. So people believe things that are consistent with their ideology, and (separately) being reflective is associated with more *accurate* beliefs.
Importantly, the general bias toward believing things that are consistent with one's ideology is not evidence of a causal role of political identity per se. I can't go into detail in so few characters, but the short story is that there are many confounds.
The 2nd major point is that social media sharing does not necessarily imply belief. People are often quite good at distinguishing between true and false news when asked to do so directly. However, when deciding what to share, they barely distinguish at all.

I.e., RT!=endorsement is actually true
This indicates that, again, the spread of fake news may be driven (to some extent) by mere inattention to accuracy. People may be getting distracted from thinking about accuracy when deciding what to share (Note: People *say* that accuracy is important to them)
This brings me to the third major point: Points 1 & 2 indicate that people may make better choices if they slow down and consider accuracy before sharing. And, in fact, there is good evidence that this is the case. See:
The broader conclusion is that interventions against misinformation should be informed by an understanding of the underlying psychological mechanisms. Things that intuitively seem that they may work may not be effective (and vice versa). nytimes.com/2020/03/24/opi…
For example, people are surprisingly good at distinguishing between high- and low-quality news sources (when asked to do so; that doesn't mean they do it in practice, see misinforeview.hks.harvard.edu/wp-content/upl…), and crowdsourced judgments of news headlines could be used to inform algorithms.
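As a rough illustration of how such crowd ratings could feed an algorithm (a hypothetical sketch, not how any platform actually ranks content; ratings are aggregated per source here, but headline-level ratings work analogously):

from collections import defaultdict

# Hypothetical crowd ratings: (news domain, trust rating on a 1-5 scale)
ratings = [
    ("example-reliable.com", 4.5), ("example-reliable.com", 4.0),
    ("example-clickbait.com", 1.5), ("example-clickbait.com", 2.0),
]

by_domain = defaultdict(list)
for domain, score in ratings:
    by_domain[domain].append(score)
crowd_quality = {d: sum(s) / len(s) for d, s in by_domain.items()}

def rank_score(engagement: float, domain: str) -> float:
    """Blend engagement with crowdsourced source quality; the 50/50 weights are arbitrary."""
    quality = crowd_quality.get(domain, 3.0) / 5.0  # unrated domains default to neutral
    return 0.5 * engagement + 0.5 * quality

print(rank_score(0.8, "example-reliable.com"))    # boosted by a trusted source
print(rank_score(0.8, "example-clickbait.com"))   # down-weighted despite equal engagement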
Another example is that fact-checks that directly follow a news headline actually work better than ones that come directly before the headline (a lot of people would assume that preparing people mentally for falsehood works better, but it doesn’t)
There is still so much to learn about this topic, though. For the psychologists in the crowd, misinformation reveals a lot about how our minds work. And, at the same time, it’s a nascent area where we need research to inform policy. Hopefully, this review is outdated in 5 years!
This paper is a capstone of the amazingly fun collaboration between me and @DG_Rand (+ many others) that started back in 2016. We're still going and hopefully will be for a long time! A frequently updated list of our misinfo (and related) projects is here: docs.google.com/document/d/1k2…

More from @GordPennycook

Jun 26, 2023
This is a really interesting look into the margins: How effective are misinfo interventions for content that is selected to be highly consistent with values that believers hold sacred?

Accuracy prompts are unlikely to work when false beliefs are really strong.

A short 🧵 on why
Accuracy prompts are short interventions that subtly remind people to think about accuracy. They work, in general, because there's a disconnect between belief and sharing: people share content that they would be able to recognize as false if accuracy were the focus of their attention.
What this implies is that the effect is greatest for content that is most readily discernible as false. I.e., there's a strong correlation between the effect size and how subjectively plausible the content in question is.

From nature.com/articles/s4158…
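A toy illustration of that item-level claim (hypothetical numbers, not the paper's data): if the least plausible headlines show the largest prompt effects, plausibility and effect size will be strongly (negatively) correlated.

import numpy as np

# Hypothetical per-headline values: perceived plausibility (0-1) and the
# accuracy-prompt effect on sharing discernment for that headline
plausibility = np.array([0.15, 0.25, 0.40, 0.55, 0.70, 0.85])
prompt_effect = np.array([0.12, 0.10, 0.07, 0.05, 0.02, 0.00])

r = np.corrcoef(plausibility, prompt_effect)[0, 1]
print(round(r, 2))  # strongly negative: bigger effects for less plausible (more clearly false) items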
Dec 6, 2022
Conspiracy believers know that their beliefs are on the fringe… right?

Wrong!

We (@JabinBinnendyk, @DG_Rand) find that conspiracy believers massively overestimate how much others agree with them. Why? They are more likely to be overconfident people. 🧵

psyarxiv.com/d5fz2
Conspiracy belief is often explained as a response to various needs and motivations, such as the need to be unique (onlinelibrary.wiley.com/doi/full/10.10…). Other work argues that believers are particularly intuitive (sciencedirect.com/science/articl…).

We argue that *overconfidence* is important
In 8 studies with >4k U.S. adults (all online), we find consistent evidence that conspiracy believers are more overconfident (irrespective of reasoning skills, need for uniqueness, and narcissism).

But what do I mean by "overconfident"?
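The preview cuts off before the answer, but a common way the construct is operationalized (a sketch under that assumption; the paper's exact measure may differ) is the gap between how well people think they performed and how well they actually performed:

def overconfidence(perceived_pct: float, actual_pct: float) -> float:
    """Positive = overconfident, negative = underconfident (illustrative scoring only)."""
    return perceived_pct - actual_pct

# A participant who thinks they got 90% of a reasoning task right but actually got 60%
print(overconfidence(perceived_pct=90, actual_pct=60))  # 30 percentage points overconfident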
Mar 18, 2021
Warning: Sentimentality ahead

I'm not someone who publishes papers in Nature. I'm just not.

And it's not just this paper, of course. This is just the thing that caused me to reflect on my life and how absolutely bonkers this all still is for me.

So, I thought I would share.
First, I need to give props to @DG_Rand. He is an absolute hero. You know how some PIs just slap their name on work done by junior collaborators? Not Dave. If anything, he takes LESS credit than he deserves. He's also just the best person.

But, anyway, back to me
I grew up on a farm in northern Saskatchewan. It failed & forced my parents to work several jobs. As a kid, I (and my 4 siblings) helped my parents do janitorial work from Grade 1 to Grade 9. This was not at all abnormal to me: On a farm you do chores, so that became our chores. (I'm on the far right)
Nov 18, 2020
There has been a surge of behavioral research on misinformation & "fake news". To synthesize things, @DG_Rand & I wrote a systematic review: psyarxiv.com/ar96c

We take a cognitive/social psych perspective, but we tried to cast a wide net for the review. Feedback welcome!
Sorry to those who retweeted an earlier version of this tweet that I deleted because the image preview was too zoomed in
There's too much in the review to cover in a tweet thread, but here are some of the take-aways that we thought to be particularly important...
Nov 8, 2020
We're likely to face an unprecedented situation where the incumbent refuses to concede. Although it may not be necessary, things would certainly be easier if Republicans viewed the election as legitimate.

How uphill of a battle will this be? Well, I ran a study with @DG_Rand...
Study was run on Prolific & Lucid on Friday. In total, we have 509 Biden voters & 218 Trump voters. The samples are *not* nationally representative and a bit small. But, some fairly clear results came out.

More info on the sample:
A key initial Q is about people's priors. Do Trump voters believe it is *unlikely* that Biden won?

The answer is yes.

Reminder: This study was run on Friday when Biden was already well ahead & very likely to win. That he would win the popular vote was *never* in question.
