Dan Williams
Dec 8, 2023 · 15 tweets · 3 min read
New essay: I argue that misinformation is often better viewed as a symptom of deep societal problems rather than their cause. When that’s true, interventions like debunking and censorship are unlikely to help – and might make things worse. (1/15)
iai.tv/articles/misin…
The central intuition driving the modern misinformation panic is that people – specifically *other* people 👇 – are gullible and hence easily brainwashed into holding false beliefs. This idea is wrong. (2/15) journals.sagepub.com/doi/abs/10.117…
People are discriminating and suspicious learners, if anything placing too much weight on their own views relative to those of others. Persuasion is therefore extremely difficult, and even intense propaganda campaigns and advertising efforts routinely have minimal effects 👇 (3/15)
To many commentators and social scientists, this fact is difficult to accept. If people are not gullible and persuasion is difficult, what explains extraordinary popular delusions and bizarre conspiracy theories? (4/15)
This question rests on a confused but widespread assumption: that the truth is always self-evident and desirable, such that false beliefs can only arise from the credulous acceptance of misinformation. (5/15)
First, the truth about complex issues is not self-evident, and people interpret the world through intuitions and interpretive dispositions that are often pre-scientific. To overcome these inclinations, they must encounter and trust reliable information. (6/15)
However, a minority actively distrust mainstream epistemic institutions (e.g., science, public health, mainstream media), which leads them to reject reliable information and expert consensus and to seek out content – often wrong – from counter-establishment sources. (7/15)
And second, humans are not disinterested truth seekers. Much misinformation arises from factors such as affective polarisation and anti-establishment worldviews, which create widespread demand for content that demonises outgroups and elites. (8/15)
Factors such as institutional distrust, polarisation, and rationalisation markets imply a picture in which – as Dan Kahan once put it – “misinformation is not something that happens to the mass public but rather something its members are complicit in producing.” (9/15)
When that is true, standard technocratic tactics for dealing with misinformation – such as prebunking, debunking, fact-checking, and censorship – are unlikely to be effective, and censorship specifically is likely to exacerbate the problems it aims to address. (10/15)
If so, what might help? First, rather than investing so much into preventing the gullible masses from being brainwashed into holding bad ideas, it is far more important to win trust in institutions, including by *making them more trustworthy*. (11/15) osf.io/preprints/psya…
As @Musa_alGharbi points out 👇, institutional distrust is often understandable and sometimes justified. (12/15)
theguardian.com/commentisfree/…
More broadly, policies should aim at addressing the social conditions that make people avid consumers of misinformation. Intense polarisation and steep social inequalities create an inevitable demand for hyperbolic narratives that demonise outgroups and elites. (13/15)
Of course, that's easier said than done, but I hope this short essay helps to highlight some limitations, problems, and opportunity costs associated with the way in which the topic of misinformation is often understood in science and society. (14/15)
The essay draws from and links to the work of many, including @hugoreasoning, @Sacha_Altay (from whom I first heard the framing that misinfo is a symptom), @acerbialberto, @JoeUscinski, and @Musa_alGharbi. (15/15)

• • •

More from @danwilliamsphil

Nov 4, 2024
In a new article, I document how claims about a sinister "censorship industrial complex" involve preposterous exaggerations reliant on misrepresentations, omissions, low-quality reporting, smear campaigns, and conspiracy theorising. (1/8) conspicuouscognition.com/p/there-is-no-…
Why does it matter if claims about a "censorship industrial complex" are true or false? Because these accusations are being used to justify Trump's Big Lie about 2020 election fraud and paint Democrats as the "real threat to democracy." (2/8)
Yes, social media companies have made bad censorship decisions (e.g., the Hunter Biden laptop story, the lab leak theory, heterodox views during the pandemic). Yes, government officials sometimes pressure platforms to remove content. These are real issues that should be criticised. (3/8)
Oct 28, 2024
Many claim Trump's anti-democratic threat is exaggerated. Hopefully they’re right. But if the threat turns out to be real, such people will not admit they were wrong because that would mean admitting their culpability. Instead, they will find ways to rationalise events.
That’s one of many reasons to be highly vigilant regarding authoritarian threats. Many Trump supporters seem to think, “If there’s a clear turn towards authoritarianism, I will oppose it.” But everything we know about psychology and history suggests that most supporters won’t.
That doesn't mean one should be paranoid or alarmist. But nobody could seriously claim there is nothing to be concerned about here. Trump quite literally tried to steal an election, and many supporters quickly found ways to downplay, excuse, and rationalise what happened.
Oct 13, 2024
- Points to people saying, believing, and doing bad things
- Assumes (without evidence) social media is main cause
- Assumes (without evidence) things are worse now than in the past
- Explains bad beliefs by claiming vast swathes of Americans don't care about truth or reality
All supported by a mix of anecdata, baseless speculation, alarmism, and the implicit assumption that exposure to misinfo = belief. Also, as part of its supporting evidence, it links to a tweet with ... 5 likes, where most of the comments are telling the person what an idiot they are.
Just as there is a market for unsupported alarmist narratives of the sort reported in the article, there is a market for unsupported alarmist narratives of the sort exemplified by the article - a never-ending stream of content published in elite outlets fitting the same template.
Oct 2, 2024
The idea that online censorship poses a greater threat to American democracy than Trump's literal attempt to steal an election by peddling baseless claims about election fraud - which he continues to do - is beyond absurd.
I've written multiple pieces criticising censorship, as well as bad content moderation policies by social media companies (which is really a different thing). And I agree Dems are naive and often bad on these issues.
But to suggest current or proposed government policies surrounding censorship are a greater threat to democracy than Trump and Trumpists' behaviour betrays a complete lack of intellectual or moral seriousness.
Feb 20, 2024
In this new post I highlight my five favourite academic articles from last year, which range over topics like political ideology, religion, misinformation, reputation management, and intellectual humility. Thread: 1/6 conspicuouscognition.com/p/my-five-favo…
#1: 'Strange Bedfellows: The Alliance Theory of Political Belief Systems' by @DavidPinsof, David Sears, and @haselton. A brilliant, evolutionarily plausible, parsimonious – and deeply cynical – theory of political belief systems. 2/6 tandfonline.com/doi/full/10.10…
#2: 'Prosocial religions as folk-technologies of mutual policing' by @LFitouchi, @mnvrsngh, @jbaptistandre, and Nicolas Baumard. In my view the most plausible theory yet advanced for understanding religion and its connection to cooperation. 3/6 osf.io/preprints/psya…
Feb 14, 2024
Should we trust misinformation experts to decide what counts as misinformation? In this new essay I give some reasons for scepticism. Thread: (1/15) conspicuouscognition.com/p/should-we-tr…
First, some context. A few weeks ago I published an essay 👇 arguing that misinformation research confronts a dilemma when it comes to answering its most basic definitional question: What *is* misinformation? 2/15
conspicuouscognition.com/p/misinformati…
- On very narrow definitions, misinfo appears to be relatively rare and largely symptomatic of other problems, at least in western democracies.
- On broader definitions, misinfo is so widespread that applications of the concept will inevitably be highly selective & biased. 3/15