Dan Williams · Dec 8, 2023 · 15 tweets
New essay: I argue that misinformation is often better viewed as a symptom of deep societal problems rather than their cause. When that’s true, interventions like debunking and censorship are unlikely to help – and might make things worse. (1/15)
iai.tv/articles/misin…
The central intuition driving the modern misinformation panic is that people – specifically *other* people 👇 – are gullible and hence easily brainwashed into holding false beliefs. This idea is wrong. (2/15) journals.sagepub.com/doi/abs/10.117…
People are discriminating and suspicious learners, if anything placing too much weight on their own views relative to those of others. Persuasion is therefore extremely difficult, and even intense propaganda campaigns and advertising efforts routinely have minimal effects 👇 (3/15)
To many commentators and social scientists, this fact is difficult to accept. If people are not gullible and persuasion is difficult, what explains extraordinary popular delusions and bizarre conspiracy theories? (4/15)
This question rests on a confused but widespread assumption: that the truth is always self-evident and desirable, such that false beliefs can only arise from the credulous acceptance of misinformation. (5/15)
But first, the truth about complex issues is not self-evident, and people interpret the world through intuitions and interpretive dispositions that are often pre-scientific. To overcome these inclinations, they must encounter and trust reliable information. (6/15)
However, a minority actively distrust mainstream epistemic institutions (e.g., science, public health, mainstream media), which causes them to reject reliable info and expert consensus and seek out content - often wrong - from counter-establishment sources. (7/15)
And second, humans are not disinterested truth seekers. Much misinformation arises from factors such as affective polarisation and anti-establishment worldviews, which create widespread demand for content that demonises outgroups and elites. (8/15)
Factors such as institutional distrust, polarisation, and rationalisation markets imply a picture in which – as Dan Kahan once put it – “misinformation is not something that happens to the mass public but rather something its members are complicit in producing.” (9/15)
When that is true, standard technocratic tactics for dealing with misinformation – such as prebunking, debunking, fact-checking, and censorship – are unlikely to be effective, and censorship specifically is likely to exacerbate the problems it aims to address. (10/15)
If so, what might help? First, rather than investing so much in preventing the gullible masses from being brainwashed into holding bad ideas, it is far more important to win trust in institutions, including by *making them more trustworthy*. (11/15) osf.io/preprints/psya…
As @Musa_alGharbi points out 👇, institutional distrust is often understandable and sometimes justified. (12/15)
theguardian.com/commentisfree/…
More broadly, policies should aim at addressing the social conditions that make people avid consumers of misinformation. Intense polarisation and steep social inequalities create an inevitable demand for hyperbolic narratives that demonise outgroups and elites. (13/15)
Of course, that's easier said than done, but I hope this short essay helps to highlight some limitations, problems, and opportunity costs associated with the way in which the topic of misinformation is often understood in science and society. (14/15)
The essay draws from and links to the work of many, including @hugoreasoning, @Sacha_Altay (from whom I first heard the framing that misinfo is a symptom), @acerbialberto, @JoeUscinski, and @Musa_alGharbi. (15/15)

More from @danwilliamsphil

Feb 20
In this new post I highlight my five favourite academic articles from last year, which range over topics like political ideology, religion, misinformation, reputation management, and intellectual humility. Thread: 1/6 conspicuouscognition.com/p/my-five-favo…
#1: 'Strange Bedfellows: The Alliance Theory of Political Belief Systems' by @DavidPinsof, David Sears, and @haselton. A brilliant, evolutionarily plausible, parsimonious - and deeply cynical - theory of political belief systems. 2/6 tandfonline.com/doi/full/10.10…
#2: 'Prosocial religions as folk-technologies of mutual policing' by @LFitouchi, @mnvrsngh, @jbaptistandre, and Nicolas Baumard. In my view the most plausible theory yet advanced for understanding religion and its connection to cooperation. 3/6 osf.io/preprints/psya…
Feb 14
Should we trust misinformation experts to decide what counts as misinformation? In this new essay I give some reasons for scepticism. Thread: (1/15) conspicuouscognition.com/p/should-we-tr…
First, some context. A few weeks ago I published an essay 👇 arguing that misinformation research confronts a dilemma when it comes to answering its most basic definitional question: What *is* misinformation? 2/15
conspicuouscognition.com/p/misinformati…
- On very narrow definitions, misinfo appears to be relatively rare and largely symptomatic of other problems, at least in western democracies.
- On broader definitions, misinfo is so widespread that applications of the concept will inevitably be highly selective & biased. 3/15
Jan 17
Last week the World Economic Forum published its 'Global Risk Report' identifying misinformation and disinformation as the *top global threats over the next two years*. In this essay I argue its ranking is either wrong or so confused it's not even wrong: conspicuouscognition.com/p/misinformati…
Some background: Since 2016 (Brexit and the election of Trump), policy makers, experts, and social scientists have been gripped by panic about the political harms of disinformation and misinformation. Against this backdrop, the World Economic Forum's ranking is not surprising.
Responses to the ranking were polarised. It's fair to say @NateSilver538 👇, who linked my essay in support of his view, was not a fan. conspicuouscognition.com/p/misinformati…
Jan 3
I'm going to be writing here 👇 ≈weekly this year. In this first post I introduce the blog, explain its title 'Conspicuous Cognition', and outline why the pursuit of social approval drives the evolutionary weirdness of human behaviour and thought. 1/7
conspicuouscognition.com/p/conspicuous-…
The blog is an experiment. I like writing, I like publicly engaging with ideas and debates, and now that I have a permanent academic job I can express and argue for controversial takes - which I will be doing - with less fear of consequences. 2/7
"Conspicuous cognition" is a play on Thorstein Veblen's term "conspicuous consumption", which refers to the consumption of goods and services not for practical or hedonic reasons but to signal one's wealth and status. 3/7
Nov 3, 2023
Dis/misinformation research confronts a dilemma: in order to be viewed as legitimate and win broad public support, it must restrict its focus to extremely clear-cut cases such as fake news, bizarre conspiracy theories, and easily demonstrable falsehoods. 1/11
The problem is that such content is rare and not very consequential. Many people ignore it or engage with it for reasons independent of belief, and the small minority of the population who seek it out do so because of pre-existing traits, beliefs and dispositions. 2/11
Such content is not purely epiphenomenal. It can have real effects which are worthy of study - but it simply does not warrant the enormous amount of attention and panic that the topic of misinformation currently receives. 3/11
Sep 5, 2023
According to this thread, anyone who is sceptical that emotionality (the tendency to evoke emotions) is a fingerprint of misinformation is simply ignorant of scientific consensus on the topic. I disagree and would like to explain why:
First, it's important to be clear about which claim is in dispute. If emotionality is a fingerprint of misinformation, misinformation must on average exhibit higher rates of emotionality than reliable information. That's a striking claim which, if true, has important implications.
Here are some things that obviously *cannot* support this claim. It is not enough to point to some instances of misinformation and show that they exhibit emotionality. Lots of reliable information exhibits emotionality and lots of misinformation is dispassionate.
