Last week the World Economic Forum published its 'Global Risks Report' identifying misinformation and disinformation as the *top global threats over the next two years*. In this essay I argue that its ranking is either wrong or so confused it's not even wrong: conspicuouscognition.com/p/misinformati…
Some background: Since 2016 (Brexit and the election of Trump), policy makers, experts, and social scientists have been gripped by panic about the political harms of disinformation and misinformation. Against this backdrop, the World Economic Forum's ranking is not surprising.
Responses to the ranking were polarised. It's fair to say that @NateSilver538 👇, who linked my essay in support of his view, was not a fan. conspicuouscognition.com/p/misinformati…
Understandably, many leading misinformation researchers disagreed with Silver's assessment. They argued that misinformation *is* the top global threat because it causes or exacerbates all other threats, including war.
In my essay, I argue that on a narrow, technical understanding of dis/misinformation, the ranking seems completely wrong - bizarre, even. There is simply not good evidence that dis/misinformation, so understood, is a great threat, let alone the greatest.
In response to this, one might argue that this conclusion only follows from an overly strict definition of dis/misinformation. There is no doubt that many of humanity's problems arise in large part because people don't have good, accurate models of reality.
Maybe, then, we can just use the term 'misinformation' to refer to whatever factors cause people - whether ordinary people or those in positions of power - to have bad beliefs and make bad decisions.
As attractive as this line of reasoning is, I argue that it is ultimately confused. It is - in the famous put-down of physicist Wolfgang Pauli - "not even wrong". There are three basic reasons for this:
1. Once we understand 'misinformation' to include subtle ways in which narratives, ideologies, and systems bias people's priorities and perceptions of reality, experts and elites at Davos are definitely not in a privileged position to identify misinformation.
2. Using labels like 'dis/misinformation' to refer to the extremely complex set of psychological, social, political, and institutional factors that cause people to hold bad beliefs and make bad decisions does not help us to understand the world or address global threats.
3. On extremely expansive understandings of the terms 'dis/misinformation', they are chronic features of the human condition and politics, and there is no reason to think they have gotten worse in recent years as a result of social media or AI. In some ways, they've improved!
That's all from me. Thanks for reading this far. You can subscribe to my weekly essays here: conspicuouscognition.com
In a new article, I document how claims about a sinister "censorship industrial complex" involve preposterous exaggerations reliant on misrepresentations, omissions, low-quality reporting, smear campaigns, and conspiracy theorising. (1/8) conspicuouscognition.com/p/there-is-no-…
Why does it matter if claims about a "censorship industrial complex" are true or false? Because these accusations are being used to justify Trump's Big Lie about 2020 election fraud and paint Democrats as the "real threat to democracy." (2/8)
Yes, social media companies have made bad censorship decisions (e.g., Hunter Biden laptop story, lab leak theory, heterodox views during the pandemic). Yes, government officials sometimes pressure platforms to remove content. These are real issues that should be criticized. (3/8)
Many claim Trump's anti-democratic threat is exaggerated. Hopefully they’re right. But if the threat turns out to be real, such people will not admit they were wrong because that would mean admitting their culpability. Instead, they will find ways to rationalise events.
That’s one of many reasons to be highly vigilant regarding authoritarian threats. Many Trump supporters seem to think, “If there’s a clear turn towards authoritarianism, I will oppose it.” But everything we know about psychology and history suggests that most supporters won’t.
That doesn't mean one should be paranoid or alarmist. But nobody could seriously claim there is nothing to be concerned about here. Trump quite literally tried to steal an election, and many supporters quickly found ways to downplay, excuse, and rationalise what happened.
- Points to people saying, believing, and doing bad things
- Assumes (without evidence) social media is main cause
- Assumes (without evidence) things are worse now than in the past
- Explains bad beliefs by claiming vast swathes of Americans don't care about truth or reality
All supported by a mix of anecdata, baseless speculation, alarmism, and the implicit assumption that exposure to misinfo = belief. Also, as part of its supporting evidence, it links to a tweet with ... 5 likes, where most of the comments are telling the person what an idiot they are.
Just as there is a market for unsupported alarmist narratives of the sort reported in the article, there is a market for unsupported alarmist narratives of the sort exemplified by the article - a never-ending stream of content published in elite outlets fitting the same template.
The idea that online censorship poses a greater threat to American democracy than Trump's literal attempt to steal an election by peddling baseless claims about election fraud - which he continues to do - is beyond absurd.
I've written multiple pieces criticising censorship, as well as bad content moderation policies by social media companies (which is really a different thing). And I agree Dems are naive and often bad on these issues.
But to suggest that current or proposed government policies surrounding censorship are a greater threat to democracy than the behaviour of Trump and Trumpists betrays a complete lack of intellectual or moral seriousness.
In this new post I highlight my five favourite academic articles from last year, which range over topics like political ideology, religion, misinformation, reputation management, and intellectual humility. Thread: 1/6 conspicuouscognition.com/p/my-five-favo…
#1: 'Strange Bedfellows: The Alliance Theory of Political Belief Systems' by @DavidPinsof, David Sears, and @haselton. A brilliant, evolutionarily plausible, parsimonious - and deeply cynical - theory of political belief systems. 2/6 tandfonline.com/doi/full/10.10…
#2: 'Prosocial religions as folk-technologies of mutual policing' by @LFitouchi, @mnvrsngh, @jbaptistandre, and Nicolas Baumard. In my view the most plausible theory yet advanced for understanding religion and its connection to cooperation. 3/6 osf.io/preprints/psya…
Should we trust misinformation experts to decide what counts as misinformation? In this new essay I give some reasons for scepticism. Thread: (1/15) conspicuouscognition.com/p/should-we-tr…
First, some context. A few weeks ago I published an essay 👇 arguing that misinformation research confronts a dilemma when it comes to answering its most basic definitional question: What *is* misinformation? 2/15 conspicuouscognition.com/p/misinformati…
- On very narrow definitions, misinfo appears to be relatively rare and largely symptomatic of other problems, at least in western democracies.
- On broader definitions, misinfo is so widespread that applications of the concept will inevitably be highly selective & biased. 3/15