Dan Williams
Philosopher, University of Sussex. Tweets in personal capacity. Interested in: Philosophy, Psychology, Society. Writes at: https://t.co/MniDhzFVe4
Nov 4 · 8 tweets · 3 min read
In a new article, I document how claims about a sinister "censorship industrial complex" involve preposterous exaggerations reliant on misrepresentations, omissions, low-quality reporting, smear campaigns, and conspiracy theorising. (1/8) conspicuouscognition.com/p/there-is-no-…

Why does it matter if claims about a "censorship industrial complex" are true or false? Because these accusations are being used to justify Trump's Big Lie about 2020 election fraud and paint Democrats as the "real threat to democracy." (2/8)
Oct 28 · 4 tweets · 1 min read
Many claim Trump's anti-democratic threat is exaggerated. Hopefully they’re right. But if the threat turns out to be real, such people will not admit they were wrong, because that would mean admitting their culpability. Instead, they will find ways to rationalise events. That’s one of many reasons to be highly vigilant regarding authoritarian threats.

Many Trump supporters seem to think, “If there’s a clear turn towards authoritarianism, I will oppose it.” But everything we know about psychology and history suggests that most supporters won’t.
Oct 13 · 5 tweets · 2 min read
- Points to people saying, believing, and doing bad things
- Assumes (without evidence) social media is main cause
- Assumes (without evidence) things are worse now than in the past
- Explains bad beliefs by claiming vast swathes of Americans don't care about truth or reality

All of this is supported by a mix of anecdata, baseless speculation, alarmism, and the implicit assumption that exposure to misinfo = belief. Also, as part of its supporting evidence, it links to a tweet with ... 5 likes, where most of the comments are telling the person what an idiot they are.
Oct 2 · 4 tweets · 1 min read
The idea that online censorship poses a greater threat to American democracy than Trump's literal attempt to steal an election by peddling baseless claims about election fraud - which he continues to do - is beyond absurd.

I've written multiple pieces criticising censorship, as well as bad content moderation policies by social media companies (which is really a different thing). And I agree Dems are naive and often bad on these issues.
Feb 20 · 6 tweets · 3 min read
In this new post I highlight my five favourite academic articles from last year, which range over topics like political ideology, religion, misinformation, reputation management, and intellectual humility. Thread: 1/6 conspicuouscognition.com/p/my-five-favo…
#1: 'Strange Bedfellows: The Alliance Theory of Political Belief Systems' by @DavidPinsof, David Sears, and @haselton. A brilliant, evolutionarily plausible, parsimonious - and deeply cynical - theory of political belief systems. 2/6 tandfonline.com/doi/full/10.10…
Feb 14 · 17 tweets · 3 min read
Should we trust misinformation experts to decide what counts as misinformation? In this new essay I give some reasons for scepticism. Thread: (1/15) conspicuouscognition.com/p/should-we-tr…
First, some context. A few weeks ago I published an essay 👇 arguing that misinformation research confronts a dilemma when it comes to answering its most basic definitional question: What *is* misinformation? 2/15
conspicuouscognition.com/p/misinformati…
Jan 17 · 12 tweets · 5 min read
Last week the World Economic Forum published its 'Global Risk Report' identifying misinformation and disinformation as the *top global threats over the next two years*. In this essay I argue its ranking is either wrong or so confused it's not even wrong: conspicuouscognition.com/p/misinformati…
Some background: Since 2016 (Brexit and the election of Trump), policy makers, experts, and social scientists have been gripped by panic about the political harms of disinformation and misinformation. Against this backdrop, the World Economic Forum's ranking is not surprising.
Jan 3 · 7 tweets · 3 min read
I'm going to be writing here 👇 ≈weekly this year. In this first post I introduce the blog, explain its title 'Conspicuous Cognition', and outline why the pursuit of social approval drives the evolutionary weirdness of human behaviour and thought. 1/7
conspicuouscognition.com/p/conspicuous-…
The blog is an experiment. I like writing, I like publicly engaging with ideas and debates, and now that I have a permanent academic job I can express and argue for controversial takes - which I will be doing - with less fear of consequences. 2/7
Dec 8, 2023 · 15 tweets · 3 min read
New essay: I argue that misinformation is often better viewed as a symptom of deep societal problems rather than their cause. When that’s true, interventions like debunking and censorship are unlikely to help – and might make things worse. (1/15)
iai.tv/articles/misin…

The central intuition driving the modern misinformation panic is that people – specifically *other* people 👇 – are gullible and hence easily brainwashed into holding false beliefs. This idea is wrong. (2/15) journals.sagepub.com/doi/abs/10.117…
Nov 3, 2023 · 12 tweets · 2 min read
Dis/misinformation research confronts a dilemma: in order to be viewed as legitimate and win broad public support, it must restrict its focus to extremely clear-cut cases such as fake news, bizarre conspiracy theories, and easily demonstrable falsehoods. 1/11

The problem is that such content is rare and not very consequential. Many people ignore it or engage with it for reasons independent of belief, and the small minority of the population who seek it out do so because of pre-existing traits, beliefs, and dispositions. 2/11
Sep 5, 2023 · 40 tweets · 7 min read
According to this thread, anyone who is sceptical that emotionality (the tendency to evoke emotions) is a fingerprint of misinformation is simply ignorant of the scientific consensus on the topic. I disagree and would like to explain why:

First, it's important to be clear about which claim is in dispute. If emotionality is a fingerprint of misinformation, misinformation must on average exhibit higher rates of emotionality than reliable information. That's a striking claim which, if true, has important implications.
Aug 28, 2023 · 5 tweets · 3 min read
In many discussions distinguishing misinformation from disinformation, it’s assumed there is a clean distinction between innocent mistakes and intentional deception. This is wrong: people often sincerely embrace the self-serving beliefs they are motivated to propagate to others.

That someone is pushing an unfounded but self-serving narrative does not mean they are *insincere*. Your sincere beliefs, including those you most strongly identify with, often result from deep-rooted psychological tendencies specialised for propaganda and impression management.
Jun 7, 2023 · 20 tweets · 6 min read
Is misinformation a dangerous virus? Are we living through an infodemic? Is there a vaccine for misinformation? In this review of Sander van der Linden's (@Sander_vdLinden) new book 'Foolproof', I argue that the answer to these questions is "no". A thread:
bostonreview.net/articles/the-f…

Before going into some objections, let me say that I recommend the book. It's extremely interesting, and it provides an exceptionally clear and informative overview of modern misinformation research. However, I do strongly disagree with its overall perspective.
May 10, 2023 · 4 tweets · 1 min read
What I find striking about Tucker Carlson is this: it was recently revealed from private text messages that he hates Trump and thinks of him as a “demonic force”. Despite this, he *served as one of Trump's most loyal and effective public propagandists for years*.

This information isn't difficult to find; a quick Google search would suffice. In ordinary life, if you discovered that someone (a friend, a colleague) had engaged in that level of extreme, psychopathic, self-interested deception, you would lose all trust in them.
May 10, 2023 · 6 tweets · 3 min read
Contra Naomi Klein, I think the simpler, more plausible explanation for why people use the term "hallucination" to describe the mistaken outputs of generative AI is that the outputs resemble hallucinations in some respects.
theguardian.com/commentisfree/…

And I think the more likely explanation for why some people are optimistic about AI is that they are persuaded by its potential. And I am sceptical that training AI on existing human knowledge constitutes "the most consequential theft in human history."
Sep 12, 2022 · 7 tweets · 2 min read
Lots of people seem to be really angry about this thread. (Some people liked the thread as well, but naturally my brain ignores that.) I stand by the thread, but I should have been clearer on the following: (1/7)

I *don’t* think most people are ignorant when it comes to answering these questions, or that only an elite minority knows (e.g.) how many moons our planet has. As I said (in fact, the *first* thing I said) in the thread: I think these answers are *obviously cherry-picked*. (2/7)
Sep 11, 2022 · 5 tweets · 2 min read
I *hate* videos like this, designed to mock ordinary people & make them look stupid. The most obvious point is that the interviews are cherry-picked from a much bigger sample but made to look as if they're representative of people’s responses. But setting that aside… (1/5)

First, being informed about pointless abstract facts about the world is a status symbol among the highly educated exam-takers who constitute society’s elite. Most people don’t care, and have far more important things going on in their lives than acquiring this knowledge. (2/5)
Sep 5, 2022 · 17 tweets · 5 min read
I wrote a short piece on why the recent panic surrounding misinformation is misguided and how the concept of rationalisation markets is superior for understanding many forms of epistemic dysfunction in political and cultural media. (1/17) blogs.lse.ac.uk/impactofsocial…

The basic idea: In recent years - especially since 2016 - there has been widespread alarm that the general public are being overwhelmed by exposure to false claims, which is leading them to make bad political and life decisions. (2/17)
Aug 1, 2022 · 4 tweets · 1 min read
Very interesting 2005 review of "The Economy of Esteem" by @tylercowen.
"The desire for esteem can induce us to give to charity, engage in path-breaking scientific research, join voluntary associations, or simply be a decent human being or parent."
journals.sagepub.com/doi/abs/10.117…

"Esteem-seeking may be a central reason why voluntary institutions supply public goods to the extent they do. General steps to magnify the importance of esteem therefore may have beneficial social consequences."
Mar 7, 2022 · 16 tweets · 3 min read
New open access paper: "The Marketplace of Rationalizations". The basic idea: Motivated reasoning is subject to a rationalization constraint: people can only convince themselves of beliefs for which they can find appropriate rationalizations. (1/16)

cambridge.org/core/journals/…

When preferences for similar beliefs are widespread, this constraint gives rise to *rationalization markets* in which agents compete to produce justifications of widely desired beliefs in exchange for money and social rewards such as attention and status. (2/16)
Jul 23, 2021 · 7 tweets · 1 min read
The social world is unimaginably complex, mostly unpredictable at even short timescales, and viewed through extremely limited information & motivational biases. It's impossible to reconcile these banalities with the typical level of confidence we have in our political beliefs.

So why the misalignment between the beliefs that we should hold if we were intellectually responsible and the beliefs that we actually hold?