In today’s #vatniksoup, I’ll discuss social media superspreaders. Due to their effectiveness, superspreader accounts are often used to spread "low-credibility" content, disinformation, and propaganda, and today this is increasingly done by hostile state actors such as Russia.
1/14
DeVerna et al. (2024) described superspreaders as "users who consistently disseminate a disproportionately large amount of low-credibility content," also known as bullshit. It’s worth noting that some of these people may actually believe the lies they spread.
2/14
The numbers behind these accounts are astonishing – a study by Grinberg et al. (2019) found that just 0.1% of Twitter accounts were responsible for sharing approximately 80% of the mis/disinformation related to the 2016 US presidential election.
3/14
The same applies to COVID-19 related disinformation: just 12 accounts, which researchers dubbed the "Disinformation Dozen", produced 65% of the anti-vaccine content on Twitter. The most famous member of this group is presidential candidate RFK Jr.:
4/14
These accounts are naturally amplified by troll and bot farms, which are often state-sponsored. Inorganic amplification can make content seem more appealing to regular users through massive numbers of likes and shares, a technique rooted in basic behavioral science.
5/14
When it comes to geopolitics and especially the situation in Ukraine, we can easily name a few of the most prominent superspreader accounts who have no interest in the truth: Jackson Hinkle, Kim Dotcom, Ian Miles Cheong, Alex Jones, Tucker Carlson and Russell Brand.
6/14
Another good way to spot superspreaders is to check the "Community Notes Leaderboard" website, where Jackson Hinkle holds position number 4, Cheong sits at number 7, and Elon Musk himself can be found at spot #39.
7/14
Naturally, the platform’s owner also often comments on and shares content from these people, and even engages in conversations with them on Spaces, because apparently he wants to be surrounded by conspiratorial "Yes Men" instead of doing tough interviews with people like Don Lemon.
8/14
Most superspreader accounts have very little interest in the truth, as the nature of social media encourages you to go for maximum engagement (likes, shares, comments). On X, this even affects your ad share revenue, basically allowing people to earn money through lies.
9/14
There are many examples of pro-Kremlin narratives being spread by these accounts. One of them is the lie that Zelenskyy "bought a mansion from King Charles". The story came from an AI-generated fake news blog and was spread by large accounts like Liz Churchill’s.
10/14
Another fake story, about "US-funded Ukrainian bioweapons labs", that even made it to the mainstream was started by QAnon follower Jacob Creech AKA @WarClandestine, who later bragged about making money from X’s ad revenue sharing program.
11/14
Most of the content promoted and made up by these large accounts draws inspiration from various conspiracy theories like QAnon, PizzaGate, or The Great Reset. They also often share photos out of context, for example presenting photos from Syria as if they were from Gaza.
12/14
As I’ve stated many times before, there are no downsides to rage farming and spreading lies online, and after Elon took over it has actually become a viable monetization strategy that can make you relatively rich.
13/14
Hostile state actors have also figured out the potential of using superspreaders to amplify their false narratives. For example, Russia's embassy accounts often tag people like Jackson Hinkle in their posts, hoping they'll share the content with their large followings.
14/14
In today’s Wumao Soup, I’ll explain how and where the Chinese Communist Party’s (CCP) online propaganda and influence operations work. Due to China’s massive population and advances in AI, CCP-aligned online content has become increasingly visible.
1/20
Like Russia’s troll farms, China has its own troll army: the “50 Cent Party” or “Wumao” refers to state-linked online commentators who are reportedly paid ¥0.50 per post to steer discussions away from criticism and amplify CCP narratives on social media.
2/20
Back in 2017, a research paper estimated that the Wumao produced almost 500 million fabricated comments annually to distract readers and shift topics. In that sense, Wumao operates very similarly to the Russian “Firehose of Falsehood” model:
In today’s Vatnik Soup and the “Degenerate Russia” series, I’ll show you the brutal reality of Russian war crimes, in particular the horrific tortures and sexual abuses of children, women and men.
Buckle up, this one is not for the faint-hearted.
1/24
For over a decade now, as part of their “firehose of falsehood” propaganda strategy, Russia has been spreading false narratives targeted at right-wing/conservative audiences, portraying Russia as a bastion of Christian, traditional, family values.
In the previous “Degenerate Russia” series we discussed Russia’s insanely high divorce rates, rampant domestic violence, high murder rates, thriving neo-Nazi culture, corruption of the Orthodox Church, and their massive demographic problem:
In today’s Vatnik Soup, I’ll explore how Russia is working with Iran, and how the recent Israel–US strikes on Iran could affect the war in Ukraine. Iran has been one of Russia’s key allies in their genocidal war, but in reality the partnership is deeply one-sided.
1/21
Historically, Russia/USSR has been involved in numerous wars in the Middle East, invading Afghanistan for nearly a decade and desperately trying to keep Syria’s authoritarian leader, al-Assad, in power before his eventual downfall.
2/21
While initially supportive of Israel, the Soviet Union quickly pivoted to backing its enemies, fueling antisemitism, terrorism, and chaos in an already tense region. At times, this meant near-open war, like when Soviet Air Force MiG-21s were shot down by Israel over Egypt.
In today’s Vatnik Soup REBREW, I’ll re-introduce a Latvian politician and former MEP, Tatjana Ždanoka. She’s best-known for her history in the Communist Party of Latvia, for her pro-Russian politics in the country, and her connections to Russian intelligence.
1/22
Based on Ždanoka’s speeches and social media posts, she has a deep hatred towards the people of Latvia. The reasons for this can only be speculated on, but part of it could be due to her paternal family being killed by the Latvian Auxiliary Police,…
2/22
…a paramilitary force supported by the Nazis, during the early 1940s. Ždanoka became politically active in the late 80s. She was one of the leaders of Interfront, a political party that supported Latvia remaining part of the USSR.
In today’s Vatnik Soup, I’ll introduce the main themes of Russian disinformation on TikTok. Each day, there are thousands of new videos promoting pro-Kremlin narratives and propaganda.
It’s worth noting that Russians can only access European TikTok via VPN.
1/10
There is currently a massive TikTok campaign aimed at promoting a positive image of Russia. The videos typically feature relatively attractive young women and focus on themes of nationalism and cultural heritage.
2/10
Ironically, many of these videos from Moscow or St. Petersburg are deceptively edited to portray Ukraine in a false light — claiming there is no war and that international aid is being funneled to corrupt elites.
In today’s Vatnik Soup, I’ll talk about Finland and how pro-Kremlin propagandists have become more active in the Finnish political space since Russia launched its full-scale invasion of Ukraine. For the first time since 2022, they’ve gained some political power in Finland.
1/16
Russia’s political strategy in countries with Russian-speaking minorities (such as Finland and the Baltics) is typically quite similar: it seeks to rally these minorities around issues like language and minority rights, and then frames the situation as oppression.
2/16
At the same time, Russian speakers are extremely wary and skeptical of local media, and instead tend to follow Russian domestic outlets like Russia-1 and NTV, thereby reinforcing an almost impenetrable information bubble.