In today’s #vatniksoup, I’ll discuss social media superspreaders. Because of their effectiveness, superspreader accounts are often used to spread "low-credibility" content, disinformation and propaganda, increasingly on behalf of hostile state actors such as Russia.
1/14
DeVerna et al. (2024) described superspreaders as "users who consistently disseminate a disproportionately large amount of low-credibility content," also known as bullshit. It’s worth noting that some of these people may actually believe the lies they spread.
2/14
The numbers behind these accounts are astonishing – a study by Grinberg et al. (2019) found that 0.1% of Twitter accounts were responsible for sharing approximately 80% of the mis/disinformation related to the 2016 US presidential election.
3/14
The same applies to COVID-19-related disinformation: just 12 accounts, which researchers dubbed the "Disinformation Dozen", produced 65% of the anti-vaccine content on Twitter. The most famous of this group is presidential candidate RFK Jr.:
4/14
These accounts are naturally amplified by troll and bot farms, which are often state-sponsored. Inorganic amplification can make content seem more attractive to regular people through massive numbers of likes and shares, a technique rooted in basic behavioral science.
5/14
When it comes to geopolitics and especially the situation in Ukraine, we can easily name a few of the most prominent superspreader accounts that have no interest in the truth: Jackson Hinkle, Kim Dotcom, Ian Miles Cheong, Alex Jones, Tucker Carlson and Russell Brand.
6/14
Another good way to spot superspreaders is to check the "Community Notes leaderboard" website, where Jackson Hinkle holds position #4, Cheong sits at #7, and Elon Musk himself can be found at #39.
7/14
Naturally, the platform’s owner also often comments on and shares content from these people & even engages in conversations with them on Spaces, because apparently he wants to be surrounded by conspiratorial "Yes Men" instead of doing tough interviews with people like Don Lemon.
8/14
Most superspreader accounts have very little interest in the truth, as the nature of social media encourages you to go for maximum engagement (likes, shares, comments). On X, this even affects your ad share revenue, basically allowing people to earn money through lies.
9/14
There are many examples of pro-Kremlin narratives being spread by these accounts. One of them is the lie that Zelenskyy "bought a mansion from King Charles". The story came from an AI-generated fake news blog and was spread by large accounts like Liz Churchill’s.
10/14
Another fake story, about the "US-funded Ukrainian bioweapons labs", that even made it to the mainstream was started by QAnon follower Jacob Creech AKA @WarClandestine, who later bragged about making money from X’s ad revenue sharing system.
11/14
Most of the content promoted and made up by these large accounts draws inspiration from various conspiracy theories like QAnon, PizzaGate, or The Great Reset. They also often share photos out of context, for example presenting photos from Syria as if they were from Gaza.
12/14
As I’ve stated many times before, there are no downsides to rage farming and spreading lies online, and after Elon took over it has actually become a viable monetization strategy that can make you relatively rich.
13/14
Hostile state actors have also figured out the potential of using superspreaders to amplify their false narratives. For example, Russia's embassy accounts often tag people like Jackson Hinkle in their posts, hoping they'd share the content with their large followings.
In today’s Vatnik Soup, I’ll explain the Alaska Fiasco and how it marks the peak of Trump’s two-year betrayal of Ukraine. What was sold as “peace talks” turned into a spectacle of weakness, humiliation, empty promises, and photo-ops that handed Putin exactly what he wanted.
1/24
Let’s start with the obvious: Trump desperately wants a Nobel Peace Prize, mainly because Obama got one. That’s why he’s now LARPing as a "peacemaker" in every conflict: Israel-Gaza, Azerbaijan-Armenia, India-Pakistan, and of course Ukraine-Russia.
2/24
Another theory is that Putin holds kompromat — compromising material such as videos or documents — that would put Trump in an extremely bad light. Some have suggested it could be tied to the Epstein files or Russia’s interference in the 2016 US presidential election.
In today’s Vatnik Soup, I’ll talk about engagement farming: a cynical social media tactic to rack up likes, shares, and comments. From rage farming to AI-powered outrage factories, engagement farming is reshaping online discourse and turning division into profit.
1/23
Engagement farming is a social media tactic aimed at getting maximum likes, shares, and comments, with truth being optional. It thrives on provocative texts, images, or videos designed to spark strong reactions, boost reach, and turn online outrage into clicks and cash.
2/23
One subset of engagement farming is rage farming: a tactic built to provoke strong negative emotions through outrageous or inflammatory claims. By triggering anger or moral outrage, these posts often generate 100s or even 1,000s of heated comments, amplifying their reach.
In today’s Vatnik Soup, I’ll cover the autocratic concept of “Good Tsar, Bad Boyars”: the idea that the leader is wise and just, but constantly sabotaged by corrupt advisors. This narrative shields the ruler from blame, and it’s used by both Putin and Trump today.
1/20
The phrase “Good Tsar, Bad Boyars” (Царь хороший, бояре плохие), also known as Naïve Monarchism, refers to a long-standing idea in Russian political culture: the ruler is good and benevolent, but his advisors are corrupt, incompetent and responsible for all failures.
2/20
Under this perception, any positive action by the government is credited to the benevolent leader, while any negative one is blamed on lower-level bureaucrats or “boyars” acting without the leader’s approval.
In today’s Vatnik Soup, I’ll introduce a Russian politician and First Deputy Chief of Staff of the Presidential Administration of Russia, Sergey Kiriyenko. He’s best known for running both domestic and foreign disinformation and propaganda operations for the Kremlin.
1/20
On paper, and in photos, Kiriyenko is just as boring as most of the Kremlin’s “political technologists”: from 2005 to 2016 he headed the Rosatom nuclear energy company, and later played a leading role in governing the Russian-occupied territories in Ukraine.
2/20
What is a political technologist? In Russia, they’re spin doctors & propaganda architects who shape opinion, control narratives, and manage elections — often by faking opposition, staging events, and spreading disinfo to maintain Putin’s power and the illusion of democracy.
Let me show you how a Pakistani (or Indian, they're usually the same) AI slop farm/scam operates. The account @designbonsay is a prime example: a relatively attractive, AI-generated profile picture and a ChatGPT-style profile description are the first red flags.
1/5
The profile's posts are just generic engagement farming, usually using AI-generated photos of celebrities or relatively attractive women.
These posts are often emotionally loaded and ask the user to interact with them ("like and share if you agree!").
2/5
Then there's the monetization part. This particular account sells "pencil art", which is again just AI-generated slop.
In today’s Vatnik Soup, I’ll introduce an American lawyer and politician, Mike Lee (@BasedMikeLee). He’s best known for opposing aid to Ukraine, undermining NATO by calling for the US to withdraw from the alliance, and for fighting with a bunch of braindead dogs online.
1/21
Like many of the most vile vatniks out there, “Based Mike” is a lawyer by profession. He hails from the holy land of Mormons, Utah, where he faces little political competition, allowing him to make the most outrageous claims online without risking his Senate seat.
2/21
Before becoming a senator, Mike fought to let a nuclear waste company dump Italian radioactive waste in Utah, arguing it was fine if they just diluted it. The state said no, the public revolted, and the courts told poor Mikey to sit down.