In today’s #vatniksoup, I’ll discuss social media superspreaders. Due to their reach and effectiveness, superspreader accounts are often used to spread "low-credibility" content, disinformation, and propaganda, and today this is increasingly done by hostile state actors such as Russia.
1/14
DeVerna et al. (2024) described superspreaders as "users who consistently disseminate a disproportionately large amount of low-credibility content," also known as bullshit. It’s worth noting that some of these people may actually believe the lies they spread.
2/14
The numbers behind these accounts are astonishing: a study by Grinberg et al. (2019) found that 0.1% of Twitter accounts were responsible for sharing approximately 80% of the mis/disinformation related to the 2016 US presidential election.
3/14
The same applies to COVID-19-related disinformation: just 12 accounts, which researchers dubbed the "disinformation dozen", produced 65% of the anti-vaccine content on Twitter. The most famous member of this group is presidential candidate RFK Jr.:
4/14
These accounts are frequently amplified by state-sponsored troll and bot farms. Inorganic amplification can make content seem more attractive to regular people through massive numbers of likes and shares, a technique based on basic behavioral science.
5/14
When it comes to geopolitics, and especially the situation in Ukraine, we can easily name a few of the most prominent superspreader accounts with no interest in the truth: Jackson Hinkle, Kim Dotcom, Ian Miles Cheong, Alex Jones, Tucker Carlson, and Russell Brand.
6/14
Another good way to spot superspreaders is to check the "Community Notes Leaderboard" website, where Jackson Hinkle sits at #4, Cheong at #7, and Elon Musk himself at #39.
7/14
Naturally, the platform’s owner also often comments on and shares content from these people, and even engages in conversations with them on Spaces, because apparently he wants to be surrounded by conspiratorial "Yes Men" instead of doing tough interviews with people like Don Lemon.
8/14
Most superspreader accounts have very little interest in the truth, as the nature of social media encourages you to go for maximum engagement (likes, shares, comments). On X, engagement even affects your ad revenue share, basically allowing people to earn money through lies.
9/14
There are many examples of pro-Kremlin narratives being spread by these accounts. One of them is the lie that Zelenskyy "bought a mansion from King Charles". The story originated from an AI-generated fake news blog and was spread by large accounts like Liz Churchill’s.
10/14
Another fake story, about "US-funded Ukrainian bioweapons labs", that even made it to the mainstream was started by QAnon follower Jacob Creech, AKA @WarClandestine, who later bragged about making money from X’s ad revenue sharing program.
11/14
Most of the content promoted and made up by these large accounts draws inspiration from various conspiracy theories like QAnon, PizzaGate, or the Great Reset. They also often share photos out of context, for example presenting photos from Syria as being from Gaza.
12/14
As I’ve stated many times before, there are no downsides to rage farming and spreading lies online, and after Elon took over X, it has actually become a viable monetization strategy that can make you relatively rich.
13/14
Hostile state actors have also figured out the potential of using superspreaders to amplify their false narratives. For example, Russia’s embassy accounts often tag people like Jackson Hinkle in their posts, hoping they’ll share the content with their large followings.
14/14
In today’s Vatnik Soup, I’ll cover the agenda-setting and flood of disinformation that spread on X and other platforms right after Charlie Kirk’s assassination. It’s far from the first or last time a tragedy has been weaponized for political purposes.
1/18
Every major political event, especially those involving violence, attracts massive attention. In the immediate aftermath, reliable information is scarce, making it highly vulnerable to both coordinated and improvised disinformation campaigns.
2/18
As I’ve mentioned in my previous soups and lectures, in disinformation campaigns, being first with a narrative is crucial, as people often remember the first version best: psychology studies show it sets the mental schema, and later updates rarely overwrite it.
3/18
In today’s Vatnik Soup, I’ll introduce American social media personality David Freeman, AKA Gunther Eagleman™ (@GuntherEagleman). He’s best known for spreading political disinformation on X and shamelessly sucking up to Trump, Putin, and other authoritarian leaders.
1/22
David is a textbook example of someone profiting from MAGA grievance politics. He uses extreme, provocative language to farm engagement on X and never hesitates to flatter anyone who might give him more exposure — or money.
2/22
But David wasn’t always like this. In his mid-40s, he even tried a real job: he trained to become a cop. He spent three years with the Metro Transit PD, but after that he either got fired or quit, and never looked back.
3/22
In today’s Vatnik Soup, I’ll introduce a Russian-Estonian businessman, Oleg Ossinovski. He is best-known for his deep ties to Russian rail and energy networks, shady cross-border dealings, and for channeling his wealth into Estonian politics.
1/14
Oleg made his fortune via Spacecom Trans & Skinest Rail, both deeply tied to Russia’s rail system. Most of this is through Globaltrans Investments PLC, a Cyprus-based firm with 62% held via Spacecom and tens of millions in yearly profits.
2/14
Ossinovski’s Russian-linked ventures made him Estonia’s richest man in 2014, with an estimated fortune of ~€300M. His business empire stretched across railways, oil via Alexela shares, and Russian bitumen imports from Help-Oil, a supplier to the Defense Ministry.
3/14
In today’s Vatnik Soup, I’ll introduce a Swiss/French writer, Alain Bonnet, aka Alain Soral (@officielsoral). He’s best known for his rabid antisemitism and for his pathetic support for all the worst authoritarian regimes from Russia to North Korea.
1/22
Alain’s childhood was problematic, as his father has been characterized as a “narcissistic pervert” who beat his children and did jail time for fraud. Alain himself has said he was “programmed to be a monster.” Born Alain Bonnet, he took the stage name of his sister,…
2/22
… actress Agnès Soral. She wasn’t too happy about this, commenting: "How would you like to be called Agnès Hitler?" Like many grifters, he became a pick-up/seduction writer, à la the late Gonzalo Lira, writing books and even making a B-movie, "Confessions d’un dragueur".
3/22
In today’s Vatnik Soup, I’ll explain the Alaska Fiasco and how it marks the peak of Trump’s two-year betrayal of Ukraine. What was sold as “peace talks” turned into a spectacle of weakness, humiliation, empty promises, and photo-ops that handed Putin exactly what he wanted.
1/24
Let’s start with the obvious: Trump desperately wants the gold medal of the Nobel Peace Prize, mainly because Obama got one. That’s why he’s now LARPing as a "peacemaker" in every conflict: Israel-Gaza, Azerbaijan-Armenia, India-Pakistan, and of course Ukraine-Russia.
2/24
Another theory is that Putin holds kompromat — compromising material such as videos or documents — that would put Trump in an extremely bad light. Some have suggested it could be tied to the Epstein files or Russia’s interference in the 2016 US presidential election.
3/24
In today’s Vatnik Soup, I’ll talk about engagement farming: a cynical social media tactic to rack up likes, shares, and comments. From rage farming to AI-powered outrage factories, engagement farming is reshaping online discourse and turning division into profit.
1/23
Engagement farming is a social media tactic aimed at getting maximum likes, shares, and comments, with truth being optional. It thrives on provocative texts, images, or videos designed to spark strong reactions, boost reach, and turn online outrage into clicks and cash.
2/23
One subset of engagement farming is rage farming: a tactic built to provoke strong negative emotions through outrageous or inflammatory claims. By triggering anger or moral outrage, these posts often generate hundreds or even thousands of heated comments, amplifying their reach.
3/23