In today's #vatniksoup, I'll be talking about the Russian style of online propaganda and disinformation, "Firehose of Falsehood". It's a commonly used Kremlin strategy for Russian information operations, which often prioritizes quantity over quality.
1/23
I have mentioned this particular strategy in many of my previous soups, but have never discussed it in more detail, so here goes. The term was originally coined by Paul & Matthews in their 2016 paper, The Russian "Firehose of Falsehood" Propaganda Model.
2/23
They based this name on two distinctive features: 1) a high-volume, multi-channel approach, and 2) a shameless willingness to spread disinformation.
Academic Giorgio Bertolin described Russian disinformation as entertaining, confusing and overwhelming.
3/23
The high volume, multi-channel approach means that these operatives attempt to control the narrative on each major social media platform. Russia has conducted, and is conducting, these operations on Facebook, Twitter, TikTok, Telegram, VKontakte, YouTube, and even on Tinder.
4/23
The volume of these operations shouldn't be underestimated: already back in 2015, more than 1,000 paid trolls worked at Yevgeny Prigozhin's Internet Research Agency (IRA), the best-known troll farm in Russia, and each commentator had a daily quota of 100 comments.
5/23
These numbers have probably gone up a LOT since then, and many more countries are using troll farms to conduct political campaigns or to spread propaganda.
One of the most famous cases of social media manipulation was the influence campaign around the Khashoggi murder.
6/23
These trolls work in shifts, and the work goes on around the clock, every day. A better description of these sweatshops would be troll factories, since they have turned trolling into an assembly line of propaganda and disinformation.
7/23
The high volume is accompanied by a willingness to spread disinformation. Russia often utilizes the "throwing shit at the wall to see what sticks" strategy, pushing out hundreds of contradictory and false narratives just to see if any of them start gaining traction.
8/23
Some examples of forgotten narratives include Zelenskyy leaving Kyiv after the invasion started, a secret NATO base in Mariupol, Poles trying to blow up a chlorine tank, birds as bioweapons, combat mosquitoes, the use of a dirty bomb, and Ukrainian Satan worship.
9/23
Troll farms also often "borrow" ideas and narratives from conspiracy theorists. One example of this was the "bioweapons lab" theory started by a QAnon follower, Jacob Creech. The narrative was spread by the Kremlin and by people like Tucker Carlson and Steve Bannon.
10/23
There is also no commitment to any kind of consistency, and these narratives can naturally be contradictory - as I mentioned, the goal is not to persuade but to confuse and overwhelm.
11/23
A lot of the "argumentation" from these trolls relies on anecdotal evidence or faked sources. A good example of this is the "Ukrainian Nazis" replies that flood discussions with anecdotal image collages of Ukrainians waving Nazi flags or sporting Nazi tattoos.
12/23
The firehose also often utilizes unsourced and out-of-context materials. Using (fake) imagery is an effective way to provoke strong reactions and emotions. Sometimes Russia produces false flag videos, but it has done so less after several of these videos were geolocated to Russia.
13/23
This strategy works extremely well in so-called low-trust environments, meaning countries or societies where trust in politicians, journalists, and authorities is relatively low. Naturally, the effective use of this method degrades that trust even further.
14/23
The sheer number of messages and comments drowns out any competing arguments or viewpoints, and it also often makes fact-checking obsolete - by the time a piece of information has been debunked, the topic has already changed many, many times.
15/23
And this is exactly why #NAFO has been so effective against this particular strategy: it counters the strategy with similar measures. High-volume, nonsensical replies from braindead cartoon dogs...
16/23
...shut down the firehose of falsehood extremely well, and as a bonus ridicule the main sources of pro-Russian narratives, including the country's ex-president and its embassy and diplomat accounts.
17/23
Like most production, propaganda has been outsourced to cheaper locations. These days many of these troll farms have moved from places like Russia and Macedonia to various African countries, including Nigeria and Ghana.
18/23
China has utilized the firehose in its own propaganda, and its most famous troll farm is the 50 Cent Army. The biggest difference between Russian and Chinese operations was that the Chinese focused on national networks, mostly neglecting the online world outside of China.
19/23
Russia also focuses more on bashing and blaming others, whereas China focuses on praising the CCP. After seeing Russia's success in its info ops, though, China has also started using more aggressive strategies against its rivals, especially the US.
20/23
According to the BBC, Russian and Chinese propaganda accounts are "thriving" on Twitter after @elonmusk sacked the team that was countering them. Allegedly, the current system relies entirely on automated detection.
21/23
@DarrenLinvill, an associate professor at Clemson University, said that one of these networks appears to originate from the IRA. His team has also identified a troll farm from the opposite camp, with tweets supporting Ukraine and Alexei Navalny.
22/23
Before Musk took over the site, Twitter was relatively effective at removing troll farm accounts, but one can only assume that this is no longer the case.
As is tradition, social media giants prioritize profits over safety.
In today’s Vatnik Soup, I’ll introduce a Russian politician and First Deputy Chief of Staff of the Presidential Administration of Russia, Sergey Kiriyenko. He’s best known for running both domestic and foreign disinformation and propaganda operations for the Kremlin.
1/20
On paper, and in photos, Kiriyenko is just as boring as most of the Kremlin’s “political technologists”: from 2005 to 2016 he headed the Rosatom nuclear energy company, but he later played a leading role in the governance of Russian-occupied territories in Ukraine.
2/20
What is a political technologist? In Russia, they’re spin doctors & propaganda architects who shape opinion, control narratives, and manage elections — often by faking opposition, staging events, and spreading disinfo to maintain Putin’s power and the illusion of democracy.
Let me show you how a Pakistani (or Indian, they're usually the same) AI slop farm/scam operates. The account @designbonsay is a prime example: a relatively attractive, AI-generated profile picture and a ChatGPT-style profile description are the first red flags.
1/5
The profile's posts are just generic engagement farming, usually using AI-generated photos of celebrities or relatively attractive women.
These posts are often emotionally loaded and ask the user to interact with them ("like and share if you agree!").
2/5
Then there's the monetization part. This particular account sells "pencil art", which again is just AI-generated slop.
In today’s Vatnik Soup, I’ll introduce an American lawyer and politician, Mike Lee (@BasedMikeLee). He’s best known for opposing aid to Ukraine, undermining NATO by calling for the US to withdraw from the alliance, and for fighting with a bunch of braindead dogs online.
1/21
Like many of the most vile vatniks out there, “Based Mike” is a lawyer by profession. He hails from the holy land of Mormons, Utah, where he faces little political competition, allowing him to make the most outrageous claims online without risking his Senate seat.
2/21
Before becoming a senator, Mike fought to let a nuclear waste company dump Italian radioactive waste in Utah, arguing it was fine if they just diluted it. The state said no, the public revolted, and the courts told poor Mikey to sit down.
In today’s Vatnik Soup, I’ll introduce an American national security policy professional and the current under secretary of defense for policy, Elbridge Colby (@ElbridgeColby). He’s best-known for fighting with cartoon dogs online and for halting military aid to Ukraine.
1/21
Elbridge "Cheese" Colby earned his bachelor’s degree from Yale and a Juris Doctor from Harvard Law School. Before entering government, he worked at top think tanks and in the intelligence community, focusing on nuclear policy and strategic planning.
2/21
Cheese quickly became a key voice for a “China First” strategy, arguing the US must prioritize military buildup in Asia over commitments in Europe or the Middle East. He sees (or saw, rather) Taiwan as the core test of US credibility.
In today’s Vatnik Soup, I’m going to talk about… Vatnik Soup! As some of you know, we also have a website where you can find every soup ever published. The site also has other useful resources, making it the most comprehensive resource on Russian disinformation & vatniks.
1/15
Unfortunately, Elon has flagged the website as malware, as he might not be very happy about the soups I wrote about him - so far, they have garnered over 60 million views on X/Twitter.
The “freedom of speech” spokesperson doesn’t seem too keen on free speech, after all.
2/15
The heart & soul of the website is of course the soups page. There you can find all 360+ soups, which can be sorted chronologically, by popularity, etc. You can also search for soups by title or even in the soup text: