What did Russia’s digital propaganda look like during the invasion of Ukraine?

This isn’t a guess.

We now have data. A lot of it.

A 2023 study analyzed 368,000 pro-Russian tweets posted during the first 6 months of the war.
What they found is both familiar — and alarming.

#UkraineDisinfo7
The study: “Russian propaganda on social media during the 2022 invasion of Ukraine”

🔹 Timeframe: Feb–July 2022
🔹 Platform: Twitter
🔹 Accounts: 140,000+
🔹 Messages: ~368,000 pro-Russian tweets

arxiv.org/abs/2211.04154
Researchers used specific hashtags to collect pro-Russian content:

Examples:
🔸 #IStandWithRussia
🔸 #StandWithPutin
🔸 #DonbassWar
🔸 #Biolabs
🔸 #ZelenskyWarCriminal

These were often pushed to trend — even outside of Russia.
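The collection step above can be sketched in a few lines. This is an illustrative reconstruction, not the study's actual code — the function name, tweet shape, and sample data are all hypothetical; the study simply kept tweets carrying one of its pro-Russian seed hashtags.

```python
# Hypothetical sketch of hashtag-based collection: keep any tweet whose
# hashtags intersect a pro-Russian seed set (names/data are illustrative).

SEED_HASHTAGS = {
    "istandwithrussia", "standwithputin",
    "donbasswar", "biolabs", "zelenskywarcriminal",
}

def is_pro_russian(tweet: dict) -> bool:
    """Return True if the tweet carries at least one seed hashtag."""
    tags = {t.lower() for t in tweet.get("hashtags", [])}
    return bool(tags & SEED_HASHTAGS)

# Toy input, not real data:
tweets = [
    {"id": 1, "hashtags": ["IStandWithRussia", "news"]},
    {"id": 2, "hashtags": ["StandWithUkraine"]},
]
matched = [t["id"] for t in tweets if is_pro_russian(t)]
# matched == [1]
```

Case-insensitive matching matters here: campaign hashtags circulate in many capitalizations, and a case-sensitive filter would undercount.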
Scope of activity? Massive.

➡️ 368,000 tweets
➡️ ~251,000 retweets
➡️ Estimated reach: 14.4 million users

That’s not fringe. That’s engineered visibility.
The game-changer? Bots.

The study found that 20.3% of the accounts were likely bots.

That’s tens of thousands of inauthentic accounts — created to amplify, distort, and deceive.

And they were fast.
Many of these bot accounts were created right at the start of the invasion — February 2022.

That’s not organic.
That’s preparation.

A digital surge aligned with a physical invasion.
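That kind of finding comes from a simple forensic check: bucket account-creation dates by month and look for spikes. A minimal sketch, assuming hypothetical data and a made-up threshold — not the study's method verbatim:

```python
# Illustrative sketch (not the study's code): count sign-ups per month
# and flag months far above the median -- the pattern behind "many bot
# accounts created right at the start of the invasion".
from collections import Counter
from datetime import date

def creation_surge(created: list[date], factor: float = 3.0) -> list[str]:
    """Return months whose sign-up count exceeds `factor` x the median month."""
    months = Counter(d.strftime("%Y-%m") for d in created)
    counts = sorted(months.values())
    median = counts[len(counts) // 2]
    return [m for m, n in months.items() if n > factor * median]

# Toy data with an invasion-day spike:
accounts = (
    [date(2021, 11, 5)] * 4 + [date(2021, 12, 9)] * 5 +
    [date(2022, 1, 12)] * 4 + [date(2022, 2, 24)] * 40
)
print(creation_surge(accounts))  # ['2022-02']
```

Organic sign-ups drift; coordinated campaigns cluster. A month that dwarfs its neighbors is the tell.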
These bots weren’t designed to debate.

They had one job: amplify.

🔹 Retweet each other
🔹 Boost visibility of hashtags
🔹 Drive traffic to pro-Russian narratives
🔹 Make fringe ideas look “popular”

That’s influence laundering.
The bots created the illusion of consensus.

A trick that works on:

✔ Platform algorithms
✔ Casual users
✔ Journalists scanning trends
✔ Public sentiment in countries on the fence

Perception shapes reality.
One of the most revealing findings?

Bots were heavily active in India, South Africa, and Pakistan —

countries that abstained from condemning Russia at the UN in March 2022.

That’s targeted influence.
Activity surged during the UN General Assembly vote.
→ March 2–4, 2022

This wasn’t random spam.

It was geopolitical manipulation in real time, using Twitter to shape how countries saw the war.
Bots also amplified themes meant to appeal to the Global South:

🔸 “Western hypocrisy”
🔸 NATO provocation
🔸 Biolabs conspiracies
🔸 Civilian casualties (blamed on Ukraine)
🔸 Anti-colonial rhetoric

Strategic narrative tailoring.
Pro-Russian accounts had fewer followers, fewer replies, and lower engagement compared to pro-Ukraine ones.

But with bot armies?

You don’t need organic support.

You just need to look loud enough to trend.
This mirrors what Russia did in the 2016, 2018, and 2020 elections in the West.

But this time it wasn’t just about elections.

It was about justifying invasion.

And doing it in front of a global audience.
So what does this tell us?

Russia uses Twitter not to win arguments — but to:
✔ Drown out critics
✔ Seed doubt
✔ Confuse fence-sitters
✔ Make their position look mainstream

It’s disinfo by saturation.
This is also a lesson in how platforms become proxies.

When states go to war, so do their narratives.

And platforms like Twitter — with weak moderation and amplification mechanics — become weapons.
And while this study ends in July 2022…

These tactics haven’t stopped.
They’ve just evolved.

With AI-generated content, deeper mimicry, and cross-platform seeding now in play.

This was the proof-of-concept phase.
Why this matters:

Because the lie doesn’t need to be believed.

It just needs to trend.

To seed enough confusion that truth becomes “opinion.”

And aggression becomes “debate.”
The data tells the story:

➡️ Bots surged with the tanks
➡️ Hashtags were weaponized
➡️ Influence targeted swing states
➡️ Twitter was terrain in the invasion

This is not just propaganda.

It’s strategy.
We’ll never win this fight with facts alone.

We need:
🔹 Faster detection
🔹 Transparent algorithms
🔹 Public education
🔹 Narrative fluency
🔹 Civic resistance

Because perception is battlefield space now.
This thread was based on Geissler et al. (2023):


One of the best forensic snapshots of how Russia uses bots and hashtags to support a war.

And a preview of what’s coming next.

#UkraineDisinfo7 #UkraineDisinfo
arxiv.org/abs/2211.04154
Want the human story behind the data?

We broke it down on the Forum — how this bot-fueled war played out on your feed, and why it matters for the future of democracy and conflict.

Read more:


#UkraineDisinfo7 #UkraineDisinfo
nafoforum.org/magazine/postm…
