What did Russia’s digital propaganda look like during the invasion of Ukraine?

This isn’t a guess.

We now have data. A lot of it.

A 2023 study analyzed 368,000 pro-Russian tweets posted during the first six months of the war.
What they found is both familiar — and alarming.

#UkraineDisinfo7
The study: “Russian propaganda on social media during the 2022 invasion of Ukraine”

🔹 Timeframe: Feb–July 2022
🔹 Platform: Twitter
🔹 Accounts: 140,000+
🔹 Messages: ~368,000 pro-Russian tweets

arxiv.org/abs/2211.04154
Researchers used specific hashtags to collect pro-Russian content:

Examples:
🔸 #IStandWithRussia
🔸 #StandWithPutin
🔸 #DonbassWar
🔸 #Biolabs
🔸 #ZelenskyWarCriminal

These were often pushed to trend — even outside of Russia.
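The collection step above can be sketched in a few lines. This is a hypothetical illustration, not the study's actual pipeline: the tweet structure and the exact hashtag list are assumptions for the example.

```python
# Hypothetical sketch of hashtag-based collection like the study describes.
# The tweet dict structure and hashtag set are illustrative assumptions.

PRO_RUSSIAN_HASHTAGS = {
    "istandwithrussia", "standwithputin", "donbasswar",
    "biolabs", "zelenskywarcriminal",
}

def is_pro_russian(tweet: dict) -> bool:
    """Flag a tweet if it carries at least one tracked hashtag."""
    tags = {h.lower() for h in tweet.get("hashtags", [])}
    return bool(tags & PRO_RUSSIAN_HASHTAGS)

tweets = [
    {"id": 1, "hashtags": ["IStandWithRussia", "Donbass"]},
    {"id": 2, "hashtags": ["StandWithUkraine"]},
]
flagged = [t["id"] for t in tweets if is_pro_russian(t)]
print(flagged)  # [1]
```

Crude keyword filters like this over-collect (sarcastic or critical uses of a hashtag get swept in too), which is why studies pair them with manual validation.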
Scope of activity? Massive.

➡️ 368,000 tweets
➡️ ~251,000 retweets
➡️ Estimated reach: 14.4 million users

That’s not fringe. That’s engineered visibility.
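Reach figures like the 14.4 million above are typically estimated from follower counts. A minimal sketch of one common approximation (the paper's exact method may differ; the data here is made up):

```python
# Sketch of a follower-based reach estimate: each author's followers are
# counted once, however many flagged tweets they posted. This is an
# upper-bound proxy, not the study's exact method. Data is illustrative.

def estimated_reach(posts: list[dict]) -> int:
    """Sum follower counts over unique authors of flagged posts."""
    followers_by_author = {}
    for p in posts:
        followers_by_author[p["author_id"]] = p["author_followers"]
    return sum(followers_by_author.values())

posts = [
    {"author_id": "a", "author_followers": 1200},
    {"author_id": "b", "author_followers": 300},
    {"author_id": "a", "author_followers": 1200},  # same author, counted once
]
print(estimated_reach(posts))  # 1500
```

It is an *exposure ceiling*: followers who never saw the tweet are still counted, so actual views are lower.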
The game-changer? Bots.

The study found that 20.3% of the accounts were likely bots.

That’s tens of thousands of inauthentic accounts — created to amplify, distort, and deceive.

And they were fast.
Many of these bot accounts were created right at the start of the invasion — February 2022.

That’s not organic.
That’s preparation.

A digital surge aligned with a physical invasion.
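Spotting that kind of registration surge is simple once you have account metadata: bucket creation dates by month and look for the spike. A sketch with made-up accounts (the study observed the real spike in February 2022):

```python
# Sketch: bucket account creation dates by month to spot a registration
# surge. The accounts below are illustrative, not real data.
from collections import Counter
from datetime import date

def creation_surge(created: list[date]) -> Counter:
    """Count accounts created per (year, month)."""
    return Counter((d.year, d.month) for d in created)

accounts = [
    date(2021, 11, 3),
    date(2022, 2, 24), date(2022, 2, 25), date(2022, 2, 26),
    date(2022, 3, 1),
]
counts = creation_surge(accounts)
print(counts.most_common(1))  # [((2022, 2), 3)]
```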
These bots weren’t designed to debate.

They had one job: amplify.

🔹 Retweet each other
🔹 Boost visibility of hashtags
🔹 Drive traffic to pro-Russian narratives
🔹 Make fringe ideas look “popular”

That’s influence laundering.
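One simple signal of that "retweet each other" behavior is reciprocity: account pairs that amplify each other. A toy sketch, assuming a list of (retweeter, original_author) edges (illustrative data, not from the study):

```python
# Sketch: flag account pairs that retweet each other, a basic signal of
# mutual amplification. Edges are (retweeter, original_author) pairs;
# the data is illustrative.

def mutual_pairs(edges: list[tuple[str, str]]) -> set:
    """Return unordered pairs {a, b} where a retweeted b AND b retweeted a."""
    seen = set(edges)
    return {frozenset((a, b)) for (a, b) in seen if (b, a) in seen and a != b}

edges = [("bot1", "bot2"), ("bot2", "bot1"), ("user3", "bot1")]
print(mutual_pairs(edges))  # {frozenset({'bot1', 'bot2'})}
```

Real coordination detection goes further (shared timing, identical text, co-retweet networks), but reciprocal amplification is the intuition behind it.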
The bots created the illusion of consensus.

A trick that works on:

✔ Platform algorithms
✔ Casual users
✔ Journalists scanning trends
✔ Public sentiment in countries on the fence

Perception shapes reality.
One of the most revealing findings?

Bots were heavily active in India, South Africa, and Pakistan —

Countries that abstained from condemning Russia at the UN in March 2022.

That’s targeted influence.
Activity surged during the UN General Assembly vote.
→ March 2–4, 2022

This wasn’t random spam.

It was geopolitical manipulation in real time, using Twitter to shape how countries saw the war.
Bots also pushed themes tailored to the Global South:

🔸 “Western hypocrisy”
🔸 NATO provocation
🔸 Biolabs conspiracies
🔸 Civilian casualties (blamed on Ukraine)
🔸 Anti-colonial rhetoric

Strategic narrative tailoring.
Pro-Russian accounts had fewer followers, fewer replies, and lower engagement than pro-Ukraine ones.

But with bot armies?

You don’t need organic support.

You just need to look loud enough to trend.
This mirrors what Russia did in Western elections in 2016, 2018, and 2020.

But this time it wasn’t just about elections.

It was about justifying invasion.

And doing it in front of a global audience.
So what does this tell us?

Russia uses Twitter not to win arguments — but to:
✔ Drown out critics
✔ Seed doubt
✔ Confuse fence-sitters
✔ Make their position look mainstream

It’s disinfo by saturation.
This is also a lesson in how platforms become proxies.

When states go to war, so do their narratives.

And platforms like Twitter — with weak moderation and amplification mechanics — become weapons.
And while this study ends in July 2022…

These tactics haven’t stopped.
They’ve just evolved.

With AI-generated content, deeper mimicry, and cross-platform seeding now in play.

This was the proof-of-concept phase.
Why this matters:

Because the lie doesn’t need to be believed.

It just needs to trend.

To seed enough confusion that truth becomes “opinion.”

And aggression becomes “debate.”
The data tells the story:

➡️ Bots surged with the tanks
➡️ Hashtags were weaponized
➡️ Influence targeted swing states
➡️ Twitter was terrain in the invasion

This is not just propaganda.

It’s strategy.
We’ll never win this fight with facts alone.

We need:
🔹 Faster detection
🔹 Transparent algorithms
🔹 Public education
🔹 Narrative fluency
🔹 Civic resistance

Because perception is battlefield space now.
This thread was based on Geissler et al. (2023):


One of the best forensic snapshots of how Russia uses bots and hashtags to support a war.

And a preview of what’s coming next.

#UkraineDisinfo7 #UkraineDisinfo
arxiv.org/abs/2211.04154
Want the human story behind the data?

We broke it down on the Forum — how this bot-fueled war played out on your feed, and why it matters for the future of democracy and conflict.

Read more:


#UkraineDisinfo7 #UkraineDisinfo
nafoforum.org/magazine/postm…
