Gavril Ducu 🇷🇴🇩🇪🇺🇲🇳🇱🇪🇺@🇺🇦
Born by the KGB, raised by the CIA. Mindreader, digital ventriloquist. #fella #WeAreNAFO. Heavy Bonker Award 🏅 ⚡ Every coffee helps #Edumacation and the @NAFOforum 👇
Jun 3 11 tweets 2 min read
What is cognitive warfare?

It’s not just disinfo or trolling.

It’s a campaign to break your will—without firing a shot.

Based on Taiwan’s 2024 military doctrine:



#CognitiveWarfare #InformationDefense #TikTokOps

media.defense.gov/2024/May/07/20…

Cognitive warfare targets how you think.

What you believe.
What you fear.
What you’re willing to defend.

Its goal isn’t to persuade—it’s to disorient

To make you feel like nothing matters

So when pressure comes, you fold
Jun 2 10 tweets 2 min read
How can democracies push back against cognitive warfare—without becoming authoritarian?

TikTok’s not going away.
Disinfo will only get smarter.
So what can we do?

This thread outlines real-world countermeasures from the report👇

#DigitalDefense #MediaLiteracy #CognitiveWarfare

The paper outlines a strategic response rooted in radical transparency and collaborative defense.

Core idea:

If we can’t stop the attack vectors, we must expose and defuse them—with speed, openness, and public trust.
Jun 2 10 tweets 2 min read
How is Russia’s GRU using AI to scale disinformation like never before?

We’re talking about bot armies.

Not crude spam bots—adaptive, persuasive, AI-driven agents that engage like real users.

Here’s how it works👇

#AIManipulation #InfoOps #CognitiveWarfare

The GRU doesn’t just post fake content.

It uses AI to run influence campaigns at machine speed.

Key tools include (a RAG sketch follows this list):
🔹 Large Language Models (LLMs)
🔹 LangChain (LLM orchestration framework)
🔹 Retrieval-Augmented Generation (RAG)
🔹 Web scraping + engagement tracking
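
What does that RAG step actually look like? Here is a minimal, hypothetical sketch in Python: retrieve the scraped posts most relevant to a topic, then stuff them into the prompt an LLM would answer. Every name here is a toy stand-in: bag-of-words retrieval instead of a vector database, and prompt assembly instead of a real model call. It illustrates the mechanism, not any actual GRU tooling.

```python
# Toy sketch of retrieval-augmented generation (RAG).
# All names are hypothetical stand-ins; a real pipeline would use
# a vector database and an LLM API instead of these toys.
import math
from collections import Counter

def vectorize(text: str) -> Counter:
    """Bag-of-words term counts (stand-in for an embedding)."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two bag-of-words vectors."""
    dot = sum(a[t] * b[t] for t in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * \
           math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def retrieve(query: str, corpus: list[str], k: int = 2) -> list[str]:
    """Return the k corpus documents most similar to the query."""
    q = vectorize(query)
    return sorted(corpus, key=lambda d: cosine(q, vectorize(d)), reverse=True)[:k]

# Scraped posts stand in for the web-scraping + tracking stage.
corpus = [
    "Forum post about energy prices rising this winter.",
    "News item on a regional election debate.",
    "Thread complaining about train delays in the region.",
]

# Retrieval grounds the generated reply in live, local detail,
# which is what makes machine-written posts read like real users.
context = retrieve("energy prices winter", corpus)
prompt = f"Context: {' '.join(context)}\nWrite a short reply about energy prices."
print(prompt)  # a real pipeline would send this prompt to an LLM
```

The point isn’t any single tool. It’s the loop: scrape, retrieve, generate, track engagement, repeat.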
Jun 2 10 tweets 2 min read
What is cognitive warfare—and why is TikTok at the center of it?

It’s not just about propaganda.
It’s about shaping how you think.

And TikTok’s design makes it the perfect weapon.

Let’s unpack what’s happening.👇

#CognitiveWarfare #InfoOps #DigitalResilience

Cognitive warfare targets your mind, not your machines.

It’s about:
🔹 Undermining trust
🔹 Fracturing social cohesion
🔹 Disabling democratic response
🔹 Influencing perception—without you noticing

It’s subtle. And it’s strategic.
Jun 2 11 tweets 2 min read
Did you know the Russian GRU is running cognitive warfare campaigns on TikTok—powered by AI?

They’re not just pushing lies.

They’re weaponizing algorithms to erode trust, fracture democracies & shape what you believe.

Let’s break it down.👇

#CognitiveWarfare #Disinformation

This thread series unpacks a 2024 analysis from the Information Professionals Association:

“Countering Cognitive Warfare in the Digital Age”



It details how TikTok became a frontline in the battle over your attention—and your trust.

information-professionals.org/countering-cog…
Jun 2 11 tweets 2 min read
🧵Thread 4: The Overlooked Target Group—Older Adults

Most media literacy campaigns focus on youth.

But one of the most vulnerable groups to disinformation is often ignored:

Older adults.

Here’s what the 2024 OBS Journal says we’ve been missing—and how to fix it. #MediaLiteracy #Disinformation #DigitalInclusion #InfoOps #CognitiveSecurity
Jun 1 14 tweets 2 min read
🧵Thread 3: Using the Model to Defend Democracy

The Diamond Model isn’t just for intelligence analysts.

It’s a framework any defender of democracy can use—journalists, educators, civil society, even platforms.

Here’s how it works in practice. #Disinformation #InfoOps #CognitiveSecurity #DigitalResilience #InfluenceOps
Jun 1 8 tweets 2 min read
🧵How do professionals break down disinformation campaigns?

Not with vibes.
Not with hot takes.

With a structured analytic model used by intelligence and national security analysts.

It’s called the Diamond Model of Influence Operations.

#InfoOps #CognitiveSecurity

The Diamond Model was developed by MITRE—a major U.S. research center behind key cyber and defense frameworks.

Originally built for cybersecurity, this version is designed to track information warfare.

You can learn it too.
And apply it.
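
For the hands-on crowd, here is a minimal sketch of what “applying it” can look like as a data structure. One assumption to flag: the four vertex names below (adversary, capability, infrastructure, victim) are the classic Diamond Model vertices from intrusion analysis; whether the influence-operations variant keeps exactly these labels is assumed here, and the example values are entirely hypothetical.

```python
# Hedged sketch: logging one influence-operation observation on the
# four classic Diamond Model vertices. Field names assume the
# influence-ops variant keeps the original vertices; values are made up.
from dataclasses import dataclass, field

@dataclass
class DiamondEvent:
    adversary: str        # who is behind the activity
    capability: str       # tools and techniques observed
    infrastructure: str   # accounts, domains, channels used
    victim: str           # targeted audience or institution
    notes: list[str] = field(default_factory=list)

event = DiamondEvent(
    adversary="unattributed state-aligned actor",
    capability="AI-edited articles, doctored translations",
    infrastructure="network of look-alike news domains",
    victim="German-language news consumers",
)
# Analysis works by pivoting between vertices, e.g. shared hosting
# (infrastructure) can tie many domains to one adversary.
event.notes.append("Pivot: shared hosting links the domains to one operator.")
print(event)
```

The value is the discipline: every claim gets pinned to a vertex, so the gaps (“we know the infrastructure but not the adversary”) become visible.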
Jun 1 11 tweets 2 min read
Imagine a news site that feels familiar.

Same tone. Same topics.
Same fears you’ve seen on real headlines.

Now imagine 94 of them—copy-pasted, AI-edited, and Russian-controlled.

Welcome to CopyCop.

#Germany2025 #Disinformation #InfoOps

CopyCop isn’t one site.

It’s a network of 94+ fake German-language news pages exposed by Insikt Group.

Each site runs plagiarized content, doctored translations, AI-written scandal pieces, and fabricated whistleblower “leaks.”
Jun 1 10 tweets 2 min read
Election interference doesn’t always shout.

Sometimes it whispers:

"Is your vote even safe?"
"Will violence break out?"
"Should you just stay home?"

Welcome to Operation Overload.

#Germany2025 #Disinformation #AIThreats

Operation Overload was designed to flood the information space with noise, fear, and doubt during Germany’s 2025 election.

Its toolkit:

🔹 Deepfake voices
🔹 Fake ballot videos
🔹 Impersonated police warnings
🔹 AI-generated stories of unrest
Jun 1 7 tweets 1 min read
Germany just held a national election.

But that wasn’t the only campaign running.

Behind the scenes, a complex web of Russian disinformation ops worked to shape perception, stir division, and undermine trust.

#Germany2025 #Disinformation #CognitiveSecurity

This series breaks down how those operations worked.

Based on a Feb 2025 threat report from Insikt Group (Recorded Future):

Stimmen aus Moskau (“Voices from Moscow”)

go.recordedfuture.com/hubfs/reports/…

This is how narrative warfare plays out—step by step.
May 31 11 tweets 2 min read
Disinformation doesn't always look like propaganda.

It can look like a headline.
A quote.
A respected article.
Even a work of art.

This thread gives you the tools to recognize how disinfo works—before it becomes “truth.”

#Disinformation #MediaLiteracy #InfoOps

From forged letters to show trials and strategic plays, Communist disinfo wasn’t just about politics.

It was about narrative dominance.

Shape what people believe about the past—and you control what they’ll accept in the future.
May 31 10 tweets 2 min read
If a lie is written down, does it become real?

This thread shows how Communist disinformation operatives forged documents to frame enemies, shape trials, and rewrite memory—one fake at a time.

#Disinformation #DocumentForgery #ColdWar #MediaLiteracy

The show trial of Cardinal Mindszenty wasn’t just based on a forced confession.

It was propped up by forged documents—internal memos, bank letters, communications—designed to “prove” his crimes on paper.

Rychlak cites them directly.
May 29 12 tweets 4 min read
We said we’d unpack how political manipulation became a global system—from Russia to your feed.

This is the recap of everything we covered:

Concept, tactics, spread, psychology, resistance.

No speculation. Just systems.

#PoliticalTechnology #Disinformation #CivicResilience

Our core concept: Political Technology

Defined by Andrew Wilson as:

“The supply-side engineering of the political system for partisan advantage.”

Not propaganda. Not spin.

It’s the engineering of democracy itself to prevent change.

cambridge.org/core/books/pol…
May 29 10 tweets 2 min read
You can’t fact-check your way out of a system engineered for control.

But resistance is possible.

This thread maps real strategies for disrupting political technology—at the personal, institutional, and civic level.

#PoliticalTechnology #CivicResilience #MediaLiteracy

Let’s be clear:

You’re not just fighting misinformation.
You’re resisting a system built to confuse, exhaust, and polarize.

That means resistance must be structural, strategic, and shared.
May 28 9 tweets 2 min read
Political technology doesn’t work because it’s smart.

It works because we’re human.

This thread is about the cognitive levers behind modern manipulation—and why we keep falling for it.

#PoliticalTechnology #CognitiveVulnerabilities #Disinformation

Emotional manipulation > factual argument

Humans process threats faster than truth.

We react before we verify.

That’s not stupidity—it’s survival instinct.

Political tech exploits it.

📘 Wilson (via Grishkyan), 🧾 Hoferer et al. (2019)
May 28 9 tweets 2 min read
Disinfo isn’t just about lies.
And political tech isn’t just tools.

The real threat?

When tactics become infrastructure—shaping not just elections, but entire realities.

Let’s walk through how that happens.

#PoliticalTechnology #Disinformation #NarrativeWarfare

At first, manipulation looks tactical:

A fake news site.
A social media campaign.
A media scandal.

But if repeated—and left unchecked—those tactics become normal.

Then expected. Then invisible.
May 28 11 tweets 2 min read
“Political technology” sounds like sci-fi.

It isn’t.

It’s the professional, strategic manipulation of politics itself—for power, not policy.

This is where it started.

And how it spread.

#PoliticalTechnology #InfoOps #Disinformation #DemocracyUnderPressure

The term emerged in post-Soviet Russia in the 1990s.

At the time, democracy was new—and fragile.
But elections still had to happen.

So elites hired political technologists to control the outcome without banning the vote.
May 28 12 tweets 2 min read
In just 7 threads we unpacked one of the largest EU-wide disinformation studies ever published.

278 confirmed falsehoods.
20 countries.

One goal: manipulate public belief during a democratic vote.

Here’s what we found—and what it means. #Disinformation #EUElections2024 #CivicResilience

The source:

Spreading False Content in Political Campaigns: Disinformation in the 2024 European Parliament Elections

By Casero-Ripollés, Alonso-Muñoz & Moret-Soler (2025)
doi.org/10.17645/mac.9…
May 28 9 tweets 2 min read
Everyone’s worried about AI-generated disinformation.

But in 2024, most election lies were handcrafted, not machine-made.

The tactics weren’t futuristic.
They were familiar.

Old tools, new context. Let’s unpack.

#Disinformation #AI #EUElections2024 #MediaLiteracy

According to the 2025 study:


Out of 278 fact-checked falsehoods, only 1.1% were AI-generated.

That means 98.9% were human-made or recycled using classic manipulation techniques.

doi.org/10.17645/mac.9…
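
For scale: 1.1% of 278 works out to roughly 3 AI-generated items, against roughly 275 produced or recycled by hand.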
May 28 9 tweets 2 min read
“Climate tyranny.”
“Meat bans.”
“EU green plots to control your life.”

In 2024, climate change wasn’t just debated—it was distorted.

Disinformation turned green policy into political fuel.

Let’s look at how.

#Disinformation #ClimateDenial #EUElections2024

The 2025 study found:


Climate-related falsehoods were a growing presence in the disinformation ecosystem.

Often paired with migration or economic anxiety, they framed green policies as existential threats.doi.org/10.17645/mac.9…