Gavril Ducu 🇷🇴🇩🇪🇺🇲🇳🇱🇪🇺@🇺🇦
born by the KGB, raised by the CIA · mindreader · digital ventriloquist · #fella #WeAreNAFO · Heavy Bonker Award 🏅⚡ Every coffee helps #Edumacation and the @NAFOforum 👇
Apr 7 • 9 tweets • 1 min read
Infobesity — Too Much to Think.

We used to worry about censorship.
Now the threat is the opposite:
Too much information.
More than we can verify.
More than we can absorb.

And it’s making us vulnerable.

#PostTruth #Infobesity #Disinformation

“Infobesity” = information overload.

A flood of headlines, posts, opinions, outrage, clickbait, fact checks, and noise.
All day. Every day.

More than we can process.
Apr 7 • 9 tweets • 1 min read
Disinformation ≠ Misinformation — The Language of Distortion.

We throw around terms like “fake news” and “disinfo.”
But if we don’t define them clearly, we help the problem spread.

Let’s break down the differences—and why they matter. #Disinformation #PostTruth #MediaLiteracy

Not all falsehoods are created equal.

Some are accidental.
Some are deliberate.
Some are designed to do damage.

The distinctions aren’t just academic—they’re strategic.
Apr 6 • 11 tweets • 2 min read
From Bikers to Broadcasters — The Night Wolves and Russia’s Cultural Influence Machine

This isn’t a troll farm story.
It’s a story about identity, spectacle, and narrative power. This series unpacks how the Kremlin uses the Night Wolves biker gang as political micro-influencers—across borders and platforms.

#NightWolves #NarrativeWarfare
Apr 5 • 9 tweets • 2 min read
Writers of the Storm — Who’s Behind the Kremlin’s Ongoing Narrative Machine

Disinformation doesn’t just appear.
It’s authored. This series exposes the network—the people, platforms, and tactics behind a coordinated campaign to launder pro-Russian narratives into Western information spaces.

#WritersOfTheStorm #DisinformationNetwork
Apr 4 • 16 tweets • 2 min read
Beyond the Ballot: The Psychological Battlefield of Election Trust

Adversaries don’t need to hack votes.
They just need to make you believe the votes don’t matter.
That’s the real front line now:
Not ballots, but belief.

#TrustWarfare #ElectionIntegrity

The FDD memo makes one thing clear:
Foreign election interference has shifted from attacking systems to attacking perception.

Confidence.
Legitimacy.
Shared reality.

That’s the battlefield.
Apr 4 • 16 tweets • 2 min read
China’s Strategic Patience: Eroding U.S. Trust Without Taking Credit

China’s election interference strategy is quieter than Russia’s, subtler than Iran’s, but no less real.
It works by amplifying U.S. dysfunction—not inventing it.
And it targets what China cares about most: perception.

#ChinaNarrativeOps #DisinfoByDesign

The FDD memo puts it plainly:
China’s disinformation aims to:
Undermine confidence in U.S. democracy
Deflect criticism of China

Promote authoritarian stability as a contrast to “Western chaos”
Apr 4 • 18 tweets • 2 min read
Russia’s Roadmap for Chaos: From the IRA to Today’s Influence Networks
Russia didn’t stop after 2016.
It learned.
Adapted.
And refined its election interference strategy into a long-term psychological campaign.
Here’s how it works—and why it still does. #RussiaDisinfo #ElectionInterference

Russia’s goal isn’t to help a candidate win.
It’s to make voters lose faith—
In the system
In the media
In each other
As the FDD memo puts it: “Russia aims to destabilize democracy by undermining belief in electoral legitimacy.”
Apr 3 • 15 tweets • 2 min read
Media literacy works.
But not forever.

One study gave people tips to spot fake news—
It boosted their skills immediately
But within 3 weeks, the effect had dropped by half

This thread breaks down why disinfo resistance fades

#SpotTheFakeSeries

Quick recap

The intervention was simple:
Participants read 10 “Tips to Spot False News”
Then rated headlines as true or false
It worked—discernment shot up

But how long did the effect last?
Apr 1 • 4 tweets • 1 min read
In an era where misinformation and propaganda run rampant, the manipulation of perception is no accident.

nafoforum.org/magazine/perce…

Disinformation campaigns do not merely spread falsehoods; they engage in percepticide—the deliberate erasure of reality—and perspecticide—the systematic destruction of independent thought.
Mar 31 • 16 tweets • 2 min read
False Memories: When Misinformation Becomes “Something I Lived Through”

We assume memory is a record. It isn’t.
It’s a reconstruction.

And that makes it vulnerable to misinformation—not just as belief, but as experience.

This thread explores how lies become memories. #FalseMemory #MemoryManipulation

Dr. Yuran explains it like this:

Our brains aren’t hard drives.

They’re flexible, social, and vulnerable to influence.
When misinformation is repeated or emotionally charged, we may not just believe it.

We might remember it.
Mar 31 • 11 tweets • 2 min read
Truth Isn’t Enough: Why We Believe What We Want to Believe.

We like to think facts win.
That if people just saw the truth, they’d change their minds.
But decades of research say otherwise.

This series unpacks the psychology of misinformation belief—one bias at a time. #MisinformationPsychology #TruthVsBelief

People don’t spread lies because they’re stupid.
And they don’t believe falsehoods because they’re lazy.

Most of the time, they believe misinformation because it feels right.
It fits.
It protects something.
Mar 31 • 18 tweets • 2 min read
Can Fact-Checkers Keep Up?

In Indonesia’s 2024 election, fact-checkers worked around the clock.
They flagged lies
Posted corrections
Ran digital literacy campaigns
And still—the lies kept spreading
Here’s what worked, what didn’t, and why it matters

#ElectionMisinformation

Indonesia has one of the most active fact-checking ecosystems in Southeast Asia.

Mafindo
TurnbackHoax
KPU & Bawaslu (Election Commission & Supervisors)

But they were outpaced—and sometimes outmatched.
Mar 31 • 17 tweets • 3 min read
When Trust Erodes: How Disinfo Polarized Communities in Jember

What happens when voters stop trusting facts?
Stop trusting each other?

In Jember, Indonesia, digital lies didn’t just mislead.
They divided neighbors.
They frayed democratic trust.

Here’s what the study found
1/17 #ElectionMisinformation #Polarization

Jember is a district in East Java with over 2 million people.
Diverse, semi-rural, politically active.

It became a ground zero for observing how election disinformation affects everyday relationships—not just national narratives.
2/17
Mar 31 • 19 tweets • 3 min read
Bots, Echo Chambers & the Algorithm: The Architecture of Manipulation.

In the 2024 Indonesian election, disinformation wasn’t just content.
It was a system.

Powered by code, shaped by behavior, and built to reward outrage.
1/19 #ElectionMisinformation

Let’s break down how the platforms themselves helped lies win.

Platforms don’t just show us what’s new.
They decide what we see first.
What we see often.
And what we never see at all.
2/19
Mar 31 • 16 tweets • 3 min read
Repetition Over Truth: The Psychology Behind Viral Lies.

In Indonesia’s 2024 election, lies didn’t need to be convincing.
They just needed to be repeated.
Again and again.
And again.
This is how falsehood becomes belief.
Here’s how disinformation hijacks the human mind ⬇️
1/16 #ElectionMisinformation #CognitiveBias

The researchers called it out plainly:

“Repeated exposure to misinformation reinforces cognitive biases.”
And the most powerful of those biases?
🔁 The illusion of truth effect
2/16
Mar 31 • 15 tweets • 2 min read
Why Indonesia’s 2024 Election Is a Disinfo Case Study for the World.

In 2024, a democracy of 204 million voters was hit with over 12,000 cases of digital disinformation.

Most of the world barely noticed.
But what happened in Indonesia may be the clearest warning we have. #ElectionMisinformation #DigitalThreats

Here’s why ⬇️
Indonesia is the world’s third-largest democracy, behind India and the U.S.

It’s also one of the most online:
📱 213M internet users
📱 170M active on social media
⏱️ Avg 7+ hours/day online
Mar 30 • 10 tweets • 2 min read
When belief becomes law, and law becomes delusion.

Fixations don’t just live in minds.
They scale into systems.

From cults to countries, when enough people fixate—belief becomes reality.
Even if it kills.

#FixationTheory #Disinformation #CollectiveDelusion

Dielenberg (2024) warns:

🔸 Fixations are adaptive—until they aren’t
🔸 Maladaptive fixations can spread via culture
🔸 If adopted by institutions, they become normal

This is belief, weaponized at scale.
Mar 30 • 10 tweets • 2 min read
Fixations make us stable—but manipulable.

You can’t eliminate them.
But you can weaken the wrong ones.

To resist disinformation and reflexive control, you need a strategy:
Disrupt from the edges.
Don’t argue—unravel.

#FixationTheory #CognitiveImmunity #Disinformation

Dielenberg (2024) gives us the blueprint:

🔸 Fixations form networks
🔸 Core beliefs are protected by auxiliary ones
🔸 Attacking core fixations directly can backfire
🔸 But auxiliary nodes? They’re the weak links
Mar 30 • 10 tweets • 2 min read
Reflexive Control: The mind game that turns your beliefs against you.

What if manipulation didn’t require lies—just nudges?
What if your beliefs could be guided to trap you?

That’s reflexive control.

It’s one of the sharpest tools in cognitive warfare.

Reflexive control is a strategy that:

🔹 Exploits existing fixations
🔹 Triggers predictable projections
🔹 Guides targets to make self-defeating decisions

All while the target thinks it was their own idea.

(Russian mil doctrine, Dielenberg, 2024)
Mar 30 • 10 tweets • 2 min read
Fixation Networks: Why belief systems defend themselves.

Ever try to change someone’s mind—and they double down?
You didn’t fail.
You hit the core node of a fixation network.
To break the system, you have to start at the edges.

#FixationTheory #Disinformation #CognitiveSecurity

According to Dielenberg (2024):

🔹Fixations don’t exist in isolation.
🔹They cluster into networks of beliefs.
🔹These networks protect core fixations by surrounding them with auxiliary ones.

Think: mental immune system—but backwards.
Mar 30 • 10 tweets • 2 min read
In 2019, a forged letter from Greenland’s foreign minister to U.S. Senator Tom Cotton appeared online.

It praised U.S. support and hinted at fast-tracking independence — with a request for funding.

It was fake. But its goals were real. Let’s break it down. 👇

1/10

The letter used official Greenland letterhead, referenced case numbers, and claimed to speak “on behalf of the government.”

It praised U.S.–Greenland cooperation, mentioned an “organized non-aligned territory,” and requested a 30% financing increase.
2/10