One sidelight on the Russian protests today: #Navalny is probably the single most consistent target of Russian disinfo and influence operations.
He's been a target for at least eight years, by operations including the Internet Research Agency and Secondary Infektion, as well as by the Kremlin itself.
Way back in September 2013, @Soshnikoff investigated the then newly founded Internet Research Agency and reported that it had been trolling Navalny when he ran for Mayor of Moscow.
January 2014: the Secondary Infektion op set up its most prolific persona, with a pic of Navalny’s face painted blue. It started out by attacking the Russian opposition.
The username, bloger_nasralny, is a toilet pun on his name.
Secondary Infektion kept on coming back, posting screenshots of communications that exposed Navalny as [insert pejorative here].
Weirdly, the only place those screenshots showed up was in posts planted by the operation.
August 2017...
July 2018: a forged letter that purported to show the EU Commission calling Navalny an "odious nationalist."
A narrative you'll still hear from Russian state outlets and employees today.
And yet another SI forgery from July 2019, this time trying to link Navalny to the US.
Secondary Infektion loved faking letters from the Senate.
It wasn't just Secondary Infektion. According to Twitter's archive of influence ops, the Internet Research Agency mentioned Navalny close on 6,000 times.
(Screenshot from a scan of the earliest archive, up to 2017.)
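For anyone who wants to check that figure themselves, here is a minimal sketch of the count, assuming the relevant IRA release from Twitter's information-operations archive has been downloaded as a local CSV with a tweet text column. The file name "ira_tweets.csv" and the `tweet_text` column name are assumptions about the downloaded copy, not verified against any particular release.

```python
# Rough sketch: count archived IRA tweets that mention Navalny.
# "ira_tweets.csv" and the `tweet_text` column name are assumptions about a
# locally downloaded copy of Twitter's information-operations release.
import pandas as pd

df = pd.read_csv("ira_tweets.csv", usecols=["tweet_text"], dtype=str)

# Match Latin and Cyrillic spellings of the surname, case-insensitively.
pattern = r"navalny|навальн"
mentions = df["tweet_text"].str.contains(pattern, case=False, na=False)

print("Tweets mentioning Navalny:", int(mentions.sum()))
```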
In late 2020, the network of fake websites first exposed by @alexejhock and @DanielLaufer ran fake stories smearing Navalny, too. One was picked up by mainstream Russian media.
Add to that the various Russian state outlets that either make or amplify claims about Navalny as a tool of the West - and the claims from the Kremlin itself that he has CIA handlers.
So overall, Navalny looks like the single most consistent target of Russian and pro-Russian influence operations since 2013.
But despite the attacks, he just got over 70 million views on his latest video on official corruption in Russia.
Despite... or because of?
Question for the #OSINT community: can anyone else find TikTok videos about protests for Navalny that become unavailable if you watch via a Russian server?
If you check TikTok for key hashtags about Navalny and the protests, some of the most popular videos don’t show up when browsing through a Russian VPN.
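One way to run that comparison systematically is to fetch the same video URLs once directly and once through a Russian exit, then diff the results. Below is a minimal sketch under stated assumptions: the video URL, the proxy address, and the "unavailable" marker strings are all placeholders, TikTok exposes no official API for this, and its actual page markup may differ.

```python
# Sketch of the geo-comparison check: fetch each video page directly and via a
# proxy located in Russia, and flag pages that look unavailable on one side.
# VIDEO_URLS, RU_PROXY, and UNAVAILABLE_MARKERS are placeholders/assumptions.
import requests

VIDEO_URLS = [
    "https://www.tiktok.com/@someuser/video/1234567890",  # placeholder URL
]
RU_PROXY = {"https": "http://user:pass@ru-exit.example.net:3128"}  # hypothetical exit

UNAVAILABLE_MARKERS = ("video currently unavailable", "item doesn't exist")  # assumed wording

def looks_unavailable(html: str) -> bool:
    """Crude heuristic: does the page text contain an 'unavailable' notice?"""
    lowered = html.lower()
    return any(marker in lowered for marker in UNAVAILABLE_MARKERS)

for url in VIDEO_URLS:
    direct = requests.get(url, timeout=15)
    via_ru = requests.get(url, timeout=15, proxies=RU_PROXY)
    print(
        url,
        "| direct:", direct.status_code, looks_unavailable(direct.text),
        "| via RU proxy:", via_ru.status_code, looks_unavailable(via_ru.text),
    )
```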
Just out: @Facebook's latest update on influence op (IO) takedowns. Fourteen new ones in this report, from nine countries. @Graphika_NYC did a write-up on one of them, from separatist-held Ukraine.
A cluster of inauthentic assets on FB, boosting a network of fake websites focused on Europe and the former USSR: pro-Kremlin, anti-Ukraine, anti-Navalny, anti-EU.
H/t @alexejhock and @DanielLaufer for the first reporting on parts of this network, based around a fake outlet called Abendlich Hamburg ("evening Hamburg").
A couple of other sites had "evening" in their names; others had "echo of [country]".
We came across part of this botnet in the summer, when it was boosting the pro-Chinese network "Spamouflage."
This, from @conspirator0, is a typical profile. Note the broken sentence and word in the bio. No human typed that... at least not on that Twitter account.
Now compare the bio with the version of Dracula that's online at Tallinn Technical University: lap.ttu.ee/erki/failid/ra…
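A quick way to verify that kind of match yourself: pull down the reference text and check whether the bio fragment appears in it after normalising case, whitespace, and punctuation. A minimal sketch follows, assuming CORPUS_URL stands in for the full Dracula link (truncated in the tweet above) and the suspect bio is a placeholder you paste in.

```python
# Sketch: check whether a suspect account bio was lifted from a known text.
# CORPUS_URL is a placeholder for the full Dracula link; the bio entry below
# is a placeholder for real bio text collected during the investigation.
import re
import requests

CORPUS_URL = "https://lap.ttu.ee/.../dracula.txt"  # placeholder, not the real path

def normalize(text: str) -> str:
    """Lowercase and strip punctuation so line breaks and quotes don't hide matches."""
    return " ".join(re.sub(r"[^a-z0-9]+", " ", text.lower()).split())

corpus = normalize(requests.get(CORPUS_URL, timeout=30).text)

suspect_bios = [
    "paste the suspect account bio text here",  # placeholder
]

for bio in suspect_bios:
    print(repr(bio), "-> found in corpus:", normalize(bio) in corpus)
```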
BREAKING: @Facebook just took down two foreign influence ops that it discovered going head to head in the Central African Republic, as well as targeting other countries.
There have been other times when multiple foreign ops have targeted the same country.
But this is the first time we’ve had the chance to watch two foreign operations focused on the same country target *each other*.
In the red corner, individuals associated w/ past activity by the Internet Research Agency & previous ops attributed to entities associated w/ Prigozhin.
In the blue corner, individuals associated w/ the French military.
ELECTION THREAD: Today and tonight are going to be a wild time online.
Remember: disinformation actors will try to spread anger or fear any way they can, because they know that people who are angry or scared are easier to manipulate.
Today above all, keep calm.
A couple of things in particular. First, watch out for perception hacking: influence ops that claim to be massively viral even if they’re not.
Trolls lie, and it’s much easier to pretend an op was viral than to make a viral op.