Just out: @Facebook's latest update on influence op (IO) takedowns. Fourteen new ones in this report, from nine countries. @Graphika_NYC did a write-up on one of them, from separatist-held Ukraine.
A cluster of inauthentic assets on FB, boosting a network of fake websites focused on Europe and the former USSR: pro-Kremlin, anti-Ukraine, anti-Navalny, anti-EU.
H/t @alexejhock and @DanielLaufer for the first reporting on parts of this network, based around a fake outlet called Abendlich Hamburg ("evening Hamburg").
A couple of other sites had "evening" in their names; others had "echo of [country]".
In this operation, social media were secondary. The main carriers were websites focused on countries from the UK to Central Asia, linked by shared registration emails, admins, analytics IDs, etc.
We counted 25 sites in total, some already down; there are probably more.
In Russian, most of the articles were original. Different sites focused on Ukraine (inevitably), Moldova, Kyrgyzstan and Kazakhstan.
The editorial line was pro-Russia, anti-Western and, in Central Asia, anti-China.
In other languages, almost all the articles on these "news" sites were copied from real outlets.
The few original ones - the payload - were poorly written, conspiratorial, and very much aligned with Kremlin messaging.
They didn't get much traction on social media, or in the countries they focused on.
The main pickup was in Russian media. In the best example, an article on the fake website Abendlich Hamburg - actually run from Luhansk, Ukraine - was cited by outlets like gazeta[.]ru.
Whether by accident or design, the operation's main impact - and it wasn't much - came from pretending to be "Western" media and then being cited in Russian media.
Disinformation laundering.
Plenty more takedowns in the Facebook report:
Iran
Morocco
Ukraine x 3
Kyrgyzstan x 3
Kazakhstan
Argentina
Brazil x 2
Pakistan
Indonesia
Not to mention the French and Russian takedowns last month, where two troll operations went head to head.
Of the 17 takedowns in this announcement, about two-thirds targeted domestic audiences.
Disinfo begins at home.
And seven different ops either pretended to be news outlets, or tried to land their articles in real news outlets.
Influence ops probably target journalists more consistently than any other group, because that's how they can move from a fake media ecosystem into a real one.
We came across part of this botnet in the summer, when it was boosting the pro-Chinese network "Spamouflage."
This, from @conspirator0, is a typical profile. Note the broken sentence and word in the bio. No human typed that... at least not on that Twitter account.
Now compare the bio with the version of Dracula that's online at Tallinn Technical University: lap.ttu.ee/erki/failid/ra…
BREAKING: @Facebook just took down two foreign influence ops that it discovered going head to head in the Central African Republic, as well as targeting other countries.
There have been other times when multiple foreign ops have targeted the same country.
But this is the first time we’ve had the chance to watch two foreign operations focused on the same country target *each other*.
In the red corner, individuals associated w/ past activity by the Internet Research Agency & previous ops attributed to entities associated w/ Prigozhin.
In the blue corner, individuals associated w/ the French military.
ELECTION THREAD: Today and tonight are going to be a wild time online.
Remember: disinformation actors will try to spread anger or fear any way they can, because they know that people who are angry or scared are easier to manipulate.
Today above all, keep calm.
A couple of things in particular. First, watch out for perception hacking: influence ops that claim to be massively viral even if they’re not.
Trolls lie, and it’s much easier to pretend an op was viral than to make a viral op.
Having studied IO for longer than I care to remember, one of the most frequent comments I’ve heard, and agreed with, is that we need better ways to assess impact on multiple levels and timescales.
As part of that, we need a way to assess live IO in real time.
This paper suggests a way to approximate impact in the moment, when we don't have the full picture: we rarely know the IO operators' strategic objectives, and we don't have the luxury of running polls to measure the effect on public sentiment (hard even in normal circumstances).
This field is developing rapidly, but we need to start somewhere. Without clear context and a comparative scale, there's a danger of IO operators capitalising on fear and confusion to claim an impact they never had.