Ben Nimmo
15 Dec, 23 tweets, 9 min read
BREAKING: @Facebook just took down two foreign influence ops that it discovered going head to head in the Central African Republic, as well as targeting other countries.

More-troll Kombat, you might say.

Report by @Graphika_NYC and @stanfordio: graphika.com/reports/more-t…
There have been other times when multiple foreign ops have targeted the same country.

But this is the first time we’ve had the chance to watch two foreign operations focused on the same country target *each other*.
In the red corner: individuals associated with past activity by the Internet Research Agency, and with previous ops attributed to entities associated with Prigozhin.

In the blue corner: individuals associated with the French military.

@Facebook report here: about.fb.com/news/2020/12/r…
Two households, both alike in dignity… well ok, two troll ops.

Generally low followings. Russian op mainly targeted CAR, South Africa, Cameroon. French op focused on CAR and West Africa.

And they trolled each other.

Details in our report. Highlights to follow here.
So, background: in October 2019, Facebook took down a bunch of Prigozhin-linked assets across Africa, including CAR.

Our friends at @stanfordio wrote it up at the time.

cyber.fsi.stanford.edu/io/news/prigoz…
This fabulous cartoon, about a bear saving a lion from French/U.S. hyenas, was sponsored by Prigozhin firm Lobaye Invest at the time. (h/t @dionnesearcey, who wrote it up for @nytimes in September 2019)

Troll assets boosted it.
What we didn't know was that there was a recent, fledgling French operation, mostly praising France, sometimes criticising Russia and Prigozhin.

After the FB takedown, they set up a few fake assets to, um, "fight fake news."
This was definitely a "fight fire with fire" effort.

Their portrayals of Russians were, well... troll-y.
It was also a direct and explicit reaction to the earlier Russian operation.

Note the cartoon on "Tatiana"'s screen. Familiar?
It also focused on the Wagner group and Russian mercenaries. But the methods it used were... unfortunate.

Left, post by a French asset on "Russian mercenaries arriving in CAR," October.

Right, news report on Russian medics landing in Kyrgyzstan, July.
If you reverse the French photo and compare the backpacks in the background... oops.

Same event, shot from slightly different angle.

That, folks, is disinformation.
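
(If you want to replicate the check yourself: here's a minimal sketch of the mirror-and-compare step, using Pillow. The file names are hypothetical placeholders, not the actual images.)

    # Minimal sketch of the mirror-and-compare check described above.
    # File names are hypothetical placeholders.
    from PIL import Image

    suspect = Image.open("french_asset_post.jpg")        # hypothetical
    reference = Image.open("kyrgyzstan_news_photo.jpg")  # hypothetical

    # Undo the horizontal flip on the suspect photo.
    unflipped = suspect.transpose(Image.Transpose.FLIP_LEFT_RIGHT)

    # Paste both photos onto one canvas for a side-by-side look at
    # fixed details, like the backpacks in the background.
    canvas = Image.new(
        "RGB",
        (unflipped.width + reference.width,
         max(unflipped.height, reference.height)),
    )
    canvas.paste(unflipped, (0, 0))
    canvas.paste(reference, (unflipped.width, 0))
    canvas.save("side_by_side.jpg")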
Meanwhile, from January the Russian operation created a bunch of new CAR-focused assets. Trying to rebuild the network.

Unlike the French operation, this one seemed primarily focused on domestic politics, including support for the president.
The Russian operation also played up how much good Russia was doing in CAR...
... and spread negative content about the French, including by accusing them of fake news.

Oh the irony.
Thing is, the French and Russian ops were posting into the same groups, and liking the same pages.

They even liked *each other*.

[Network graph. Pink = FR accounts; green = RU accounts; red = RU pages; blue = unaffiliated pages.]
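
(The graph itself doesn't survive the unroll, so here's a purely illustrative networkx sketch of what a co-engagement graph like this looks like: accounts joined to the pages they liked. Every name and edge below is invented.)

    # Purely illustrative toy co-engagement graph, matching the legend
    # above. All account/page names and edges are invented.
    import networkx as nx
    import matplotlib.pyplot as plt

    G = nx.Graph()
    G.add_edges_from([
        ("fr_account_1", "car_news_page"),
        ("fr_account_2", "car_news_page"),
        ("ru_account_1", "car_news_page"),   # both ops like the same page
        ("ru_account_1", "ru_run_page"),
        ("ru_account_2", "ru_run_page"),
        ("fr_account_1", "ru_account_1"),    # they even liked each other
    ])

    colours = {
        "fr_account_1": "pink", "fr_account_2": "pink",    # FR accounts
        "ru_account_1": "green", "ru_account_2": "green",  # RU accounts
        "ru_run_page": "red",                              # RU pages
        "car_news_page": "blue",                           # unaffiliated
    }
    nx.draw(G, node_color=[colours[n] for n in G.nodes],
            with_labels=True, font_size=8)
    plt.savefig("co_engagement.png")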
So it wasn't long before they started noticing each other.

Here's a French account calling out a "fake" post by a Russian account. January this year.
Here's a Russian asset calling out a French asset that was itself calling out a different Russian asset.
Then it started getting weird.

The French assets started sharing the Russian assets' posts...
... and then they friended them.

Fishing for clues?
"I'm not really a troll"?
"I can see your mouse from here"?
This was the weirdest moment. Russian asset accusing French asset of DM'ing him *to recruit him*.

Never take a troll's word at face value, but if true, this would be fascinating. Who was trolling whom?
This activity didn't generally get much traction. One Russian asset focused on CAR politics had 50k followers, the rest on both sides were in the hundreds or, at best, low thousands. On YouTube, most videos had a few dozen views; on Twitter, small numbers of retweets.
And here's the thing: we're writing about these operations because they got caught and taken down.

Which is what happens when fakes get caught.

Even - perhaps especially - "anti-fake-news" fakes.
This is why fighting fakes with fakes is *not a good idea*.

Expose trolls and fakes, yes. But don't do it with more trolls and fakes. Because then you get things like this.

More from @benimmo

3 Nov
ELECTION THREAD: Today and tonight are going to be a wild time online.

Remember: disinformation actors will try to spread anger or fear any way they can, because they know that people who are angry or scared are easier to manipulate.

Today above all, keep calm.
A couple of things in particular. First, watch out for perception hacking: influence ops that claim to be massively viral even if they’re not.

Trolls lie, and it’s much easier to pretend an op was viral than to make a viral op.

Remember 2018? nbcnews.com/tech/tech-news…
There have been huge improvements in our collective defences since 2016. Teams like @Graphika_NYC, @DFRLab and @2020Partnership; takedowns by @Facebook, @Twitter and @YouTube; tip-offs from law enforcement.

Trolls have to spend more effort hiding.
1 Oct
NEW: A Russian operation posed as a far-right website to target U.S. divisions and the election.

Most active on Gab and Parler.
A few months old.
Looks related to the IRA-linked PeaceData (which targeted progressives).

@Graphika_NYC report: public-assets.graphika.com/reports/graphi…
Credit to @jc_stubbs of @Reuters, who tipped us off to this.

A legend in his own byline.

reuters.com/article/us-usa…
This op was based on a website called the Newsroom for American and European Based Citizens, NAEBC.

@Alexey__Kovalev might enjoy this name: it’s close to the Russian swear word “наёбка”.

Just like PeaceData sounded like the Russian epithet “пиздато.”

There’s a theme there.
26 Sep
I've studied IO for longer than I care to remember, and one of the most frequent comments I've heard, and agreed with, is that we need better ways to assess impact on multiple levels and timescales.

As part of that, we need a way to assess live IO in real time.
This paper suggests a way to approximate impact in the moment, when we don’t have the full picture, including the IO operators’ strategic objectives, or the luxury of taking the time to run polls to measure effect on public sentiment (hard even in normal circumstances).
This field is rapidly developing, but we need to start somewhere. Without clear context and a comparative scale, there's a danger of IO capitalising on fear and confusion to claim an impact they never had.

Remember the midterms in 2018?
25 Sep
One of the biggest challenges with influence ops is measuring their impact.

Here's a way to do it.

Six categories, based on IO spread through communities and across platforms.

Designed to assess and compare ops in real time.

H/t @BrookingsFP.

brookings.edu/research/the-b…
It assesses info ops according to two questions:

1. Did their content get picked up outside the community where it was originally posted?

2. Did it spread to other platforms or get picked up by mainstream media or high-profile amplifiers?
Category One ops stay on the platform where they were posted, and don't get picked up beyond the original community.

Most political spam and clickbait belong here. So does bot-driven astroturfing, like the Polish batch we found with @DFRLab.

medium.com/dfrlab/polish-…
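
(Read as a decision procedure, the two questions above sketch out most of the scale. Category One follows directly from this thread; the higher-category cut-offs in the toy classifier below are my paraphrase of the Brookings paper, so treat them as assumptions, and Category Six, offline consequences, is omitted.)

    # Rough sketch: the two questions as a classifier. Category 1 matches
    # the thread's description; the higher-category thresholds are my
    # paraphrase of the Breakout Scale paper, i.e. assumptions here.
    def breakout_category(platforms: int, communities: int,
                          mainstream_pickup: bool = False,
                          high_profile_amplifiers: bool = False) -> int:
        """Approximate Breakout Scale category for an influence op."""
        if high_profile_amplifiers:  # assumption: celebrity amplification
            return 5
        if mainstream_pickup:        # assumption: mainstream media pickup
            return 4
        if platforms > 1 and communities > 1:
            return 3                 # assumption: spread on both axes
        if platforms > 1 or communities > 1:
            return 2                 # assumption: spread on one axis
        return 1  # stayed in its original community on one platform

    # Most political spam and clickbait never breaks out:
    assert breakout_category(platforms=1, communities=1) == 1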
24 Sep
BREAKING: Multiple platforms took down assets from various Russian info ops today.

The ops did *not* primarily target the US election. Much more on RU strategic concerns.

@Facebook kicked this off. Reports by @Graphika_NYC and @DFRLab to follow.

about.fb.com/news/2020/09/r…
The FB investigation took down several different sets of inauthentic assets, including some linked to the Russian military and others to individuals associated with the IRA.

They have a track record of election interference. Cleaning their assets out before the U.S. election seems… prudent.
The @Graphika_NYC team looked at the Russian military assets. About 300 of them, activity ranging from 2013 to 2020.

It wasn’t one coherent set: more like different clusters at different times and looking in different directions, north, south, east and west.
22 Sep
BREAKING: @Facebook announced a takedown of assets run by individuals in China.

Operation Naval Gazing:

Focus on maritime security and the South China Sea;
Lots of content on the Philippines and Taiwan;
Small, apparently bipartisan volume on the US.

about.fb.com/news/2020/09/r…
Here's the @Graphika_NYC report.

Overall, 155 accounts on FB, 11 pages, 9 groups, 6 Instagram accounts.

2017: mostly Taiwan
2018-19: + Philippines and South China Sea
2020: + US-centric content.

The biggest audience was in the Philippines.

graphika.com/reports/operat…
Operation Naval Gazing was mostly *not* about U.S. domestic politics.

But I suspect that's where the most urgent questions will be.

So...