Ben Nimmo
Sep 25, 2020 · 21 tweets · 8 min read
One of the biggest challenges with influence ops is measuring their impact.

Here's a way to do it.

Six categories, based on how far an op's content spreads through communities and across platforms.

Designed to assess and compare ops in real time.

H/t @BrookingsFP.

brookings.edu/research/the-b…
It assesses info ops according to two questions:

1. Did their content get picked up outside the community where it was originally posted?

2. Did it spread to other platforms or get picked up by mainstream media or high-profile amplifiers?
Category One ops stay on the platform where they were posted, and don't get picked up beyond the original community.

Most political spam and clickbait belong here. So does bot-driven astroturfing, like the Polish batch we found with @DFRLab.

medium.com/dfrlab/polish-…
Iran's efforts to target the Republican primaries on Facebook in 2012 were Category One, too.

@Graphika_NYC reported that here. graphika.com/reports/irans-…
Category Two ops are either posted on multiple platforms but don't spread beyond the insertion point, or stay on one platform but get picked up by multiple communities.
The Russian IRA's effort in 2019 was a Category Two. It was only on Instagram, but landed content in multiple communities at both ends of the political spectrum.
The pro-China operation Spamouflage Dragon was a Category Two as well.

It posted on YouTube, Facebook and Twitter, but we've not yet found it getting picked up by authentic users on any of them.

graphika.com/reports/spamou…
Category Three ops get picked up by multiple communities on multiple platforms, but don't make the jump to mainstream media.

It's a transient category, because ops that make it this far are likely to get picked up by media.

Journalists: be careful what you amplify.
There was a time when QAnon and Pizzagate were both Category Threes.

If researchers identify a Category Three, it's important to deal with it fast, before it gets worse.
Category Four ops break out of social media entirely, and get picked up by the mainstream media.

Many ops try to achieve this by reaching out directly to journalists by email or social media.
The Russian IRA hit Category Four many times. Jenna Abrams and Crystal Johnson were really good at getting mainstream pickup.

This piece in the @latimes quoted two alt-right tweets. Both were IRA trolls.

latimes.com/nation/la-na-b…
Iranian operation Endless Mayfly hit Category Four when Reuters picked up a fake story it created.

H/t @citizenlab for their work in breaking this.

citizenlab.ca/2019/05/burned…
Category Five is when celebrities, politicians, candidates or other high-profile influencers share and amplify an influence operation.

This gives the op both much greater reach and much greater credibility.

Take care when you share.
When Sputnik ran an already-debunked theory about Google rigging its autocomplete suggestions to favour Hillary Clinton back in September 2016, Donald Trump ended up amplifying it.

Category Five.

nytimes.com/2016/09/29/us/…
When Jeremy Corbyn publicised leaks that had originally been posted online by Russian operation Secondary Infektion, that was a Category Five.

H/t @jc_stubbs for the great work on this.

(Most SI stories were Category Two.)

uk.reuters.com/article/uk-bri…
And then there's Category Six. That's when an influence operation either causes a real-world change of some kind, or else carries the risk of real-world harm.

There haven't been many. Let's keep it that way.
The Russian DCLeaks operation was a Category Six.

At one end, leaks. At the other end, Debbie Wasserman Schultz resigns.

nytimes.com/2016/07/25/us/…
The IRA's "Stop Islamization of Texas" effort in 2016 was a Category Six too.

Get two opposing groups. Organise them into simultaneous protests. Tell them to bring their guns.

Nobody got hurt, but the potential was there.

H/t @mrglenn.

chron.com/news/houston-t…
One operation can hit different categories at different times.

For example, Secondary Infektion was usually Category Two, but jumped to Category Five.

By the same token, the 2016 IRA reached Category Six, but the 2019 IRA only reached Category Two.
I developed this scale so that operational researchers can assess, compare and prioritise info ops.

For example, how do Spamouflage Dragon (China), Endless Mayfly (Iran) and DCLeaks (Russia) compare?

Spamouflage: 2
Endless Mayfly: 4
DCLeaks: 6
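The branching logic of the scale can be sketched in code. This is a minimal, unofficial sketch of my reading of the six categories; the function name and the boolean flags are my own illustration, not part of the Brookings paper.

```python
def breakout_category(multi_community: bool,
                      multi_platform: bool,
                      media_pickup: bool,
                      influencer_pickup: bool,
                      real_world_effect: bool) -> int:
    """Return the highest Breakout Scale category an op has reached.

    Checks the most severe breakout first, then walks down:
    6 = real-world change or risk of harm
    5 = amplified by celebrities/politicians/high-profile influencers
    4 = picked up by mainstream media
    3 = multiple communities AND multiple platforms
    2 = multiple communities OR multiple platforms
    1 = stayed in the original community on one platform
    """
    if real_world_effect:
        return 6
    if influencer_pickup:
        return 5
    if media_pickup:
        return 4
    if multi_community and multi_platform:
        return 3
    if multi_community or multi_platform:
        return 2
    return 1

# Rough placement of the three ops above (flags are my estimates):
# Spamouflage: on multiple platforms, no authentic pickup -> 2
# Endless Mayfly: picked up by Reuters -> 4
# DCLeaks: triggered a real-world resignation -> 6
```

The point of checking from six downward is that each category subsumes the ones below it: an op scores at its highest breakout, even if most of its content stalls lower down.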
But it's also a reminder: info ops are not just on social-media platforms, and they *do* target journalists and influencers directly.

As I've said so often: stay calm - but stay watchful.


Did Thread Reader help you today?

Support us! We are indie developers!


This site is made by just two indie developers on a laptop doing marketing, support and development! Read more about the story.

Become a Premium Member ($3/month or $30/year) and get exclusive features!

Become Premium

Don't want to be a Premium member but still want to support us?

Make a small donation by buying us coffee ($5) or help with server cost ($10)

Donate via Paypal

Or Donate anonymously using crypto!

Ethereum

0xfe58350B80634f60Fa6Dc149a72b4DFbc17D341E copy

Bitcoin

3ATGMxNzCUFzxpMCHL5sWSt4DVtS8UqXpi copy

Thank you for your support!

Follow Us!

:(