Ben Nimmo
Apr 7
🚨JUST OUT🚨
Quarterly threat report from @Meta’s investigative teams.
Much to dig into:
State & non-state actors targeting Ukraine;
Cyber espionage from Iran and Azerbaijan;
Influence ops in Brazil and Costa Rica;
Spammy activity in the Philippines...
about.fb.com/news/2022/04/m…
I’ll focus this thread on Ukraine. For more on the rest, see the great @ngleicher and @DavidAgranovich.
We’ve seen state & non-state ops targeting Ukraine across the internet since the invasion, including attempts from:

🇧🇾 Belarus KGB
👹 A Russian “NGO” w/ some links to past IRA folks
👻 Ghostwriter

We caught these early, before they could build an audience or be effective.
Typically, these efforts only pivoted to posting about Ukraine close to when the invasion happened.

For example, the Belarus op was posting about Poland and migrants right up to invasion day.

It doesn’t look like these influence ops were prepared for the war.
The timing and lack of prep are important.

We know from the 2018 Mueller indictment that the IRA gave itself 2.5 years to target the 2016 US election (Apr 2014 -> Nov 2016).

An op that only lasts a few days is going to struggle for impact across the internet.

justice.gov/file/1035477/d…
We also took down an attempt by people linked to past Internet Research Agency activity.

They posed as an NGO and were active almost entirely off platform: they did try to create FB accounts in January, but were detected.
Throughout January and February, their website focused on accusing the West of human rights abuse. After the invasion, they focused more on Ukraine.
Ghostwriter is an op that specialises in hacking people’s emails, then trying to take over their social media from there.

We saw a spike in Ghostwriter targeting of the Ukrainian military after the invasion.
In a few cases, Ghostwriter posted videos calling on the Army to surrender as if these posts were coming from the legitimate account owners.

We blocked these videos from being shared.
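The thread doesn’t say how that blocking works under the hood, so here’s a purely illustrative sketch rather than Meta’s actual mechanism: one generic way platforms stop re-shares of a specific known file is to match new uploads against a blocklist of media fingerprints (the hash set and function names below are hypothetical).

```python
# Illustrative sketch only -- the thread doesn't describe Meta's mechanism.
# Generic approach: fingerprint known-bad media and refuse uploads that
# match the blocklist. Real systems typically add perceptual hashing so
# re-encoded copies of the same video still match.
import hashlib

BLOCKED_VIDEO_HASHES: set[str] = set()  # hypothetical, filled by investigators


def fingerprint(video_bytes: bytes) -> str:
    """Exact-match fingerprint of the raw file."""
    return hashlib.sha256(video_bytes).hexdigest()


def should_block_upload(video_bytes: bytes) -> bool:
    """True if this exact file was previously flagged for removal."""
    return fingerprint(video_bytes) in BLOCKED_VIDEO_HASHES
```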
Non-state actors focused on Ukraine too. We took down a comeback attempt by an actor running two websites that posed as news outlets; those sites were linked to a network we disrupted in December 2020.

about.fb.com/wp-content/upl…
As is typical for critical world events, spammers focused on the invasion too, using war footage to grab people’s attention and make money off it.

Automated and manual systems have caught thousands of those.
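As a hypothetical illustration of the kind of signal an automated system could use (not Meta’s actual pipeline), the sketch below flags accounts whose recent posts are mostly war-themed and mostly push traffic to link-shortener or monetisation domains; every term, domain list and threshold in it is made up.

```python
# Illustrative heuristic only -- not Meta's detection pipeline.
# Flags accounts whose recent posts are mostly war-themed AND mostly push
# traffic to link-shortener / monetisation domains. All lists and
# thresholds are hypothetical examples.
from dataclasses import dataclass
from urllib.parse import urlparse

WAR_TERMS = {"ukraine", "invasion", "war footage", "airstrike"}
MONETISED_DOMAINS = {"bit.ly", "cutt.ly", "linktr.ee"}  # hypothetical


@dataclass
class Post:
    text: str
    links: list[str]


def looks_like_engagement_farming(posts: list[Post],
                                  min_war_share: float = 0.6,
                                  min_link_share: float = 0.5) -> bool:
    """Crude proxy for war-themed engagement farming on one account."""
    if not posts:
        return False
    war_hits = sum(
        any(term in p.text.lower() for term in WAR_TERMS) for p in posts
    )
    link_hits = sum(
        any(urlparse(u).netloc.lower().removeprefix("www.") in MONETISED_DOMAINS
            for u in p.links)
        for p in posts
    )
    return (war_hits / len(posts) >= min_war_share
            and link_hits / len(posts) >= min_link_share)
```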
Finally, we also took down a group in Russia that tried to mass-report people in Ukraine and Russia in order to silence them. Likely in an attempt to conceal their activity, they coordinated in a group that was ostensibly focused on cooking.
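What makes this kind of coordination detectable at all is the overlap between the reporters and the group they organised in. Here’s a rough, hypothetical sketch of that idea, not how Meta actually detects mass reporting: flag a target that receives a burst of reports in a short window when most of the reporters share membership of a single group.

```python
# Rough illustrative sketch -- not how Meta detects mass reporting.
# Flag a target when a burst of reports arrives in a short window and most
# of the reporters share membership of one group. Inputs are hypothetical.
from collections import defaultdict
from datetime import timedelta


def flag_coordinated_reporting(reports, group_members,
                               window=timedelta(hours=1),
                               min_reporters=20, min_shared_share=0.8):
    """reports: iterable of (reporter_id, target_id, timestamp).
    group_members: dict of group_id -> set of member account ids."""
    by_target = defaultdict(list)
    for reporter, target, ts in reports:
        by_target[target].append((ts, reporter))

    flagged = []
    for target, events in by_target.items():
        events.sort()
        start, hit = 0, None
        for end in range(len(events)):
            # shrink the window so it spans at most `window` of time
            while events[end][0] - events[start][0] > window:
                start += 1
            reporters = {r for _, r in events[start:end + 1]}
            if len(reporters) < min_reporters:
                continue
            # do most of these reporters share one group?
            for gid, members in group_members.items():
                if len(reporters & members) / len(reporters) >= min_shared_share:
                    hit = (target, gid, len(reporters))
                    break
            if hit:
                break
        if hit:
            flagged.append(hit)
    return flagged
```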
It’s worth making a bigger point here: we as a society have come a long way since 2014. The info ops research community is far more developed than it was back then, when there were just a handful of us doing this research.

On platform, we caught the recidivist ops early.
In the #OSINT community, there's more forensic expertise than ever.

Look at #BBCRealityCheck, @malachybrowne / @NYTimes visual investigations, @ElyseSamuels / @washingtonpost visual forensics, old friends like @bellingcat, @dfrlab, @graphika_nyc, @stanfordio, @EUdisinfolab...
That matters hugely. Threat actors will always try: it’s their job. But having a skilled community out there that can detect deception and influence ops fast makes for a much less conducive environment.

More from @benimmo

Feb 28
🚨 TAKEDOWN 🚨
This weekend, we took down a relatively small influence operation that had targeted Ukraine across multiple social media platforms and websites. It was run by people in Russia and Ukraine: about.fb.com/news/2022/02/s…
It consisted of approx 40 accounts, Groups and Pages on FB and IG, plus accounts on Twitter, YouTube, VK, OK and Telegram.

It mainly posted links to long-form articles on its websites, without much luck making them engaging. It got very few reactions, and under 4k followers.
It ran a few fake personas posing as authors. They had fake profile pics (likely GAN-generated) and unusually detailed public bios - e.g. former civil aviation engineer, hydrography expert.

The op posted its articles on its websites and social media, and amplified them using more fakes.
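One widely reported tell for GAN-generated faces (e.g. StyleGAN output) is that the eyes sit at almost the same pixel coordinates in every image, whereas real profile photos vary. The toy sketch below assumes the eye coordinates come from some external face-landmark detector; the input format and thresholds are hypothetical.

```python
# Toy sketch only -- not a production detector.
# StyleGAN-style fake faces are widely reported to place the eyes at
# near-identical pixel positions across images, while real profile photos
# vary. `landmarks` is hypothetical input from any face-landmark detector:
# one dict per image, e.g. {"left_eye": (x, y), "right_eye": (x, y)}.
from statistics import pstdev


def eye_position_spread(landmarks):
    """Average std-dev of eye coordinates across same-resolution images."""
    coords = [(*lm["left_eye"], *lm["right_eye"]) for lm in landmarks]
    return sum(pstdev(axis) for axis in zip(*coords)) / 4


def looks_gan_generated(landmarks, max_spread_px=3.0, min_samples=5):
    """Suspiciously low spread across many photos suggests GAN output."""
    return (len(landmarks) >= min_samples
            and eye_position_spread(landmarks) <= max_spread_px)
```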
Feb 27
Personal 🧵 based on years of OSINT research into influence operations since 2014.

Looking at the Russian official messaging on “de-nazification” and “genocide”, it’s worth putting it in the context of the many different Russian IO that have targeted Ukraine over the years.
Way back in 2014, Russian military intel ran a series of fake “hacktivist” personas that targeted Ukraine. Note the “Nazi” theme.

Screenshots from @Graphika_NYC research, based on Facebook takedown.
about.fb.com/news/2020/09/r…
public-assets.graphika.com/reports/graphi…
Still in 2014, one of the busiest days the Internet Research Agency had on Twitter was when it falsely accused Ukraine of shooting down flight MH-17 as a “provocation”.
Screenshot from @DFRLab /Twitter archives.
transparency.twitter.com/en/reports/inf…
medium.com/dfrlab/trolltr…
Jan 20
JUST OUT: Report on coordinated inauthentic behaviour takedowns in December, and a look back over the past year & more.

Interesting: 2/3 of all ops we removed since 2017 were wholly or partially focused on domestic audiences.

about.fb.com/news/2022/01/d…
We took down three operations last month:

* Iran, targeting the UK, focusing on Scottish independence;
* Mexico, a PR firm targeting audiences across LATAM;
* Turkey, targeting Libya, and linked to the Libyan Justice and Construction Party (affiliated w/Muslim Brotherhood).
It’s not the first time an Iranian op has posed as supporters of Scottish independence.
In the past, FB found a page that copied and posted political cartoons about independence as far back as 2013.
@Graphika_NYC writeup here (pages 26-27)
graphika.com/reports/irans-…
Dec 1, 2021
JUST OUT: Adversarial threat report on brigading, mass reporting and coordinated inauthentic behaviour.

With a deep dive into the Chinese operation that created a fake “Swiss biologist” back in July.

I think of that one as Operation Swiss Rôle.

about.fb.com/news/2021/12/m…
There’s a lot here:

* Expanding the CrowdTangle IO archive to more researchers
* First public takedowns of brigading & mass reporting networks
* CIB takedown from Palestine (Hamas)
* Two CIB ops focused on Poland / Belarus migrant crisis (one from Belarus KGB)
* Op Swiss Rôle
First, deep dive: in July, a fake “Swiss biologist” persona on FB and Twitter accused the US of bullying the WHO over COVID origins, and was picked up by Chinese state media with amazing speed.

H/t @mradamtaylor and @BBCTrending for their reports.

washingtonpost.com/world/2021/08/…
Nov 20, 2021
I appreciate this discussion because it helps shine a light on the complexity of these problems. Two things to note as we all work to tackle inauthentic behavior & deception. 🧵

1. There’s a big behavioral difference between spammy amplification and complex IO;

2. Platforms traditionally approach each differently for a reason — each represents different behaviours and has a different incentive structure.
Boosting shares and likes is a numbers game to make content look more popular than it is. It can be used on political content or fake sunglasses (or both).

Either way, it’s on the simpler end of the spectrum.
@markhansontoo discussed it last year about.fb.com/wp-content/upl…
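As a toy illustration of why boosted numbers are often cheap to spot (this is not any platform’s actual signal), one giveaway is engagement dominated by throwaway accounts: very new, with almost no prior activity. The function and thresholds below are hypothetical.

```python
# Toy illustration only -- not any platform's actual ranking signal.
# Boosted "numbers game" engagement is often dominated by throwaway
# accounts: very new, with almost no prior activity on the platform.
from datetime import timedelta


def boosted_engagement_share(engagers, now,
                             max_age=timedelta(days=30),
                             max_prior_actions=3):
    """engagers: list of (account_created_at, prior_action_count).
    Returns the fraction of engagement from low-history accounts."""
    if not engagers:
        return 0.0
    low_history = sum(
        1 for created_at, prior_actions in engagers
        if now - created_at <= max_age and prior_actions <= max_prior_actions
    )
    return low_history / len(engagers)

# A reviewer might flag a post when this share exceeds, say, 0.7.
```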
Nov 1, 2021
🚨 JUST OUT: We took down a troll farm in Nicaragua, run by the Nicaraguan government and the FSLN party.
Our team’s research here:
about.fb.com/news/2021/11/o…
Important terminology point: over the years, I’ve seen some confusion over what constitutes a “troll farm”, as opposed to clickbait/content farms.

Here’s how we understand it.
Two things to note on this operation:

1) This was the closest thing to a “whole-of-government” operation we’ve seen.

2) The troll farm lived across the internet: its own media websites built on WordPress and Blogspot, amplified on FB, IG, TikTok, Twitter, Telegram, YouTube, etc.
