I appreciate this discussion because it helps shine a light on the complexity of these problems. Two things to note as we all work to tackle inauthentic behaviour & deception. 🧵

1. There’s a big behavioural difference between spammy amplification and complex IO;

2. Platforms traditionally approach each differently for a reason: each involves different behaviours and a different incentive structure.
Boosting shares and likes is a numbers game to make content look more popular than it is. It can be used on political content or fake sunglasses (or both).

Either way, it’s on the simpler end of the spectrum.
@markhansontoo discussed it last year: about.fb.com/wp-content/upl…
The best way to tackle a numbers game like this is to counter it at scale via auto detection & removal — no matter the content.

Some accounts will always slip through the net, but billions don’t (FB reports that quarterly). Fakers adapt; the challenge is to keep adapting too.
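
To make “counter it at scale” concrete, here’s a minimal, hypothetical sketch of content-agnostic detection: flag accounts whose engagement volume is a statistical outlier and is spread across many unrelated targets. The signals, field names and thresholds here are illustrative assumptions, not how Facebook or any other platform actually does this.

```python
# Toy sketch (illustrative only, not any platform's real system):
# content-agnostic detection of spammy amplification by flagging accounts
# whose engagement behaviour is a statistical outlier, whatever the content.
from dataclasses import dataclass
from statistics import mean, stdev
from typing import List

@dataclass
class AccountStats:
    account_id: str
    likes_per_day: float       # hypothetical behavioural signals
    shares_per_day: float
    distinct_targets: int      # how many different pages/posts the account boosts

def flag_amplifiers(accounts: List[AccountStats], z_threshold: float = 3.0) -> List[str]:
    """Return IDs of accounts whose engagement volume sits far above the population norm."""
    volumes = [a.likes_per_day + a.shares_per_day for a in accounts]
    mu, sigma = mean(volumes), stdev(volumes)
    flagged = []
    for acct, vol in zip(accounts, volumes):
        z = (vol - mu) / sigma if sigma else 0.0
        # High raw volume spread across many unrelated targets is the
        # "numbers game" signature described above.
        if z > z_threshold and acct.distinct_targets > 50:
            flagged.append(acct.account_id)
    return flagged
```

The point of the behaviour-over-content design choice is that the same detector works whether the boosted post is about politics or fake sunglasses.
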
Now back to complex IO: as far as I know, our research community hasn’t seen a successful influence operation or foreign interference campaign built solely on fake likes and shares in years.

Efforts like that are basically one-trick ponies.
The IO we’ve seen recently have typically combined many different tactics.

Fake likes/shares can be one of the (less effective) ones, but there are so many others, e.g. backstopped cross-platform personas, fake media brands, or co-opting real people: about.fb.com/news/2021/05/i…
As @davidagranovich has already pointed out, it’s really important not to conflate the different kinds of inauthentic activity.

Even within influence ops, it’s important to assess their likely impact based on the available evidence.

This handy scale is designed to estimate the impact of IO based on observed evidence, categories 1 → 6.

If an op stays on one platform/community, it’s a category 1. It goes up if it reaches more communities, platforms, mainstream media or celebrities.

brookings.edu/research/the-b…
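
As a rough illustration, here’s a simplified sketch of the scale’s logic, using only the criteria mentioned in this thread plus a top rung for a real-world policy response or call to violence, which is my gloss on the paper rather than anything stated above. The field names and thresholds are hypothetical.

```python
# Simplified sketch of the Breakout Scale logic described above. The field
# names and rung order are a paraphrase of the thread and the Brookings paper,
# not an official implementation.
from dataclasses import dataclass

@dataclass
class ObservedReach:
    platforms: int                     # distinct platforms where the op's content appeared
    communities: int                   # distinct audience communities it reached
    mainstream_media_pickup: bool      # picked up or amplified by mainstream outlets
    celebrity_amplification: bool      # amplified by celebrities or other high-profile figures
    policy_response_or_violence: bool  # prompted a policy response or a call to violence

def breakout_category(reach: ObservedReach) -> int:
    """Return the highest category (1-6) supported by the observed evidence."""
    if reach.policy_response_or_violence:
        return 6
    if reach.celebrity_amplification:
        return 5
    if reach.mainstream_media_pickup:
        return 4
    if reach.platforms > 1 and reach.communities > 1:
        return 3
    if reach.platforms > 1 or reach.communities > 1:
        return 2
    return 1  # stayed on one platform and one community
```
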
Always remember: some influence ops seek to overstate their own impact (think of the IRA in 2018).

The more we can scrutinise whether they *actually* got anywhere, based on the evidence, the stronger our defender community will be.


More from @benimmo

1 Nov
🚨 JUST OUT: We took down a troll farm in Nicaragua, run by the Nicaraguan government and the FSLN party.
Our team’s research here:
about.fb.com/news/2021/11/o…
Important terminology point: over the years, I’ve seen some confusion over what constitutes a “troll farm”, as opposed to clickbait/content farms.

Here’s how we understand it.
Two things to note on this operation:

1) This was the closest thing to a “whole-of-government” operation we’ve seen.

2) The troll farm lived across the internet: its own media websites built on WordPress and Blogspot, amplified on FB, IG, TikTok, Twitter, Telegram, YouTube, etc.
10 Aug
JUST OUT: In-depth report on the #Fazze case — a campaign from Russia targeting primarily India and LATAM, and to a lesser extent the US.
It was focused on the Pfizer and AstraZeneca COVID-19 vaccines, but got close to zero traction across the internet.
about.fb.com/news/2021/08/j…
There’s already been reporting on the Pfizer phase, in May (h/t @daniellaufer, @toniodaoust, @FloraCarmichael, @charliehtweets, @arawnsley).

Our investigation uncovered that in December, the same op targeted AstraZeneca.
We attributed this operation to Fazze, a marketing firm primarily operating from Russia.
8 Jul
JUST OUT: Our monthly report on Coordinated Inauthentic Behaviour takedowns - June 2021 edition.

Eight networks, seven countries.

about.fb.com/news/2021/07/j…
Full details in the report, but a couple of thoughts here.

All but one of the networks focused on domestic targets. That’s not unusual: influence operations so often start at home — remember our recent IO Threat Report?
Historically, even some operations that became (in)famous for foreign interference started domestically.

E.g. the early Russian IRA posted critical commentary about Navalny back in 2013, often on LiveJournal (h/t @soshnikoff)

mr-7.ru/articles/90769/
6 May
JUST OUT: 9 takedowns in our April CIB report. Primarily domestic ops:

👉Palestine, linked to Fatah;
👉Azerbaijan, linked to individuals associated with the defence ministry;
👉Central African Republic, linked to a local NGO;

(More in next tweet...)

about.fb.com/news/2021/05/a…
👉Mexico, 1 network linked to local election campaigns, 1 linked to a local politician and a PR firm;
👉Peru, 1 linked to a local party and an advertising firm, 1 linked to a marketing entity;
👉Ukraine, 1 linked to people associated with the Sluha Narodu party,

And...
👉Ukraine, 1 network linked to individuals and entities sanctioned by the US Treasury — Andrii Derkach, Petro Zhuravel, and Begemot-linked media + political consultants associated with Volodymyr Groysman and Oleg Kulinich.

Deep dive in the report. about.fb.com/news/2021/05/a…
3 Mar
Five takedowns for CIB from the @Facebook investigative team last month.

Thai military, domestic targeting
Iran, targeting Iraq, Israel, Afghanistan, UK
Iran, domestic + regional
Morocco, domestic focus
Russia, targeting the Navalny protests

Link: about.fb.com/news/2021/03/f…
A range of behaviours here. Influence ops take many forms.

Fake a/cs posting to multiple pages to make content look popular
In-depth personas to seed geopolitical content
Large numbers of fakes to spam hashtags and geotags
GAN-generated faces, in bulk, but sloppily done.
First, the Thai Military’s Internal Security Operations Command.

About 180 assets, especially active in 2020, posting news and current events alongside pro-military, pro-monarchy and anti-separatist content.

Stock profile pics, some posing as young women.

Found by internal investigation.
5 Feb
Some personal news: today’s my last day at @Graphika_NYC.

My team did amazing investigative work and research into influence ops from Russia, Iran, China and many other places.

We’ve broken new ground, and I couldn’t be more proud of the team @camillefrancois and I built.
Next week, I’m starting at Facebook, where I’ll be helping to lead global threat intelligence strategy against influence operations.

I’m very excited to join one of the best IO teams in the world to study, catch and get ahead of the known players and emerging threats.
As a community - platforms, researchers and journalists - we’ve all come a long way since the dawn of this field of research.
