1/ Today we announced 10 CIB takedowns: 6 networks we removed during September and 4 that we removed as recently as this morning. We had already announced most of the September networks. about.fb.com/news/2020/10/r…
2/ More than half of these 10 networks targeted domestic audiences in their own countries, and many were linked to groups and individuals associated with politically affiliated actors in each country — the US, Myanmar, Russia, Nigeria, the Philippines and Azerbaijan.
3/ Half of the takedowns in this report began with our own internal investigations; the other half were based on information published or shared by external groups, including the FBI and investigative reporters.
4/ A few specific thoughts on these networks (there were a lot of enforcements this month, so this is a long one!):
5/ The US-based operation used the same deceptive techniques that we have previously seen from foreign interference operations. We've seen the same techniques from other domestic networks around the world over the past 3 years. We enforced here based on the deceptive behavior, not the content.
6/ I want to call out the excellent investigative reporting by @isaacstanbecker of the Washington Post, who first identified pieces of this operation. Building on his work, our teams conducted their own investigation to identify the full scope of the violating behavior. washingtonpost.com/technology/202…
7/ After we completed our analysis, we shared information with Stanford’s Internet Observatory, which wrote a detailed analysis of this takedown: cyber.fsi.stanford.edu/io/news/oct-20…
8/ We’ve banned Rally Forge, a marketing firm working on behalf of clients including Turning Point USA and Inclusive Conservation Group, from our platforms, and we’ve removed all assets we have seen engaged in the deception.
9/ Enforcements like these rely on on-platform evidence. We will continue to investigate, and if we find additional deceptive behavior, we will take action on that as well.
10/ This raises the question of what consequences should apply to organizations that hire firms to engage in deception but don’t directly engage in it themselves. We enforce where we see on-platform evidence of deception, but this is an important question that goes beyond any single company.
11/ My team has been thinking about this problem for some time, and today we are also sharing several recommendations for how legislation or regulation could be most effective at tackling IO. about.fb.com/news/2020/10/r…
12/ One interesting detail: although this activity goes back to 2018, many of the operation’s fake accounts were blocked by our automated systems throughout that period. Perhaps because of this enforcement, they recently changed tactics to also use “thinly veiled personas.”
13/ These are accounts that mix some elements of the real people behind them with fake details. Imagine a fake ID with an accurate picture but a different name, or vice versa. Individual operators often controlled several of these thinly veiled persona accounts.
14/ While this change may have helped them evade our automated systems, it also exposed them to our expert investigators and to @isaacstanbecker, the investigative journalist who first noticed this behavior.
15/ This highlights the rock-and-a-hard-place problem that we increasingly see threat actors face: they can try to hide from our automated systems, but doing so exposes them to our expert investigators and to external researchers.
16/ The Azerbaijan takedown is an example of another tactic we have seen before: actors using Pages to act as fake profiles. They primarily commented on posts, trying to create the impression of broad support for their issues. Our team linked this to the government of Azerbaijan.
17/ This takedown came from our internal investigations, in particular the work of one of our fake-engagement researchers. Many teams across FB then worked to build out the investigation, map the full scope of the deceptive behavior, and ensure we could take action and share it publicly.
18/ The network in Nigeria targeted domestic audiences. We found links to the Islamic Movement in Nigeria as we investigated suspected CIB in the region, as well as limited links to a network we removed in March 2019 (about.fb.com/news/2019/03/c…).
19/ Finally, the Myanmar takedown targeted public debate within Myanmar, including some activity around the upcoming election. Our investigators found links to members of the Myanmar military.
20/ This is the 7th CIB takedown in Myanmar since 2018, and we’ll continue hunting for and exposing deception as we find it in Myanmar and elsewhere in the world.
21/ We began this investigation after learning of local public reporting about some elements of this activity, as part of our proactive work ahead of the election in Myanmar.
