1/ This is excellent and much-needed analysis. A 🧵
2/ At FB we look at factors like the length of an op and how many followers it has, but these have their limits -- especially as tactics evolve. An op can be long-lived but unsuccessful, or reach only a few influential people and still generate significant reaction.
3/ Defining the goal of an IO campaign is the first (and sometimes hardest) step -- hard to measure how well they did if you don't understand their goal.
4/ Good that the report distinguishes between "long-term" & "short-term" ops. I call these "fast-twitch" & "slow-twitch": ops designed to appear shortly before a civic event with fast, significant impact v ops that build for months/years to broadly shift societal perspectives.
5/ A few callouts from the research: (1) We have a 🍩problem. Many CIB campaigns run for months before being shut down, but research focuses only on years-long & days-long campaigns. So there's a gap right where many of the ops we see in the wild operate.
6/ (2) IO is 🌎🌍🌏. Most research focuses on foreign interference from Russia, but Russia is only one of many gov'ts & non-gov'ts that use IO. Tactics vary across actors, and goals almost certainly do as well. Tackling a global, diverse problem space requires global understanding.
7/ (3) Of the research into IO on social media, much "focused solely on FB & Twitter." IO is increasingly multi-platform, & its impact relies on the interplay between mass and social media. Studying ops across multiple platforms & mass media is critical to measuring impact.
8/ (4) While time horizon is important, it's key to also distinguish between ops looking to affect a specific outcome (e.g., an election) v those looking to degrade trust over time. These are very different goals, and so measuring progress toward them will look very different.
9/ It's great to see this draft paper -- it is well done and much needed! Thank you to @lageneralista and @IOpartnership for enabling this, and to @ESoConflict, @lacourchesne, Isra Thange, & Jacob N. Shapiro for this excellent work!

