Ben Nimmo
3 Nov, 8 tweets, 4 min read
ELECTION THREAD: Today and tonight are going to be a wild time online.

Remember: disinformation actors will try to spread anger or fear any way they can, because they know that people who are angry or scared are easier to manipulate.

Today above all, keep calm.
A couple of things in particular. First, watch out for perception hacking: influence ops that claim to be massively viral even if they’re not.

Trolls lie, and it’s much easier to pretend an op was viral than to make a viral op.

Remember 2018? nbcnews.com/tech/tech-news…
There have been huge improvements in our collective defences since 2016. Teams like @Graphika_NYC, @DFRLab and @2020Partnership; takedowns by @Facebook, @Twitter and @YouTube; tip-offs from law enforcement.

Trolls have to spend more effort hiding.
That doesn't mean the influence ops have stopped, but it does make it harder for them to break through.

If trolls claim to have had massive impact, demand the evidence, and measure it against known operations.

brookings.edu/wp-content/upl…
Second, one thing we’ve seen in elections around the world is false or exaggerated claims of election fraud, designed to de-legitimize the outcome.

Russian assets have been pushing a “2020 fraud” narrative for a long time - but not getting traction.

Again, with claims like that, keep calm. Check the evidence. Ask how easily it could be manipulated. Ask if the claim actually originated today: there have been plenty of cases of repackaged claims that use footage from years ago.

Any election is a complex exercise with millions of moving parts. This one’s more complex than most, and the results will take longer.

That's the window of opportunity that influence ops will try to use to sow false claims, fear and anger.

Don't be taken in.
Disinfo feeds on fear and anger. It has many targets, and many people can spread it unwittingly, especially in hyper-tense times.

If claims of interference or fraud come in, keep perspective. Keep watch. And keep calm.

More from @benimmo

1 Oct
NEW: A Russian operation posed as a far-right website to target U.S. divisions and the election.

Most active on Gab and Parler.
A few months old.
Looks related to the IRA-linked PeaceData (which targeted progressives).

@Graphika_NYC report: public-assets.graphika.com/reports/graphi…
Credit to @jc_stubbs of @Reuters, who tipped us off to this.

A legend in his own byline.

reuters.com/article/us-usa…
This op was based on a website called the Newsroom for American and European Based Citizens, NAEBC.

@Alexey__Kovalev might enjoy this name: it’s close to the Russian swear word “наёбка”.

Just like PeaceData sounded like the Russian epithet “пиздато.”

There’s a theme there.
26 Sep
Having studied IO for longer than I care to remember, one of the most frequent comments I’ve heard, and agreed with, is that we need better ways to assess impact on multiple levels and timescales.

As part of that, we need a way to assess live IO in real time.
This paper suggests a way to approximate impact in the moment, when we don’t have the full picture, including the IO operators’ strategic objectives, or the luxury of taking the time to run polls to measure effect on public sentiment (hard even in normal circumstances).
This field is rapidly developing, but we need to start somewhere. Without clear context and a comparative scale, there's a danger of IO capitalising on fear and confusion to claim an impact they never had.

Remember the midterms in 2018?
25 Sep
One of the biggest challenges with influence ops is measuring their impact.

Here's a way to do it.

Six categories, based on IO spread through communities and across platforms.

Designed to assess and compare ops in real time.

H/t @BrookingsFP.

brookings.edu/research/the-b…
It assesses info ops according to two questions:

1. Did their content get picked up outside the community where it was originally posted?

2. Did it spread to other platforms or get picked up by mainstream media or high-profile amplifiers?
Category One ops stay on the platform where they were posted, and don't get picked up beyond the original community.

Most political spam and clickbait belong here. So does bot-driven astroturfing, like the Polish batch we found with @DFRLab.

medium.com/dfrlab/polish-…
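The two screening questions above work as a simple decision procedure. Here is an illustrative sketch only: the thread defines Category One explicitly, but the scale actually runs to six categories, and the thresholds used for the higher categories below are simplified assumptions, not the report's definitions.

```python
def breakout_category(spread_beyond_community: bool,
                      spread_beyond_platform: bool,
                      mainstream_pickup: bool) -> int:
    """Rough, assumed mapping from the two screening questions
    to a position on the (six-point) breakout scale."""
    if not spread_beyond_community:
        # Category One per the thread: stays on the original platform,
        # never picked up beyond the original community.
        return 1
    if not spread_beyond_platform:
        return 2  # assumed: multiple communities, still one platform
    if not mainstream_pickup:
        return 3  # assumed: multiple platforms, no mainstream pickup
    return 4      # assumed: mainstream media or high-profile amplifiers
```

For example, a bot-driven astroturfing campaign that never leaves its original community would score Category One, however loudly its operators claim impact.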
24 Sep
BREAKING: Multiple platforms took down assets from various Russian info ops today.

The ops did *not* primarily target the US election. They were focused much more on Russian strategic concerns.

@Facebook kicked this off. Reports by @Graphika_NYC and @DFRLab to follow.

about.fb.com/news/2020/09/r…
The FB investigation took down several different sets of inauthentic assets, including Russian military and individuals associated with the IRA.

They have a track record of election interference. Cleaning their assets out before the U.S. election seems… prudent.
The @Graphika_NYC team looked at the Russian military assets. About 300 of them, activity ranging from 2013 to 2020.

It wasn’t one coherent set: more like different clusters at different times and looking in different directions, north, south, east and west.
22 Sep
BREAKING: @Facebook announced a takedown of assets run by individuals in China.

Operation Naval Gazing:

Focus on maritime security and the South China Sea;
Lot of content on the Philippines and Taiwan;
Small, apparently bipartisan volume on the US.

about.fb.com/news/2020/09/r…
Here's the @Graphika_NYC report.

Overall, 155 accounts on FB, 11 pages, 9 groups, 6 Instagram accounts.

2017: mostly Taiwan
2018-19: + Philippines and South China Sea
2020: + US-centric content.

The biggest audience was in the Philippines.

graphika.com/reports/operat…
Operation Naval Gazing was mostly *not* about U.S. domestic politics.

But I suspect that's where the most urgent questions will be.

So...
4 Sep
Latest (and last?) word from the "PeaceData" operation, run by people linked to the Russian Internet Research Agency: after their statement of aggrieved innocence, they're shutting down.

This follows a pattern of earlier Russian troll ops.
For example, the "Blue Man" from Secondary Infektion.

Operated 2014-19. Exposed June 2019 in our @DFRLab investigation, after a tip-off from @Facebook.

Made one last post trolling the exposers, then vanished.

secondaryinfektion.org/report/early-e…
Or "Jenna Abrams," queen of the IRA trolls.

Exposed and taken down, but stood up a new account and posted a surreal blog entry claiming "I wasn't a Russian troll, I was just visiting."
