ELECTION THREAD: Today and tonight are going to be a wild time online.
Remember: disinformation actors will try to spread anger or fear any way they can, because they know that people who are angry or scared are easier to manipulate.
Today above all, keep calm.
A couple of things in particular. First, watch out for perception hacking: influence ops that claim to be massively viral even if they’re not.
Trolls lie, and it’s much easier to pretend an op was viral than to make a viral op.
Again, with claims like that, keep calm. Check the evidence. Ask how easily it could be manipulated. Ask if the claim actually originated today: there have been plenty of cases of repackaged claims that use footage from years ago.
Having studied IO for longer than I care to remember, I've often heard, and agreed with, the comment that we need better ways to assess impact across multiple levels and timescales.
As part of that, we need a way to assess live IO in real time.
This paper suggests a way to approximate impact in the moment, when we don't have the full picture (including the IO operators' strategic objectives) or the luxury of running polls to measure the effect on public sentiment, which is hard even in normal circumstances.
This field is rapidly developing, but we need to start somewhere. Without clear context and a comparative scale, there's a danger of IO operators capitalising on fear and confusion to claim an impact they never had.
Latest (and last?) word from the "PeaceData" operation, run by people linked to the Russian Internet Research Agency: after their statement of aggrieved innocence, they're shutting down.
This follows a pattern seen in earlier Russian troll ops.
For example, the "Blue Man" persona from Secondary Infektion: operated 2014-19, exposed in June 2019 in our @DFRLab investigation after a tip-off from @Facebook.
Made one last post trolling the exposers, then vanished.