2/ We theorize the tradeoffs states face in running disinfo campaigns in-house vs. outsourced. Outsourcing helps states tap digital marketing experts, saves $$ & provides plausible deniability. Running things in-house can have benefits from an operational security perspective.
3/ We test implications of the theory by comparing two 2014-2018 disinfo campaigns on Syria that originated in Russia: the outsourced IRA campaign & the in-house GRU campaign. We look across Instagram, Facebook, & Twitter, leveraging data the platforms turned over to the Senate.
4/ Big picture: The GRU did what it always does, narrative laundering, an approach that lends itself to one-size-fits-all content. The outsourced IRA operation instead tailored content for particular audiences and made that content highly partisan.
5/ Topic findings: The GRU posted more one-size-fits-all content, like battlefield deaths in Syria. Meanwhile, the IRA posted on more partisan topics, e.g. Hillary Clinton’s and Obama’s Syria policies.
6/ And the IRA customized its posts for particular audiences, creating more partisan content than the GRU, particularly on Facebook and Instagram. The figure below illustrates this for one topic, refugees.
7/ We find that the IRA used more clickbait/emotionally resonant language than the GRU, again particularly on Facebook and Instagram.
8/ One of the paper’s unique contributions is the way we compare engagement. It’s not fair to compare direct Facebook engagement for the IRA & GRU. The GRU’s Syria operation was a narrative laundering op. They wanted their content to get re-posted elsewhere on the internet.
9/ The GRU’s Facebook operation primarily involved posting articles from the fake Inside Syria Media Center (ISMC) think tank. We searched sentences from the ISMC articles on Google (thanks @ShelbyAPerkins), and found 634 ISMC articles that were re-posted.
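(Illustration only: the thread doesn’t say how the sentence searches were run, and much of the work may have been manual. Below is a minimal Python sketch of how that step could be automated with Google’s Custom Search JSON API; the API key, search engine ID, and example sentence are placeholders.)

```python
import requests

API_KEY = "YOUR_API_KEY"      # placeholder: Google Custom Search JSON API key
ENGINE_ID = "YOUR_ENGINE_ID"  # placeholder: Programmable Search Engine ID

def find_reposts(sentences, exclude_domains=("facebook.com",)):
    """Search exact sentences from an article and collect URLs of pages
    that quote them verbatim (candidate re-posts)."""
    repost_urls = set()
    for sentence in sentences:
        resp = requests.get(
            "https://www.googleapis.com/customsearch/v1",
            params={"key": API_KEY, "cx": ENGINE_ID, "q": f'"{sentence}"'},
        )
        resp.raise_for_status()
        for item in resp.json().get("items", []):  # "items" is absent when there are no hits
            url = item["link"]
            if not any(d in url for d in exclude_domains):
                repost_urls.add(url)
    return repost_urls

# Hypothetical usage: pass a few distinctive sentences pulled from one ISMC article
candidates = find_reposts(["A distinctive sentence copied from an ISMC article."])
print(len(candidates), "candidate re-post URLs")
```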
10/ We are making public the 1,541 URLs that were re-posts of the GRU ISMC articles: docs.google.com/spreadsheets/d… We have triple-checked this list, but please let us know if you spot any errors. *We believe many of the websites that reposted GRU content did so unwittingly.*
11/ These are the domains that re-posted GRU Syria articles the most:
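(Given the published URL list, the top re-posting domains can be tallied with a few lines of Python. A sketch, assuming the spreadsheet is exported as a CSV; the filename and the "url" column name are hypothetical.)

```python
import csv
from collections import Counter
from urllib.parse import urlparse

def top_reposting_domains(csv_path, url_column="url", n=10):
    """Tally which domains appear most often in the list of re-post URLs."""
    counts = Counter()
    with open(csv_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            domain = urlparse(row[url_column].strip()).netloc.lower()
            if domain.startswith("www."):
                domain = domain[4:]
            counts[domain] += 1
    return counts.most_common(n)

# Hypothetical file/column names for the published spreadsheet export
for domain, count in top_reposting_domains("gru_ismc_reposts.csv"):
    print(f"{domain}\t{count}")
```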
12/ For each re-posted URL, we used CrowdTangle to assess how many times the URL was shared on Facebook, and how much engagement each share got. Even taking those re-post shares into account, the GRU’s operation still got less engagement than the IRA’s.
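(For readers who want to replicate the engagement step: a minimal sketch using CrowdTangle’s /links endpoint, assuming an API token; field names follow CrowdTangle’s documented Links API, and the token and example URL are placeholders.)

```python
import requests

CT_TOKEN = "YOUR_CROWDTANGLE_TOKEN"  # placeholder API token

def facebook_engagement(url):
    """Look up public Facebook posts that shared a URL via CrowdTangle's
    /links endpoint, and total up their interactions."""
    resp = requests.get(
        "https://api.crowdtangle.com/links",
        params={"token": CT_TOKEN, "link": url, "platforms": "facebook", "count": 100},
    )
    resp.raise_for_status()
    posts = resp.json().get("result", {}).get("posts", [])
    # Each post's statistics.actual holds interaction counts (likes, shares, comments, reactions)
    interactions = sum(
        sum(post.get("statistics", {}).get("actual", {}).values()) for post in posts
    )
    return len(posts), interactions

# Hypothetical example URL from the re-post list
shares, total = facebook_engagement("https://example.com/repost-of-ismc-article")
print(shares, "shares,", total, "total interactions")
```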
13/ We hope this research spurs more research comparing in-house v. outsourced influence operations, particularly across platforms. We will be posting replication data shortly.
We hope this journal will become a home for cutting-edge research on how internet services are abused to cause harm, and how to address these harms. These will be our priority areas:
We have an incredible editorial board, with leading researchers from communication, computer science, criminology, law, political science, psychology, and more.
CW: Suicide and self-harm
Last week we published a report on self-harm policies on internet platforms. We gave Reddit a low rating, as we could not find any policies that referenced self-harm or suicide. cyber.fsi.stanford.edu/io/self-harm-p…
Reddit reached out on this, sending us a link to a blog post outlining their approach to self-harm. It is very thoughtful: redditblog.com/2020/03/04/red…
In our ratings, we only counted policies that appeared in the platforms' main policy documents. For example, we did not give Instagram credit for a similarly thoughtful policy outlined in a blog post, as we worried users might not come across it. about.instagram.com/blog/announcem…
🇮🇷🇦🇫 Tonight Facebook announced that they suspended a network that originated in Afghanistan and Iran and targeted Farsi/Dari speakers in Afghanistan. My Stanford Internet Observatory team has a report on this network here: cyber.fsi.stanford.edu/io/news/novemb…
This network was suspended not due to the content of its posts, but rather for coordinated inauthentic behavior; fake profiles were central to the operation.
This operation was novel in that it was oriented toward women, including promoting women’s rights. 53% of the Instagram accounts had profile photos of women (compared to 11% with photos of men), and the network shared stories about the educational success of women.
📑 Today Facebook announced the takedown of a Muslim Brotherhood-linked network. With so many disinfo ops linked to Saudi Arabia/UAE/Egypt, it’s interesting to have a network from the other side. Here is SIO’s report, co-authored with @maffsyy & @k_ramali: cyber.fsi.stanford.edu/io/news/novemb…
This network was suspended not due to the content of its posts, but rather for coordinated inauthentic behavior; fake profiles were central to the operation.
This was a complex cross-platform operation with a substantial audience. The Facebook Pages we looked at had almost 1.5 million followers. There were Twitter accounts & YouTube & Telegram channels. Here are accounts linked to one anti-UAE Page:
🇸🇦Today Twitter announced the takedown of 33 accounts linked to the government of Saudi Arabia. Buckle up for this one 🎢 it’s not your standard “Qatar is the worst” Saudi disinfo operation. Here’s our report: cyber.fsi.stanford.edu/news/twitter-t…
The network had “Royal Sockpuppets”, 👑🧦 fake accounts for real dissident Qatari Royals living in Saudi. The biggest account, pretending to be Fahad bin Abdullah Al-Thani, had >1mil followers. There were also accounts pretending to be an exiled Qatari interim govt.
How did these accounts get such big followings? It’s hard to say, for two reasons. First, many of the accounts engaged in handle switching. The now-suspended @QtrGov was not always @QtrGov - its mentions only go back a few months even though it has existed for years.
🇳🇬Today Facebook announced the removal of a network of accounts run by the Islamic Movement in Nigeria. My Stanford Internet Observatory team analyzed the network before it was taken down. Our report: cyber.fsi.stanford.edu/news/islamic-m…
The network was suspended not because of the content of the posts, but rather because the Facebook Pages and Groups were run by fake accounts. Facebook calls this coordinated inauthentic behavior.
The Facebook Pages and Groups advocated for the release of IMN leader Sheikh Ibrahim El-Zakzaky.