Russia's efforts to interfere in the 2016 election have been widely documented by news media and investigators. But researchers' understanding of exactly how influential these campaigns were has been limited by a lack of data. 2/
To investigate the relationship between Russia’s Twitter campaign and political attitudes, we paired results from a 3-wave longitudinal survey of US respondents (conducted by YouGov) with a collection of respondents' Twitter timelines during the 8 months prior to Election Day. 3/
We then used data released by Twitter to identify which posts in the respondents' timelines had originated from Russia's Internet Research Agency campaign, which aimed to reach voters in the lead-up to the 2016 election. 4/
Based on our analysis, it appears unlikely that the Russian foreign influence campaign *on Twitter* could have had much more than a relatively minor influence on individual-level attitudes and voting behavior for four related reasons. 5/
1⃣ Exposure to Russian coordinated influence accounts was heavily concentrated among a small portion of the electorate -- 1 percent of users accounted for 70 percent of exposures. See Panel B of Figure 1👇.
6/
2⃣ Exposure to Russian foreign influence tweets was significantly eclipsed by content from domestic news media and politicians.
On avg, in 10/16, respondents saw:
4 posts/day from Russian foreign influence accts
106/day from national news media
35/day from politicians
7/
3⃣ 98% of exposures were concentrated among just 10% of respondents. Those who identified as "Strong Republicans" were exposed to roughly 9 times as many posts from Russian foreign influence accounts as those who identified as Democrats or Independents. 8/
4⃣ Finally, we did not detect any relationships between exposure to posts from Russian foreign influence accounts and changes in respondents' attitudes, polarization, or voting behavior. 9/
Despite these results, it would be a mistake to conclude that, simply because Russia's Twitter campaign wasn't meaningfully related to individual-level attitudes, it had no impact on the election, or on faith in American electoral integrity. 10/
Indeed, debate about the 2016 US election continues to raise questions about the legitimacy of the Trump presidency & the electoral system, which in turn may be related to Americans' willingness to accept claims of voter fraud in the 2020 election. 11/
Such beliefs may stem from speculation that Russian interference on social media influenced the election outcome. In other words, Russia's social media campaign may have had its largest effects *indirectly* by convincing Americans that its campaign was successful. 12/
Nevertheless, our results hopefully provide an important corrective to the view that Russia's foreign influence campaign on social media easily manipulated the attitudes & voting behavior of ordinary Americans. 13/
📢📣New publication @ScienceAdvances on echo chambers! Most users don't follow political elites on Twitter. But those who do show overwhelming preferences for ideological congruity.
Social media facilitates exposure to different perspectives & diverse people. It may also lead to insular online communities, where users only find info consistent w/their views. It is feared that such “echo chambers” fuel extremity & exacerbate inter-party hostility.
2/
Evidence on the prevalence of these political biases in social media behavior is inconclusive. Here, we examine this topic by using 4 yrs of data from 1.5m Twitter users to study how they follow and engage with content from political elites.
3/
📢We’ve got a new @CSMaP_NYU paper in @journalsafetech
Fraud & conspiracy narratives proliferated on social media around the 2020 election. Our analysis finds YouTube was more likely to recommend fraud videos to users already skeptical about the legitimacy of the election. 🧵👇1/
There's growing scholarly consensus that social media algorithms have little influence on online echo chambers, in which users only see content reaffirming pre-existing views. But what if that content is undermining democratic confidence? 2/
In our study, we sampled more than 300 Americans w/YouTube accounts in Nov/Dec 2020. Subjects were asked how concerned they were about election fraud and asked to install a browser extension that recorded the recommendations they were shown.
3/
📣A Senate Judiciary subcommittee will hold a hearing at 2pm today on platform transparency/understanding the impact of social media.
The conversation about social media data access should start with what questions we want to answer with those data.🧵1/ judiciary.senate.gov/meetings/platf…
At @CSMaP_NYU, we've been researching social media and politics for more than a decade. But from the beginning, we've done so with one hand tied behind our backs because social media companies tightly control the data necessary to study the platforms' impact.
2/
As a result, we have limited understanding of the scale/character/causes of various phenomena attributed to the rise of social media. Instead, we’ve turned to alternative methods (surveys, experiments, scraping) to try to glimpse what internal analysts can easily see.
3/
After weeks of war, much has been written about the successes and failures of Russia’s propaganda. It's worth stepping back to consider the various audiences for Russia’s disinformation and examine where it's working and where it's not. 🧵
First, where it's not: Ukraine & the West. Putin has spread a long list of lies -- denying Ukrainian statehood, claiming Nazis rule the country. Now the Kremlin says Ukrainians are bombing their own people. These falsehoods quickly fell apart, largely because true info won out.
2/
The Biden admin prebunked Russian lies. Social media, satellite imagery, and 24/7 reporting have directly refuted Russian disinformation in real time while sharing heart-wrenching stories of citizens caught in the crossfire.
3/
Important piece in @nytimes today about new tactics from Russian gov't to deal with online opposition. A short 🧵 on a conceptual framework for understanding these developments
In a 2018 Comparative Politics article with @SergeySanovich & @DenisStukal, we introduced a three-part framework for thinking about how non-democratic regimes respond to online opposition.
2/
1) Regimes can have offline responses to online opposition. This can involve arresting or physically threatening people who post online, but can also involve going after the companies that provide online services by changing legal/regulatory rules.
3/