Analyzing social media posts from news accounts and politicians (n = 2,730,215), we found that the biggest predictor of "virality" (out of all predictors we measured) was whether a social media post was about one's outgroup.
Specifically, each additional word about the opposing party (e.g., “Democrat,” “Leftist,” or “Biden” if the post came from a Republican) in a social media post increased the odds of that post being shared by 67%.
Negative and moral-emotional words also slightly increased the odds of a post being shared, positive words slightly decreased the odds, and in-group words had no effect.
Out-group words were by far the strongest predictor of virality that we measured.
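To give a rough sense of how a per-word effect like this compounds, here is a minimal sketch assuming a logistic-regression-style model in which each out-group word multiplies a post's odds of being shared by 1.67 (the odds ratio implied by a 67% increase). The baseline odds value is hypothetical, chosen only for illustration.

```python
# Sketch: compounding a per-word odds ratio of 1.67, as in a
# logistic-regression-style model. Baseline odds are hypothetical.

def share_probability(baseline_odds, outgroup_words, odds_ratio=1.67):
    """Multiply the baseline odds by the odds ratio once per
    out-group word, then convert odds to a probability."""
    odds = baseline_odds * odds_ratio ** outgroup_words
    return odds / (1 + odds)

# Hypothetical baseline odds of a share (not from the paper).
base = 0.05
for n in range(4):
    p = share_probability(base, n)
    print(f"{n} out-group words -> share probability {p:.3f}")
```

The point of the sketch is that a per-word odds ratio is multiplicative: three out-group words imply odds roughly 1.67³ ≈ 4.7 times the baseline, not a 3 × 67% additive bump.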
Posts about the outgroup were almost exclusively negative (see examples below).
Out-group posts were very likely to receive “angry” reactions on Facebook, as well as “haha” reactions (likely indicating mockery), comments, and shares.
Posts about the ingroup received much less overall engagement, although they were slightly more likely to receive “love” and “like” reactions, reflecting in-group favoritism.
In other words, out-group negativity was a stronger driver of virality than in-group positivity.
Indeed, the “angry” reaction was the most commonly used of Facebook’s six reactions in our datasets.
This out-group effect was not moderated by political orientation or by social media platform. However, the effect was stronger for politicians than for media accounts.
These results are troubling in an attention economy where the social media business model is based on keeping us engaged in order to sell advertising.
This business model may be creating perverse incentives for polarizing content, rewarding people for "dunking" on the outgroup.
As an illustration of these perverse incentives, Facebook recently declined to implement features to reduce the amount of harmful content in the news feed because these features also made people open Facebook less.
Why do some ideas spread widely, while others fail to catch on?
@Jayvanbavel and I review the “psychology of virality,” or the psychological and structural factors that shape information spread online and offline.
Thread 🧵(1/n)
While studies suggest that outrage and negativity go viral online, social media may not be so unique:
-Negative gossip and word-of-mouth marketing are also likely to spread.
-Negativity went “viral” in early newspapers and books.
Similar to how some viruses are more “contagious” than others, some forms of information appear to be more contagious than others across contexts.
The information-as-virus metaphor can be extended even further:
It is unclear whether belief in (mis)information is driven by a lack of knowledge or a lack of motivation to be accurate.
To help answer this question, we experimentally manipulated people’s motivations to see how this affected their judgments of news headlines.
We found that providing people with small financial rewards of up to $1 improved their performance at discerning true headlines from false ones.
It also reduced the partisan divide in belief between Republicans and Democrats by 30%.
We found that following, retweeting, or favoriting low-quality news sources – and being central in a US conservative Twitter network – is associated with vaccine hesitancy (n = 2,064).
There has been speculation that an “infodemic” of misinformation on social media is contributing to vaccine hesitancy.
We set out to test how one’s online information diet is associated with vaccine hesitancy by linking survey data to Twitter data.
In Study 1, we looked at various Twitter “influencers” and computed the mean levels of vaccine confidence among participants who followed them in both the United States and the United Kingdom.
Our meta-analysis of all publicly available data on the "accuracy nudge" intervention found that accuracy nudges have little to no effect for US conservatives and Republicans. (1/9)
Replicating prior work, we found that accuracy nudges significantly improved the quality of articles shared for Democrats in nearly all samples, but no significant effects were found for Republicans in *any* of the samples.
In our recent @PNASNews paper, we suggested that Facebook's algorithm change in 2018, which gave more weight to reactions/comments, was rewarding posts expressing out-group animosity.
Recent reporting from the @WSJ finds that @Facebook was aware of this issue.
In our paper, we found that posts about the political outgroup (which tend to be very negative) receive much more overall engagement -- particularly in the form of "angry" reactions, "haha" reactions, comments, and shares.
As shown below, the Facebook algorithm shift gave priority to the kind of engagement that we found was associated with out-group negativity (comments and reactions).