Steve Rathje
Psychology PhD candidate @Gates_Cambridge @TrinCollCam studying political psychology & social media, previously @Stanford Psychology BA 🏳️‍🌈
Mar 6, 2023 16 tweets 6 min read
🚨 Out now in @NatureHumBehav 🚨

Across 4 experiments (n = 3,364), we found that motivating people to be accurate via a small financial incentive:

-Improved people’s discernment between true and false news
-Reduced the partisan divide in belief

nature.com/articles/s4156…

It is unclear whether belief in (mis)information is driven by a lack of knowledge or a lack of motivation to be accurate.

To help answer this question, we experimentally manipulated people’s motivations to see how this impacted their judgements of news headlines.
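The thread does not spell out how "discernment" is scored. A common operationalization in this literature is belief in true headlines minus belief in false headlines; the sketch below is only an illustration under that assumption, not the paper's analysis code, and the data are hypothetical.

```python
# Minimal sketch (assumed scoring, not the paper's code): truth discernment
# is often computed as acceptance of true headlines minus acceptance of
# false headlines, so higher values mean better true/false separation.

def discernment(ratings):
    """ratings: list of (is_true, rated_true) tuples, one per headline."""
    true_items = [rated for is_true, rated in ratings if is_true]
    false_items = [rated for is_true, rated in ratings if not is_true]
    hit_rate = sum(true_items) / len(true_items)        # believed true news
    false_alarm = sum(false_items) / len(false_items)   # believed false news
    return hit_rate - false_alarm

# Example: a participant who accepts 7/8 true and 2/8 false headlines
example = [(True, 1)] * 7 + [(True, 0)] + [(False, 1)] * 2 + [(False, 0)] * 6
print(discernment(example))  # 0.875 - 0.25 = 0.625
```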
Sep 30, 2022 15 tweets 12 min read
🚨 New paper in @PNASNexus 🚨

We found that following, retweeting, or favoriting low-quality news sources – and being central in a US conservative Twitter network – is associated with vaccine hesitancy (n = 2,064).

doi.org/10.1093/pnasne…

There has been speculation that an “infodemic” of misinformation on social media is contributing to vaccine hesitancy.

We set out to test how one’s online information diet is associated with vaccine hesitancy by linking survey data to Twitter data.
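The thread doesn't say which centrality measure or network construction the paper used. As a rough, generic illustration only, one way to relate network position to a survey outcome is to build a directed follower graph and compute a standard centrality score with networkx; the edges and handles below are hypothetical.

```python
# Illustrative sketch (not the paper's pipeline): compute each respondent's
# centrality in a hypothetical follower network, then pair it with a
# hypothetical vaccine-hesitancy survey score for downstream correlation.

import networkx as nx

# Hypothetical (follower, followed) edges among study accounts
edges = [("a", "b"), ("a", "c"), ("b", "c"), ("d", "c"), ("e", "c")]
G = nx.DiGraph(edges)

# In-degree centrality: fraction of other accounts following each node
centrality = nx.in_degree_centrality(G)

# Hypothetical survey scores keyed by Twitter handle
hesitancy = {"a": 2, "b": 3, "c": 5, "d": 1, "e": 4}

pairs = [(centrality[u], hesitancy[u]) for u in hesitancy]
print(pairs)  # feed into a correlation or regression of choice
```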
Jan 4, 2022 10 tweets 6 min read
Now out in @PsychScience:

Our meta-analysis of all publicly available data on the "accuracy nudge" intervention found that accuracy nudges have little to no effect for US conservatives and Republicans. (1/9)

sage.figshare.com/articles/journ…

Our paper (with @roozenbot @CecilieTraberg @jayvanbavel & @Sander_vdLinden) is in response to a recent set of @PsychScience & @Nature papers finding that nudging people to think about accuracy can reduce misinformation sharing:

nature.com/articles/s4158…
journals.sagepub.com/doi/10.1177/09…
Sep 15, 2021 5 tweets 3 min read
In our recent @PNASNews paper, we suggested that Facebook's algorithm change in 2018, which gave more weight to reactions/comments, was rewarding posts expressing out-group animosity.

Recent reporting from the @WSJ finds that @Facebook was aware of this issue.

In our paper, we found that posts about the political outgroup (which tend to be very negative) receive much more overall engagement -- particularly in the form of "angry" reactions, "haha" reactions, comments, and shares.

Jun 23, 2021 18 tweets 8 min read
🚨 Now out in @PNASNews 🚨

Analyzing social media posts from news accounts and politicians (n = 2,730,215), we found that the biggest predictor of "virality" (out of all predictors we measured) was whether a social media post was about one's outgroup.

pnas.org/content/118/26…

Specifically, each additional word about the opposing party (e.g., “Democrat,” “Leftist,” or “Biden” if the post was coming from a Republican) in a social media post increased the odds of that post being shared by 67%.
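A 67% increase per word means an odds ratio of about 1.67 that compounds multiplicatively across words. The snippet below is just back-of-envelope arithmetic to show that compounding; the baseline odds value is made up, not from the paper.

```python
# Back-of-envelope illustration (not the paper's model): each outgroup word
# multiplies the odds of a share by ~1.67, so the effect compounds.

BASE_ODDS = 0.10            # hypothetical baseline odds of a share
PER_WORD_ODDS_RATIO = 1.67  # reported ~67% increase per outgroup word

for n_words in range(5):
    odds = BASE_ODDS * PER_WORD_ODDS_RATIO ** n_words
    prob = odds / (1 + odds)  # convert odds back to a probability
    print(f"{n_words} outgroup words -> odds {odds:.3f}, share prob {prob:.3f}")
```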
May 10, 2021 7 tweets 4 min read
Have you shared fake news on Twitter? I designed an app that will tell you!

It will also tell you how many right-leaning, left-leaning, or hyper-partisan/low-quality news sites you have shared.

Try it out here, and share your score:

newsfeedback.shinyapps.io/HaveISharedFak…

You can also see how many fake/low-quality websites other people with public Twitter handles have shared.

I calculated the fake news "scores" of all US congress-members. Use the app to see which congress-member shares the most low-quality news.
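The app itself is a Shiny web app and its code isn't shown here, but the basic scoring idea is to check the domains a user has shared against lists of rated news sources. Below is a minimal Python sketch of that idea under stated assumptions: the domain list, the URLs, and the scoring function are all hypothetical, and real apps rely on published source-quality ratings.

```python
# Minimal sketch of the scoring idea (not the app's actual code): count how
# many of a user's shared links point at domains on a hypothetical
# low-quality / hyper-partisan list.

from urllib.parse import urlparse

# Hypothetical domain list; real tools use published source-quality ratings
LOW_QUALITY = {"examplefakenews.com", "hyperpartisan-site.net"}

def fake_news_score(shared_urls):
    """Fraction of shared links whose domain is on the low-quality list."""
    domains = [urlparse(u).netloc.lower().removeprefix("www.") for u in shared_urls]
    flagged = sum(d in LOW_QUALITY for d in domains)
    return flagged / len(domains) if domains else 0.0

tweets = [
    "https://www.examplefakenews.com/story1",
    "https://www.nytimes.com/article",
    "https://hyperpartisan-site.net/post",
]
print(fake_news_score(tweets))  # 2 of 3 links flagged, ≈ 0.67
```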