At #BYSTANDER22 @rvinsroom presents on whether we can reduce the spread of online misinfo by assigning user reputations that signal credibility.

Spoiler alert: No! But this null-finding is demonstrated using a cool new interactive open-source experimental platform. 1/7
One problem is that engaging with misinformation carries few consequences. Is it possible to leverage social identity processes? People have a desire to make positive impressions, and these impressions can be curated online. 2/7
It is possible to change social media to disincentivize interaction with false news via reputation score systems: an individual scores points by spreading credible info and loses points by doing the opposite. This can create "necessary friction" between people & info. 3/7
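The scoring logic described above could be sketched roughly like this (a minimal illustration; the class name, starting score, and point weights are my assumptions, not the study's actual parameters):

```python
# Hypothetical sketch of a dynamic reputation score: gain points for
# sharing credible info, lose points for sharing false info.
class ReputationScore:
    def __init__(self, start=100):
        self.score = start

    def record_share(self, is_credible):
        # +5 for a credible share, -5 for a false one (illustrative weights)
        self.score += 5 if is_credible else -5
        return self.score

user = ReputationScore()
user.record_share(True)    # credible share: score rises to 105
user.record_share(False)   # false share: score falls back to 100
```

The point of such a system is that the score is publicly visible, so sharing false news carries a reputational cost rather than being consequence-free.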
To study this, a *mock social media website tool* was created. It looks like real platforms (e.g., Facebook) and hence offers both high ecological validity and high experimental control. The tool is open source, so you can go use it! 4/7
In Study 1, 181 undergrads participated in an experiment either with a reputation system or without one (i.e., the control condition). The reputation system dynamically computed participants' credibility scores on the basis of their behavior. 5/7
In the experimental condition, people liked, shared, and tried to verify more news, but there was no difference between false and real news. So, it doesn't seem to work. 6/7
In Study 2, a representative sample of Canadians participated in a slightly toned-down version of Study 1. Here, they were also provided with media literacy tips (i.e., how to spot "fake news"). Again, alas, there was no effect. 7/7

Thread by Michael Bang Petersen

More from @M_B_Petersen

Jun 10
The final keynote of #bystander22 is by @jayvanbavel on "Morality in the Anthropocene".

We evolved in small-scale groups but are now connected in vast social media networks. What are the consequences? 1/17
As EO Wilson noted: "We have paleolithic emotions, medieval institutions and godlike technology". How can we manage our ancient emotions in the modern technology environment? Contrast with Zuckerberg's notion of "moving fast and breaking things". 2/17
57% of all humans use social media, and the average user spends 2.2 hours per day on it. On average, 5.7 years of your life will be spent on social media. All this has emerged fast. 3/17
Jun 10
At #bystander22 @CecilieTraberg talks about the role of social processes in judgments of misinformation.

Prior research mainly looks at headlines & individual traits. Yet, we do not consume info in a social vacuum. We need to bring the social into the study of misinformation. 1/8
Source cues matter hugely according to work on persuasion. Social identity theory also suggests that factors like group membership are key. 2/8
In Study 1, participants rated the accuracy of fake headlines as well as the slant and credibility of the sources. Is the source linked to the perceived accuracy of the headline? The results show that it is. 3/8
Jun 10
At #BYSTANDER22 @A_Marie_sci presents our research on how moralization of rationality increases (!) motivations to share hostile misinformation.

The spread of hostile misinfo is problematic. But would more valuation of rationality help? That is the research question. 1/8
Rationality is often seen as the antidote to misinformation. But many conspiracy theorists talk a lot about how they advance "facts", "data", "rationality", and "critical thinking" and are not gullible sheeple (like the rest of us). 2/8
Here, we particularly suggest that *moralization* of rationality can be problematic and often may be a form of moral grandstanding. Moral grandstanding is fueled by status-seeking rather than truth-seeking. 3/8
Jun 9
Now the keynote at #BYSTANDER22 by @DG_Rand on the problem of misinformation and how polarization might solve it.

The key question is: How do we fight misinformation at scale? 1/19
Currently, platforms are using technical solutions such as machine learning. But there are limits to these solutions, which often entail that human fact-checkers are brought in. This *does* work: warning labels limit false news. 2/19
The problem with fact-checking is that it doesn't scale. How can we deal with misinformation at scale?

The solution is to turn towards the wisdom of the crowds (i.e., the finding that aggregations of average people's opinions are often very accurate). 3/19
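The aggregation idea can be illustrated with a toy example (the ratings below are made up for illustration, not data from the talk): averaging many individual credibility judgments tends to yield a more accurate estimate than any single rater provides.

```python
from statistics import mean

# Hypothetical 0-1 credibility ratings of one headline from seven raters
ratings = [0.9, 0.7, 0.8, 0.6, 1.0, 0.75, 0.85]

# The crowd estimate is simply the mean of the individual judgments
crowd_estimate = mean(ratings)
print(round(crowd_estimate, 2))  # 0.8
```

Individual errors in opposite directions cancel out in the average, which is why crowd judgments can approach fact-checker quality at much lower cost.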
Jun 9
At #BYSTANDER22 @jrpsau presents our research on how an intervention by @SSTSundhed during the pandemic decreased false news sharing by boosting people's competence in spotting "fake news". 1/5
One intervention often recommended is "accuracy nudges". These assume that people have an intrinsic motivation to be accurate but leave people on their own re: how to spot "fake news".

In risk communication, however, the recommendation is always to give *actionable* advice. 2/5
According to Protection Motivation Theory, actionable advice boosts the feelings of competence and efficacy that drive behavior. 3/5
Jun 9
At #BYSTANDER22 @zeaszebeni presents on the profiles of "fake news" believers in Hungary.

Many different factors shape people's beliefs in disinformation. But most research is variable-centered. Here, a *person-centered* approach is used. 1/7
A person-centered approach focuses on whether different types of disinfo speak to different people. This approach is here applied in the polarized Hungarian context, where the term "fake news" is often used to delegitimize the other side. 2/7
295 participants were recruited. They rated the accuracy of news stories (true and false). Multiple factors related to trust were measured and then cluster analysis was applied. 3/7
