Confirmed: Soft moderation interventions (e.g., fact-check labels & blur filters) reduce engagement with misinformation about the Russia-Ukraine war among Facebook users. This finding held even when accounting for user-specific factors. #Misdoom
The study, authored by @gruzd @PhMai @felipebsoares, is published open access at doi.org/10.1007/978-3-…
Interestingly, the more intrusive intervention of adding a blur filter over a fact-checked image or video did not produce a stronger response than a simple fact-check footnote.
For this study we developed and used ModSimulator, an open-source research tool for testing the effectiveness of soft content moderation.
The tool is available for other researchers to customize at
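To give a flavor of the setup, here is a minimal sketch of how a simulated-feed experiment like this could assign participants to soft-moderation conditions and decorate fact-checked posts. All names below are illustrative assumptions, not ModSimulator's actual API:

```python
# Hypothetical sketch, NOT ModSimulator's actual API: assign each participant
# to one soft-moderation condition and render fact-checked posts accordingly.
import random
from dataclasses import dataclass

CONDITIONS = ("footnote", "blur")  # the two interventions compared in the study

@dataclass
class Post:
    post_id: int
    fact_checked: bool  # whether fact-checkers flagged this post as false

def render_feed(posts, condition):
    """Return display instructions for one participant's simulated feed."""
    feed = []
    for post in posts:
        if post.fact_checked and condition == "blur":
            feed.append((post.post_id, "blur media + fact-check footnote"))
        elif post.fact_checked:
            feed.append((post.post_id, "fact-check footnote"))
        else:
            feed.append((post.post_id, "show as-is"))
    return feed

posts = [Post(i, fact_checked=(i % 3 == 0)) for i in range(6)]
condition = random.choice(CONDITIONS)  # between-subjects assignment
print(condition, render_feed(posts, condition))
```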
We find that, irrespective of the intervention used, other user-level factors also predict engagement w/ false claims.
For example, individuals' preexisting beliefs in pro-Kremlin claims & trust in partisan sites for news re: the Russia-Ukraine war increase engagement w/ false claims.
On the other hand, trust in fact-checking organizations and being an active commenter on Facebook decrease engagement with false claims.
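To make "accounting for user-specific factors" concrete, here is a minimal sketch of how these predictors could enter a logistic regression of engagement with false claims. The data file and column names are assumptions for illustration, not the paper's actual variables or analysis code:

```python
# Hedged sketch (assumed file and column names, not the authors' analysis code):
# regress engagement with false claims on the intervention condition plus the
# user-level predictors named above.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("modsimulator_responses.csv")  # hypothetical export

model = smf.logit(
    "engaged_with_false_claim"
    " ~ C(condition)"         # footnote vs. blur (vs. no intervention)
    " + pro_kremlin_belief"   # predictors reported to increase engagement
    " + partisan_news_trust"
    " + factcheck_org_trust"  # predictors reported to decrease engagement
    " + active_commenter",
    data=df,
).fit()
print(model.summary())
```

In a model like this, the coefficients on the user-level predictors would capture the belief and trust effects described above, independent of which intervention a participant saw.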