There are good reasons to worry about Facebook's "fact checking" efforts, primarily that the company is a raging garbage fire that destroys everything it touches and only has two approaches to solving any problem:

1/
1. Make it fully automated in a way that guarantees that there will be innumerable false positives in which legitimate material is erroneously censored, with no effective means of appeal, even as dedicated trolls exploit the system's blind spot to carry on as normal;

2/
2. Hire boiler-rooms full of low-waged workers to review horrific materials and make judgment calls that require context they don't have and can't get, until they're so traumatized they literally develop PTSD and sue the company for psychiatric care.

theverge.com/2020/5/12/2125…

3/
But there's one reason NOT to worry about Facebook factchecking, and that's the "Backfire Effect," a discredited psychological principle that holds that when people learn facts that challenge their worldview, they double down on their false beliefs.

4/
The original experiments that established the Backfire Effect as a bedrock of social psychology have spectacularly, repeatedly failed to replicate:

springerprofessional.de/en/taking-fact…

link.springer.com/article/10.100…

cambridge.org/core/journals/…

5/
Nevertheless, a Facebook exec told Stat that the reason the company is holding back on thorough factchecks of covid conspiracy theories is they're worried about the Backfire Effect.

statnews.com/2020/05/01/fac…

6/
Again, I don't trust Facebook to do anything well, let alone factchecking. But among the things Facebook does badly, apparently, is "understanding how factchecking works."

7/
This is well-put in an op ed by @EthanVPorter and @ThomasJWood, authors of "False Alarm: The Truth about Political Mistruths in the Trump Era," a peer-reviewed book from Cambridge University Press.

cambridge.org/us/academic/su…

8/
"By our count, across experiments involving more than 10,000 Americans, fact-checks increase the proportion of correct responses in follow-up testing by more than 28 percentage points."

wired.com/story/why-is-f…

9/
And they did a new study to show that this would work on FB, too: "Across all issues, people who had seen misinformation and then a related fact-check were substantially more factually accurate than people who had only seen the misinformation."

10/
"Prior research has found that, on social media, fake news is disproportionately shared by older, more conservative Americans. In our study this group did not show any special vulnerability to backfire effects. When presented with fact-checks they became more accurate too."

eof/