It's funny how 'fact-checkers' keep making themselves LESS trustworthy every day, and then are so confused as to why people don't pay attention to them.
The other day I posted in several places on Facebook the U-Haul van footage from Louisville. 1/
Here's what the post looked like. This is exactly the wording I used. 2/
Today I find a series of messages from Facebook:
'Independent fact-checkers at PolitiFact say information in your post is missing context and could mislead people. We've added a notice to your post.'
So what does PolitiFact refute or add context to?
I didn't mention weapons, or even imply them.
I didn't mention George Soros.
I just showed the video.
But because of the algorithm, FB slaps these notices on all the posts of the video. 5/
People who saw the video from my posts (and they've been shared over 500 times just from my posts in a couple of days) weren't told anything about weapons or Soros. But after this 'clarification', guess what they'll be thinking about? 6/
I get that a generic 'debunk' article like this is designed to cover a lot of posts making a lot of different claims.
But the very act of slapping this on posts that make none of those claims probably does more harm than good. I was careful about how I worded that post. 7/
My purpose in posting the videos was for people to see for themselves, not have their exposure to it filtered through what I might believe was happening. Or through a legacy media outfit's spin. Just the source. 8/
But apparently now just posting the source isn't good enough.
Fact checkers are now 'adding context' to posts they believe aren't giving the whole story. I guess I'm supposed to be relieved that they didn't get the posts taken down for 'lack of context'. 9/
Of course, that's probably what effectively happens: the notice gets slapped on the posts, and then the algorithm puts the brakes on them showing up in people's feeds.
And as @less_tx points out, if that ain't acting as an editor, I don't know what is.
The current fact-check model is bad.
It doesn't work.
It hides these platforms' intent when they serve as editors.
It diminishes their credibility regularly.
It ignores human nature, too.
As founder of @Unfakery I'm disappointed beyond measure.
I say that as someone who is still arguing with people on YouTube over whether this is a real RBG quotation.
I hate fakers who put out misinformation. I call people on it all the time.
But regardless of their intentions, I think fact-checkers are making it harder for us to find good information, especially in the way they structure their debunkings.
@thevivafrei explains here how the fact-checkers get it wrong: how they frame the 'claim' so that they can cast doubt on the entire subject and muddy the waters.
It's unethical.
I never thought we'd be in a world where 'fact-checkers' are arguably making information dissemination worse, but here we are.
Whether it's intentional or not, that's exactly what seems to be happening. /fin
Oh, addendum:
Politifact link so you can read it yourself politifact.com/factchecks/202…
Also did those random staged piles of bricks turning up in the spring and summer ever get investigated? Did those get memory-holed?
