The WSJ's "Facebook Files" is the biggest scoop in the company's history. Internal documents prove:
-Facebook knew its algorithm incentivized outrage
-Instagram knew it hurt teen girls
-Facebook has been shielding VIPs from moderation
Here are the shocking revelations... 🧵
Facebook changed its algorithm in 2018 to promote friends & family content, claiming it would "improve well-being"
In actuality, it was an attempt to stop a multi-year decline in Likes and Sharing
Facebook's algorithm change incentivized hateful content, so political parties & news outlets made their posts angrier, driving polarization. Some shifted to make 80% of their posts negative, Fb's research found.
But execs refused to revert the change because it would hurt usage & revenue
Instagram told the public and Congress that its impact on teen well-being was "quite small" or that there was no consensus
Internally, research showed that “32% of teen girls said that when they felt bad about their bodies, Instagram made them feel worse"
Instagram's own research found:
-13% of British users and 6% of American users traced the desire to kill themselves to Instagram
-14% of boys in the U.S. said Instagram made them feel worse
-40% of users said Instagram made them feel unattractive and poor
Facebook's 'XCheck' system protects 5.8 million VIPs from having their policy-violating content removed
XCheck's purpose? To prevent “PR fires". Special treatment caused posts that violated Fb's rules to be viewed 16.4 billion times in 2020, including posts by Trump
When soccer star Neymar posted non-consensual intimate imagery (revenge porn) of a woman who accused him of sexual assault, XCheck delayed its removal.
That caused the video to be viewed by 56 million people and reposted 6,000 times, and it led to bullying of the accuser.
I spent 10 years reporting on Facebook at TechCrunch, and these Files finally prove it knew it was harmful, hid that info from the public, and refused safeguards that would hurt its profits.
Safely enabling communication at scale is hard, but society deserves better
WSJ's Facebook Files scoops continue:
-Fb failed to remove posts of human trafficking & drug cartel murders
-It only acted when Apple threatened to kick Fb out of the App Store
-Fb only spends 13% of moderator time on content from outside the US...where 90% of users live 🧵🧵🧵
Be sure to follow these incredible WSJ reporters if you care about the future of Facebook and social networks. Go read the Facebook Files here wsj.com/articles/the-f…
-@keachhagey
-@dseetharaman
-@JeffHorwitz
-@georgia_wells
-@ScheckWSJ
-@newley
-@samschech
-@EmilyGlazer
For more breakdowns of the biggest social network news, my interview with Mark Zuckerberg, and our upcoming podcast with The Facebook Files' lead reporter and Fb's ex-Chief Security Officer, join my newsletter: constine.substack.com. Now, more from the scoops...
Facebook's own initial testing found 41% of comments on English posts about the COVID vaccine discouraged getting it. Comments like these were seen 775 million times per day
These "cesspools of anti-vaccine comments" are "a huge problem and we need to fix it," Fb's staff wrote.
Unfortunately, the AI that Facebook uses to detect misinformation often fails, and hardly works outside of English.
It missed a post with 53,000 shares that said the vaccines "are all experimental & you are in the experiment" because the AI thought it was written in Romanian.
Facebook failed to remove dangerous posts from drug cartels for 5 months including:
-Recruitment of hitmen
-Gold-plated guns & bloody crime scenes
-Videos of executions, torture, & trash bags of body parts
Fb's investigators flagged them, but the team assigned never followed up
Facebook's research found "human traffickers in the Middle East used the site to lure women into abusive employment situations in which they were treated like slaves or forced to perform sex work"
A trafficking group spent at least $152,000 on Facebook ads for massage parlors
Facebook only took limited action against cartels and traffickers... until Apple threatened to remove Fb & Instagram from the App Store.
Reminds me of how Fb only stopped paying teens to spy on them after my scoop led Apple to shut down Fb's internal & beta apps for a day.
Facebook operates in 110 languages but only has moderators who natively speak 50 of them.
Fb's own team wrote “most of our great integrity work . . . doesn’t work in much of the world. Our [AI] classifiers don’t work, and we’re largely blind to problems on our site.”
In 2020, Fb staff & contractors spent over 3.2 million hours hunting misinfo, but only 13% of that was on content outside the US where 90% of users live
It spent 3X as long outside the U.S. on "brand safety" -- ensuring advertisers were happy with the content their ads appeared beside
Now Facebook is claiming the WSJ's report "mischaracterizes" its actions, but without citing any specific inaccuracies.
I'll be breaking down Fb's rebuttal, and the fallout of the scoops on this thread, so follow to get the next update.
"What the Wall Street Journal Got Wrong", Facebook's VP Nick Clegg starts his blog post...that doesn't say what the WSJ got wrong.
"We don’t shy away from scrutiny and criticism" he writes...before reducing the WSJ's reporting to "an attention-grabbing newspaper headline."
Fb claims misinfo didn't overwhelm vaccine content, but it initially found 41% of comments on vax posts in English discouraged vaccinations.
Then Fb takes credit for vaccine hesitancy of US users declining 50% since Jan…as if 2.5 billion safely getting vaxxed didn't contribute
Facebook says it spent $13 billion on safety since 2016, but it earned $96 billion in profits since then. Its share price has tripled.
Fb has the money to hire enough moderators or stop incentivizing hate in the feed even if usage drops. It's just refused to adequately invest.
"I don't think that there will actually be significant cultural change until Zuckerberg steps down" Facebook's ex-CSO says on my podcast. Links for all apps: constine.club/feed/can-faceb… podcasts.apple.com/us/podcast/can…
"You get what you measure and bonus". For Facebook, that's growth, not safety.
Fb's ex-CSO @alexstamos & WSJ's @JeffHorwitz discuss how making us more open & connected "held a mirror up to the world [so you] see the atrocities." Fb could "do a million times more" to safeguard us
How could Facebook fix its problems?
-Change algorithms to demote outrage, even if usage declines
-Hire native speakers to moderate all languages it operates in
-Staff more XCheck moderators so VIPs don't get a free pass
-Change Instagram to disrupt depressive scrolling patterns
We need a new norm in social networking: You can't operate where you don't adequately moderate.
These apps are absurdly profitable because users do the work of entertaining each other. Perhaps their owners should have to invest a minimum percentage of profits into safety.
Thank you to the whistleblowers who shared internal Facebook documents with the WSJ. Your courage will force it to build better.
Go read the Facebook Files and follow its authors like @JeffHorwitz and @dseetharaman wsj.com/articles/the-f…
Get the inside story of why Facebook underinvests in safety and what could make it mend its ways — from former Chief Security Officer @alexstamos on my podcast.
Facebook is "incredibly vulnerable to bad press" he says, in case you want share this thread podcasts.apple.com/us/podcast/can…
Attention forces change, so thank you for caring! I hope these tweets reveal why The Facebook Files matter.
Want more on how social networks shape our future? Join my newsletter & follow me for updates on this thread. I bet Congress will have questions… constine.substack.com
