The WSJ's "Facebook Files" is the biggest scoop in the company's history. Internal documents prove:
-Facebook knew its algorithm incentivized outrage
-Instagram knew it hurt teen girls
-Facebook has been shielding VIPs from moderation
Here are the shocking revelations... 🧵
Facebook changed its algorithm in 2018 to promote friends & family content and "improve well-being"
In actuality, it was an attempt to stop a multi-year decline in Likes and Sharing
Facebook's algorithm change incentivized hateful content, so political parties & news outlets made their posts angrier, driving polarization. Some shifted to make 80% of their posts negative, Fb's research found.
But execs refused to change back bc it would hurt usage & revenue
Instagram told the public and Congress that its impact on teen well-being was "quite small" or that there was no consensus
Internally, research showed that “32% of teen girls said that when they felt bad about their bodies, Instagram made them feel worse”
Instagram's own research found:
- 13% of British users and 6% of American users traced the desire to kill themselves to Instagram
-14% of boys in the U.S. said Instagram made them feel worse
-40% of users said Instagram made them feel unattractive and poor
Facebook's 'XCheck' system protects 5.8 million VIPs from having their policy-violating content removed
XCheck's purpose? To prevent "PR fires." Special treatment caused posts that violated Fb's rules to be viewed 16.4 billion times in 2020, including posts by Trump
When soccer star Neymar posted non-consensual intimate imagery (revenge porn) of a woman who accused him of sexual assault, XCheck delayed its removal.
That caused the video to be viewed by 56 million people and reposted 6,000 times, and led to bullying of the accuser.
I spent 10 years reporting on Facebook at TechCrunch, but these Files finally prove it knew it was harmful, hid the info from the public, and refused safeguards that would hurt its profits.
Safely enabling communication at scale is hard, but society deserves better
WSJ's Facebook Files scoops continue:
-Fb failed to remove posts of human trafficking & drug cartel murders
-It only acted when Apple threatened to kick Fb out of the App Store
-Fb only spends 13% of moderator time on content from outside the US...where 90% of users live 🧵🧵🧵
For more breakdowns of the biggest social network news, my interview with Mark Zuckerberg, and our upcoming podcast with The Facebook Files' lead reporter and Fb's ex-Chief Security Officer, join my newsletter: constine.substack.com. Now, more from the scoops...
Facebook's own initial testing found 41% of comments on English posts about the COVID vaccine discouraged getting it. Comments like these were seen 775 million times per day
These "cesspools of anti-vaccine comments" are "a huge problem and we need to fix it," Fb's staff wrote.
Unfortunately, the AI that Facebook uses to detect misinformation often fails, and hardly works outside of English.
It missed a post with 53,000 shares that said the vaccines "are all experimental & you are in the experiment" because the AI thought it was written in Romanian.
Facebook failed to remove dangerous posts from drug cartels for 5 months including:
-Recruitment of hitmen
-Gold-plated guns & bloody crime scenes
-Videos of executions, torture, & trash bags of body parts
Fb's investigators flagged them, but the team assigned never followed up
Facebook's research found "human traffickers in the Middle East used the site to lure women into abusive employment situations in which they were treated like slaves or forced to perform sex work"
A trafficking group spent at least $152,000 on Facebook ads for massage parlors
Facebook only took limited action against cartels and traffickers... until Apple threatened to remove Fb & Instagram from the App Store.
Reminds me of how Fb only stopped paying teens to spy on them after my scoop led Apple to shut down Fb's internal & beta apps for a day.
Facebook operates in 110 languages but only has moderators who natively speak 50 of them.
Fb's own team wrote “most of our great integrity work . . . doesn’t work in much of the world. Our [AI] classifiers don’t work, and we’re largely blind to problems on our site.”
In 2020, Fb staff & contractors spent over 3.2 million hours hunting misinfo, but only 13% of that was on content outside the US where 90% of users live
It spent 3X as long outside the U.S. on "brand safety" -- ensuring advertisers were happy with the content their ads appeared beside
Now Facebook is claiming the WSJ's report "mischaracterizes" its actions, but without specific citations of inaccuracy.
I'll be breaking down Fb's rebuttal, and the fallout of the scoops on this thread, so follow to get the next update.
"What the Wall Street Journal Got Wrong," Facebook VP Nick Clegg titles his blog post...which never actually says what the WSJ got wrong.
"We don’t shy away from scrutiny and criticism," he writes...before reducing the WSJ's reporting to "an attention-grabbing newspaper headline."
Fb claims misinfo didn't overwhelm vaccine content, but it initially found 41% of comments on vax posts in English discouraged vaccinations.
Then Fb takes credit for US users' vaccine hesitancy declining 50% since January…as if 2.5 billion people safely getting vaxxed didn't contribute
Facebook says it spent $13 billion on safety since 2016, but it earned $96 billion in profits since then. Its share price has tripled.
Fb has the money to hire enough moderators or stop incentivizing hate in the feed even if usage drops. It's just refused to adequately invest.
"You get what you measure and bonus." For Facebook, that's growth, not safety.
Fb's ex-CSO @alexstamos & WSJ's @JeffHorwitz discuss how making us more open & connected "held a mirror up to the world [so you] see the atrocities." Fb could "do a million times more" to safeguard us
How could Facebook fix its problems?
-Change algorithms to demote outrage, even if usage declines
-Hire native speakers to moderate all languages it operates in
-Staff more XCheck moderators so VIPs don't get a free pass
-Change Instagram to disrupt depressive scrolling patterns
We need a new norm in social networking: You can't operate where you don't adequately moderate.
These apps are absurdly profitable because users do the work of entertaining each other. Perhaps their owners should have to invest a minimum percentage of profits into safety.
Thank you to the whistleblowers who shared internal Facebook documents with the WSJ. Your courage will force it to build better.
Get the inside story of why Facebook underinvests in safety and what could make it mend its ways — from former Chief Security Officer @alexstamos on my podcast.
Attention forces change, so thank you for caring! I hope these tweets reveal why The Facebook Files matter.
Want more on how social networks shape our future? Join my newsletter & follow me for updates on this thread. I bet Congress will have questions… constine.substack.com
Your careers page is bad and you should feel bad. It's wasting your team's 🕑+💸 by hurting your recruiting.
We looked at 100 top startups' sites and found 7 easy ways to improve your careers page -- even if you're in a hiring freeze 😬...
1. Ditch your headshots and geeky "class photo" of your whole team. Invest in great photography of your founders, product, and office life. You'll end up using them everywhere
2. Make your equity look like a steal by adding metrics, not just feel-good marketing speak
The OnlyFans ban could put creators in danger by pushing them back towards in-person sex work.
I interviewed some OnlyFans stars & experts to get real talk on the ban 🧵 1/7
OnlyFans isn’t banning porn, but is betting it can thrive as softcore-only with fewer headaches, even if it hurts creators.
It’s quietly been pushing the most graphic content off already. 2/7
The OnlyFans hardcore ban was driven by puritanical payment processors like Visa & MasterCard threatening to cut it off. It's moral censorship, not a legal issue
The risk is real. PornHub got cut off and had to shift to crypto & ACH transfers only. 3/7
Startup fundraising legalese is absurdly confusing. Here are the key terms and how to understand them... 1/
Liquidation Preference: Who gets paid out first at an exit. A 2X or 3X pref means a VC gets multiple times their money back before founders get any -- and if nothing's left after that, sucks for everyone else. Keep pref at 1X or the team & other investors might get nothing unless the company becomes huge. 2/
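A toy calculation (all numbers hypothetical, assuming a single VC with a non-participating preference) shows why the multiple matters so much:

```python
# Illustrative only: how a liquidation preference splits exit proceeds.
# Assumes one VC who invested $5M for 20% ownership; every figure is made up.

def payout(exit_value, invested, ownership, pref_multiple):
    """Return (vc_take, everyone_else) under a non-participating preference."""
    pref = invested * pref_multiple
    if exit_value <= pref:
        return exit_value, 0.0  # the preference eats the entire exit
    # Non-participating: VC takes the larger of its preference
    # or its pro-rata share as a common holder
    vc_take = max(pref, exit_value * ownership)
    return vc_take, exit_value - vc_take

# A $20M exit with a 1X pref vs a 3X pref:
print(payout(20e6, 5e6, 0.20, 1))  # (5M to the VC, 15M to everyone else)
print(payout(20e6, 5e6, 0.20, 3))  # (15M to the VC, only 5M left for the team)
```

Same company, same exit: the 3X pref alone swings $10M from the team to the investor.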
Warrants: VCs will ask for warrants - the right to purchase stock at a specific price at a later date - as compensation for coming in early and catalyzing a round. This is not standard! Avoid it! Other investors will just ask for the same deal when they find out. 3/
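A quick sketch (hypothetical numbers) of what a warrant is actually worth to the holder, and why it costs the rest of the cap table:

```python
# Illustrative only: a warrant lets its holder buy shares later at a fixed
# strike price, capturing any upside above it. All numbers are made up.

def warrant_value(share_price, strike, num_shares):
    # Worth nothing if the stock hasn't risen above the strike
    # ("out of the money"); otherwise the spread times the share count.
    return max(share_price - strike, 0) * num_shares

# 100k warrants struck at $1 when the stock later trades at $10:
print(warrant_value(10.0, 1.0, 100_000))  # $900k of upside granted on top
```

That $900k is dilution everyone else absorbs, which is why matching warrant asks can snowball across a round.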
Flymachine just launched the future of livestreamed concerts:
-Broadcast in the best angles from top venues
-Overlaid video chat with friends
-Started by the TicketFly & Bonnaroo founders
Here's why it's a win for artists, venues, and fans where others failed 🧵
Concert streaming was asocial, boring, and unsustainable.
Music lovers deserve more than sitting by themselves watching single-camera streams from the basement of an artist who's not getting paid
Streams felt nothing like an IRL show 2/
You need 3 things to make concert streaming work:
1. Artists to earn $ without piling on extra work
2. Venues to bring the stage/light show while paying staff
3. Fans to feel the camaraderie of attending with friends
I've made my first lead investment since becoming a VC!
Spore.build is the free, all-in-one tool for creators to build websites to truly own, grow, & monetize their relationship with fans.
Here's why creators deserve help escaping the algorithms & app store taxes 🧵
The top two trends in the creator economy beget the top two problems:
-Creators want to move top fans off misaligned social networks, but lack their own home on the web
-Creators need a ton of software to build a community, but are overwhelmed by fragmented, expensive tools
Creators are vulnerable to platform risk.
When you're dependent on big tech platforms that don't share your priorities, you live by their rules. If they want to cut you off from your audience, change the functionality you need, or charge you taxes, you're at their mercy.