@FrancesHaugen@OnlineSafetyCom Haugen: Even the differences between American English and British English present a content moderation challenge for Facebook that they have yet to effectively tackle.
Haugen: 60% of content in the Facebook news feed comes from Facebook Groups. Groups and reshares, combined with engagement-based ranking, produce lots of content that wouldn't otherwise get pushed out into the news feed.
Haugen: Facebook Groups recommendations push people to extremes. Echo chambers within these groups "create social norms." This is the normalization of hate that leads to violence
Haugen: Facebook Groups with over a certain number of members should have to bring in their own moderators for content.
Haugen: "Even if you removed microtargeting from ads, people would microtarget through Groups."
Haugen: Facebook should have to publish a list of all research/experiments they're doing and publish the results.
🔴 Haugen: "When I worked on counterespionage, I saw things about national security and I had no idea how to escalate those. Because I didn't have faith in my chain of command because they had dissolved Civic Integrity...We were told just to accept under-resourcing."
Haugen: There are no incentives, internally, to rally for help because everyone is underwater. Facebook's most important teams are understaffed and under-resourced.
Haugen: For many VPs and Directors at the company, this is the only job they've ever had. Mark [Zuckerberg] came in when he was 19.
Haugen: Twitter and Google are far more transparent than Facebook.
🔴 Haugen: Engagement-based ranking prioritizes polarizing, extreme, divisive content. It doesn't matter if you're on the left or the right, it pushes you to the extremes. It fans hate. Anger and hate are the easiest ways to grow on Facebook.
Haugen: It is cheaper to run an angry, hateful ad rather than an empathetic, compassionate ad on Facebook.
Haugen: Moving to human-scaled systems, instead of having AI tell us where to focus, is the safest way to design social media.
🔴 Haugen: There are some countries in the world where 35% of all the content in the newsfeed is a reshare. The reason Facebook doesn't crack down on it is because they don't want to lose that growth.
@DamianCollins: "The Oversight Board doesn't have access to the information you've been publishing or discussing. Do you think the Oversight Board should insist on that transparency or disband itself?"
🔴 Haugen: The Oversight Board should ask the question 'why was Facebook able to lie to them in this way?' If Facebook can come in there and actively mislead the Oversight Board, which is what they did, I don't know what the purpose of the Oversight Board is.
@DamianCollins@OversightBoard 🔴 Haugen: "A problem that I'm really concerned about is that, in many languages, Facebook has a problem differentiating between terrorism content and counterterrorism content."
76% of Arabic-language counterterrorism content was mislabeled as terrorist content on Facebook.
@DamianCollins@OversightBoard Haugen: Bullying on Instagram follows children home after school.
"They don't get a moment's peace."
@DamianCollins@OversightBoard 🔴 Haugen: I am deeply worried that it may not be possible to make Instagram safe for a 14 year old and I sincerely doubt it is possible to make Instagram safe for a 10 year old.
@DamianCollins@OversightBoard 🔴 Haugen told the US Senate that Facebook had estimated the ages of teenagers and found that 10-15% of 10-year-olds were on the platform. Facebook can very effectively predict how old a user is.
Haugen: Kids are learning that people they care about treat them cruelly, because when people are removed from the feedback of watching someone cry or wince, they're much more hateful and meaner. Imagine what their domestic relationships will be like when they're 30.
Haugen: Facebook is negligent.
Haugen: Facebook has never had to prove that their product is safe for kids.
🔴🔵 The Committee will break for 10 minutes.
Haugen: "Facebook as a product was built by Harvard students for Harvard students."
Haugen: "Countless employees said we have lots of solutions that don't involve picking good and bad ideas. It's not about censorship. We could have a safer platform but it will cost little bits of growth."
Haugen: "Ethiopia has 100m people who speak six languages. Facebook only supports two. If we believe in linguistic diversity, the current design of the platform is dangerous."
Haugen: On Jan 6, most of Facebook's interventions for violence, hate, and misinformation were still turned off at 5pm ET.
Haugen: "Until incentives change, Facebook will not change"
Baroness Kidron: Facebook did not give the parents of a young woman who committed suicide access to the content she had seen on Instagram.
Haugen: Facebook likely deleted her data within 90 days. It's as if, to them, their sins disappear after 90 days
Haugen: Facebook struggles to moderate comments because they are shorter and harder for the AI to detect.
Haugen: Says she was misrepresented by the Telegraph and says she does support the use of end-to-end encryption and uses end-to-end encryption services.
Damian Collins: Nick Clegg's claim that it takes "two to tango" is a massive misrepresentation of how the company and its platforms actually work.
Haugen: There is a problem known as SUMAs (same user, multiple accounts). Documents show that Facebook put "reach and frequency" controls on advertising so the platform doesn't target the same people more than a set number of times, but these controls didn't take SUMAs into account.
Haugen: In the case of teenagers on Instagram, encouraging teens to make accounts so parents can't see their content is problematic and needs to change. #Finsta
🔴🔵 The evidence session has concluded! Thank you for following along with us.
"When we live in an information environment that is full of angry, polarizing content, it erodes our civic trust", @FrancesHaugen, when she spoke to @CBSNews earlier this month.
Frances Haugen, a data scientist, left Facebook in March of 2021, secretly copying thousands of documents. She says Facebook is lying about the progress it’s making on misinformation and hate speech.
“The version of Facebook today is causing ethnic violence around the world”
Today’s bombshell report in @WSJ [@JeffHorwitz] that Facebook allowed at least 5.8m VIP users to be exempt from its TOS enforcement is yet another example of Facebook’s complete failure to responsibly moderate content or oversee its own platforms. /1 wsj.com/articles/faceb…
The WSJ reporting outlines how Facebook "whitelists," or gives preferential treatment to, VIP Facebook users, making them (at times) exempt from or immune to standard content moderation practices and terms of service guidelines. The program covering these VIP users is called "XCheck." /2
Those whitelisted included political figures such as the former President and his son, Candace Owens, Senator Elizabeth Warren, as well as athletes like Neymar and Mark Zuckerberg himself. /3
Ahead of Facebook's quarterly earnings, RFOB releases its "Facebook Quarterly Harms Report" for Q2. This report documents Facebook's harms around COVID-19, human life, and democracy.
A Real Facebook Oversight Board analysis from 2021 Q2 data found that a majority of the “Number One posts” (most engagement on Facebook for that day) originated from just five known ‘disinformation superspreaders’.
RFOB, in our report, also identified those five known “disinformation superspreaders.”
These superspreaders are serial disinformation offenders, yet their posts frequently rank "Number 1," per our analysis.