Internal documents reveal that Facebook has privately tracked real-world harms exacerbated by its platforms, ignored warnings from employees about the risks of its design decisions and exposed vulnerable communities around the world to dangerous content. wapo.st/3Et94VZ
Disclosed to the SEC by whistleblower Frances Haugen, the Facebook Papers were provided to Congress in redacted form by Haugen’s legal counsel.
Here are key takeaways from The Post’s investigation: wapo.st/3Et94VZ
Zuckerberg testified last year before Congress that the company removes 94 percent of the hate speech it finds.
But in internal documents, researchers estimated that the company was removing less than 5 percent of all hate speech on Facebook. wapo.st/3CeS6d9
During the run-up to the 2020 election, Facebook dialed up efforts to police content that promoted violence, misinformation and hate speech.
But after Nov. 6, Facebook rolled back many of the dozens of measures aimed at safeguarding U.S. users. wapo.st/3CeS6d9
According to one 2020 summary in the documents, 84 percent of Facebook's efforts against misinformation went toward the United States, with just 16 percent going to the “Rest of World,” including India, France and Italy. twitter.com/i/events/14522…
A 2019 report tracking a dummy account set up to represent a conservative mother in North Carolina found that within five days, Facebook’s recommendation algorithms led her to QAnon, an extremist ideology that the FBI has deemed a domestic terrorism threat. wapo.st/3CeS6d9
A mix of presentations, research studies, discussion threads and strategy memos, the Facebook Papers provide an unprecedented view into how executives at the social media giant weigh trade-offs between public safety and their own bottom line. washingtonpost.com/technology/202…
The internal debate over the “angry” emoji and the findings about its effects shed light on the highly subjective human judgments that underlie Facebook’s news feed algorithm. twitter.com/i/events/14529…
Facebook’s news feed algorithm has been blamed for fanning sectarian hatred, steering users toward extremism and conspiracy theories, and incentivizing politicians to take more divisive stands. wapo.st/3bdEHpG
It’s in the spotlight thanks to waves of revelations from The Facebook Papers and testimony from whistleblower Frances Haugen, who argues it’s at the core of the company’s problems. wapo.st/3bdEHpG
Since 2018, the algorithm has elevated posts that encourage interaction, such as ones popular with friends. This broadly prioritizes posts by friends and family as well as viral memes, but it also boosts divisive content. wapo.st/3bdEHpG
Starting in 2017, Facebook’s ranking algorithm treated emoji reactions as five times more valuable than “likes,” internal documents reveal.
The theory was simple: Posts that prompted lots of reaction emoji tended to keep users more engaged. wapo.st/3GniSCx
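To make that weighting concrete, here is a minimal sketch, assuming a simple additive engagement score; the function, field names and example numbers are hypothetical, and only the reported 5x multiplier comes from the documents:

```python
# Minimal illustrative sketch (not Facebook's actual code): how counting
# an emoji reaction as five times a "like" can change which post ranks first.
# Field names, weights other than the 5x figure, and numbers are hypothetical.

REACTION_WEIGHT = 5  # reported multiplier for emoji reactions
LIKE_WEIGHT = 1      # baseline value of a "like"

def engagement_score(post):
    """Score a post by weighted engagement signals."""
    return (LIKE_WEIGHT * post["likes"]
            + REACTION_WEIGHT * post["reactions"])  # includes "angry"

posts = [
    {"id": "calm_update", "likes": 1000, "reactions": 50},
    {"id": "divisive_take", "likes": 200, "reactions": 400},
]

# The divisive post wins (200 + 5*400 = 2200 vs. 1000 + 5*50 = 1250),
# even though it drew far fewer likes.
ranked = sorted(posts, key=engagement_score, reverse=True)
print([p["id"] for p in ranked])
```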
Facebook’s own researchers were quick to suspect a critical flaw.
Favoring “controversial” posts could open “the door to more spam/abuse/clickbait inadvertently,” a staffer, whose name was redacted, wrote in one of the internal documents. wapo.st/3GniSCx
The company’s data scientists confirmed that posts that sparked angry reaction emoji were disproportionately likely to include misinformation, toxicity and low-quality news.
For three years, Facebook had given special significance to the angry emoji. wapo.st/3GniSCx
Late last year, Mark Zuckerberg faced a choice: Comply with demands from Vietnam’s ruling Communist Party to censor anti-government dissidents or risk getting knocked offline in one of Facebook’s most lucrative Asian markets. washingtonpost.com/technology/202…
Upholding the free-speech rights of people who question government leaders could have come at a significant cost in Vietnam, where Facebook earns more than $1 billion in annual revenue, according to a 2018 estimate by Amnesty International. washingtonpost.com/technology/202…
Zuckerberg personally decided that Facebook would comply with Hanoi’s demands, according to three people familiar with the decision, speaking on condition of anonymity to describe internal company discussions. washingtonpost.com/technology/202…
In 2019, a pair of Facebook employees set up a dummy account to better understand the experience of a new user in India.
Without any direction from the user, the Facebook account was soon flooded with pro-Modi propaganda and anti-Muslim hate speech. wapo.st/3GkowoM
An internal Facebook memo, reviewed by The Washington Post, called the dummy account test an “integrity nightmare” that underscored the vast difference between the experience of Facebook in India and what U.S. users typically encounter. wapo.st/3GkowoM
About the same time, in a dorm room in northern India, a Kashmiri student named Junaid told The Post he watched as his real Facebook page flooded with hateful messages.
One said Kashmiris were “traitors who deserved to be shot.” wapo.st/3GkowoM
Relief flowed through Facebook in the days after the 2020 presidential election.
The company had cracked down on misinformation, foreign interference and hate speech. wapo.st/3b4YIi6
Employees believed they had largely succeeded in limiting problems that, four years earlier, had brought on perhaps the most serious crisis in Facebook’s scandal-plagued history.
But the high fives, it soon became clear, were premature. wapo.st/3b4YIi6