Facebook researchers had deep knowledge of how coronavirus and vaccine misinformation moved through the company’s apps, according to documents disclosed by Facebook whistleblower Frances Haugen. wapo.st/3mmERBx
But even as academics, lawmakers and the White House urged Facebook for months to be more transparent about the misinformation and its effects on the behavior of its users, the company refused to share much of this information publicly.
Internally, Facebook researchers found that coronavirus misinformation was dominating small sections of its platform.
Other researchers documented how posts by medical authorities, like the WHO, were often swarmed by anti-vaccine commenters. wapo.st/3mmERBx
Documents show how extensively Facebook was studying coronavirus and vaccine misinformation on its platform, unearthing findings that concerned its own employees.
Yet executives publicly emphasized the more positive aspects of the social network’s pandemic response. wapo.st/3mmERBx
• • •
Tens of thousands of diplomats, researchers, protesters and presidents from around the globe are scheduled to descend on Glasgow, Scotland, starting next week for a critical United Nations climate summit. wapo.st/3BiAeNa
The overarching goal of COP26 is to get countries to commit to more ambitious, detailed plans to cut their planet-warming emissions and collectively slow climate change.
The Post spoke to people around the world — including youth activists, scientists, government leaders and people whose livelihoods are threatened by climate change — to hear in their own words why COP26 matters and what is at stake if countries fail. wapo.st/3BiAeNa
Most political parties in Poland have complaints about Facebook’s algorithms, the obscure formulas that decide which posts pop up on a user’s news feed and which fade into the ether.
Content from the far-right Confederation party generally does well, including a slew of anti-lockdown, anti-immigration, vaccine-skeptic posts often punctuated with large red exclamation marks.
It’s a “hate algorithm,” said the head of the party’s social media team. wapo.st/3GvqiDC
That Facebook might be amplifying outrage, driving polarization and elevating more extreme parties around the world, has been a subject of internal discussion at the company for years, according to the internal documents known as the Facebook Papers. wapo.st/3GvqiDC
Facebook’s news feed algorithm has been blamed for fanning sectarian hatred, steering users toward extremism and conspiracy theories, and incentivizing politicians to take more divisive stands. wapo.st/3bdEHpG
It’s in the spotlight thanks to waves of revelations from The Facebook Papers and testimony from whistleblower Frances Haugen, who argues it’s at the core of the company’s problems. wapo.st/3bdEHpG
Since 2018, the algorithm has elevated posts that encourage interaction, such as those popular among a user’s friends. In practice, that prioritizes posts from friends and family and viral memes, but also divisive content. wapo.st/3bdEHpG
Starting in 2017, Facebook’s ranking algorithm treated emoji reactions as five times more valuable than “likes,” internal documents reveal.
The theory was simple: Posts that prompted lots of reaction emoji tended to keep users more engaged. wapo.st/3GniSCx
Facebook’s own researchers were quick to suspect a critical flaw.
Favoring “controversial” posts could open “the door to more spam/abuse/clickbait inadvertently,” a staffer, whose name was redacted, wrote in one of the internal documents. wapo.st/3GniSCx
The company’s data scientists confirmed that posts that sparked angry reaction emoji were disproportionately likely to include misinformation, toxicity, and low-quality news.
For three years, Facebook had given special significance to the angry emoji. wapo.st/3GniSCx
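To make the mechanism concrete, here is a minimal sketch of an engagement-weighted ranking score of the kind the documents describe, with emoji reactions counted five times as much as a like. The function and field names are hypothetical illustrations, not Facebook’s actual code; only the 5x weight comes from the reporting.

```python
# Hypothetical sketch of an engagement-weighted ranking score.
# The 5x weight on reaction emoji reflects the weighting described in the
# internal documents; all names and other details are illustrative.

LIKE_WEIGHT = 1
REACTION_WEIGHT = 5  # "love", "haha", "wow", "sad", "angry" each counted 5x a like

def engagement_score(post: dict) -> int:
    """Score a post by weighted interactions; higher scores rank higher in the feed."""
    likes = post.get("likes", 0)
    reactions = sum(post.get("reactions", {}).values())  # all emoji reactions pooled
    return LIKE_WEIGHT * likes + REACTION_WEIGHT * reactions

# A post provoking 200 angry reactions outranks one with 900 plain likes,
# even though angry-reaction posts were disproportionately likely to carry
# misinformation, toxicity and low-quality news.
calm_post = {"likes": 900, "reactions": {}}
angry_post = {"likes": 0, "reactions": {"angry": 200}}
assert engagement_score(angry_post) > engagement_score(calm_post)
```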
Late last year, Mark Zuckerberg faced a choice: Comply with demands from Vietnam’s ruling Communist Party to censor anti-government dissidents or risk getting knocked offline in one of Facebook’s most lucrative Asian markets. washingtonpost.com/technology/202…
In Vietnam, upholding the free-speech rights of people who question government leaders could have come with a significant cost in a country where Facebook earns more than $1 billion in annual revenue, according to a 2018 estimate by Amnesty International. washingtonpost.com/technology/202…
Zuckerberg personally decided that Facebook would comply with Hanoi’s demands, according to three people familiar with the decision, speaking on condition of anonymity to describe internal company discussions. washingtonpost.com/technology/202…
Internal documents reveal that Facebook has privately tracked real-world harms exacerbated by its platforms, ignored warnings from employees about the risks of their design decisions and exposed vulnerable communities around the world to dangerous content. wapo.st/3Et94VZ
Disclosed to the SEC by whistleblower Frances Haugen, the Facebook Papers were provided to Congress in redacted form by Haugen’s legal counsel.
Here are key takeaways from The Post’s investigation: wapo.st/3Et94VZ
Zuckerberg testified last year before Congress that the company removes 94 percent of the hate speech it finds.
But in internal documents, researchers estimated that the company was removing less than 5 percent of all hate speech on Facebook. wapo.st/3CeS6d9
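The two figures use different denominators: 94 percent of the hate speech the company finds, versus less than 5 percent of all hate speech on the platform. A back-of-the-envelope calculation (the post volume below is purely hypothetical; only the two percentages come from the reporting) shows both can hold only if detection itself covers a small fraction of the total.

```python
# Illustrative arithmetic only: the 94% figure describes hate speech Facebook
# finds and removes; the <5% figure describes all hate speech on the platform.
total_hate_speech = 100_000          # hypothetical number of hate-speech posts
removal_rate_of_found = 0.94         # testimony: share of found posts removed
overall_removal_rate = 0.05          # internal estimate: share of ALL hate speech removed

# For both figures to hold, detection must reach only a small slice of the total:
found = total_hate_speech * overall_removal_rate / removal_rate_of_found
print(f"Found: {found:.0f} of {total_hate_speech} posts "
      f"({found / total_hate_speech:.1%} of all hate speech)")   # ~5.3%
```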