Late last year, Mark Zuckerberg faced a choice: Comply with demands from Vietnam’s ruling Communist Party to censor anti-government dissidents or risk getting knocked offline in one of Facebook’s most lucrative Asian markets. washingtonpost.com/technology/202…
In Vietnam, upholding the free-speech rights of people who question government leaders could have come with a significant cost in a country where Facebook earns more than $1 billion in annual revenue, according to a 2018 estimate by Amnesty International. washingtonpost.com/technology/202…
Zuckerberg personally decided that Facebook would comply with Hanoi’s demands, according to three people familiar with the decision, speaking on condition of anonymity to describe internal company discussions. washingtonpost.com/technology/202…
Ahead of Vietnam’s party congress in January, Facebook significantly increased censorship of “anti-state” posts by local users, giving the state near-total control over the platform, according to local activists and free-speech advocates. washingtonpost.com/technology/202…
Zuckerberg’s role in the decision, which has not been previously reported, exemplifies his relentless determination to ensure Facebook’s dominance, often at the expense of his stated values, according to interviews with more than a dozen former employees. washingtonpost.com/technology/202…
That ethos has come under fire in a series of whistleblower complaints filed with the SEC by former Facebook product manager Frances Haugen. The allegations represent arguably the most profound challenge to Zuckerberg’s ironclad leadership. washingtonpost.com/technology/202…
Experts said the SEC — which has the power to depose Zuckerberg, fine him and even remove him as chairman — is likely to dig more deeply into what he knew and when. washingtonpost.com/technology/202…
Facebook’s news feed algorithm has been blamed for fanning sectarian hatred, steering users toward extremism and conspiracy theories, and incentivizing politicians to take more divisive stands. wapo.st/3bdEHpG
It’s in the spotlight thanks to waves of revelations from The Facebook Papers and testimony from whistleblower Frances Haugen, who argues it’s at the core of the company’s problems. wapo.st/3bdEHpG
Since 2018, the algorithm has elevated posts that encourage interaction, such as those popular among friends. In practice, that prioritizes posts from friends and family and viral memes, but also divisive content. wapo.st/3bdEHpG
Starting in 2017, Facebook’s ranking algorithm treated emoji reactions as five times more valuable than “likes,” internal documents reveal.
The theory was simple: Posts that prompted lots of reaction emoji tended to keep users more engaged. wapo.st/3GniSCx
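As a rough illustration of the weighting described above, here is a minimal sketch of an interaction-based score. The names and structure are hypothetical; Facebook's actual ranking system is far more complex and not public.

```python
# Hypothetical sketch of the 2017 weighting described in the documents:
# a reaction emoji counted five times as much as a "like."
# Names and weights are illustrative, not Facebook's actual code.

LIKE_WEIGHT = 1
REACTION_WEIGHT = 5  # love, haha, wow, sad, angry

def engagement_score(likes: int, reactions: int) -> int:
    """Weighted interaction count used to rank a post in the feed."""
    return likes * LIKE_WEIGHT + reactions * REACTION_WEIGHT

# Under this weighting, a post drawing 10 angry reactions (score 50)
# outranks one drawing 40 plain likes (score 40), which is how
# provocative posts gain an edge in the feed.
print(engagement_score(likes=0, reactions=10))  # 50
print(engagement_score(likes=40, reactions=0))  # 40
```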
Facebook’s own researchers were quick to suspect a critical flaw.
Favoring “controversial” posts could open “the door to more spam/abuse/clickbait inadvertently,” a staffer, whose name was redacted, wrote in one of the internal documents. wapo.st/3GniSCx
The company’s data scientists confirmed the suspicion: posts that sparked angry reaction emoji were disproportionately likely to include misinformation, toxicity, and low-quality news.
For three years, Facebook had given special significance to the angry emoji. wapo.st/3GniSCx
Internal documents reveal that Facebook has privately tracked real-world harms exacerbated by its platforms, ignored warnings from employees about the risks of their design decisions and exposed vulnerable communities around the world to dangerous content. wapo.st/3Et94VZ
Disclosed to the SEC by whistleblower Frances Haugen, the Facebook Papers were provided to Congress in redacted form by Haugen’s legal counsel.
Here are key takeaways from The Post’s investigation: wapo.st/3Et94VZ
Zuckerberg testified last year before Congress that the company removes 94 percent of the hate speech it finds.
But in internal documents, researchers estimated that the company was removing less than 5 percent of all hate speech on Facebook. wapo.st/3CeS6d9
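The two figures use different denominators: 94 percent is the share of hate speech the company detects that it then removes, while 5 percent is the share of all hate speech on the platform. A back-of-the-envelope reconciliation, assuming the figures combine multiplicatively:

```python
# If 94% of detected hate speech is removed, yet less than 5% of all
# hate speech is removed, the implied detection rate is at most
# 0.05 / 0.94, about 5.3% of all hate speech on the platform.

removal_of_detected = 0.94  # Zuckerberg's congressional testimony
removal_of_all = 0.05       # internal researchers' upper estimate

implied_detection_rate = removal_of_all / removal_of_detected
print(f"Implied detection rate: {implied_detection_rate:.1%}")  # 5.3%
```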
In 2019, a pair of Facebook employees set up a dummy account to better understand the experience of a new user in India.
Without any direction from the user, the Facebook account was soon flooded with pro-Modi propaganda and anti-Muslim hate speech. wapo.st/3GkowoM
An internal Facebook memo, reviewed by The Washington Post, called the dummy account test an “integrity nightmare” that underscored the vast difference between the experience of Facebook in India and what U.S. users typically encounter. wapo.st/3GkowoM
About the same time, in a dorm room in northern India, a Kashmiri student named Junaid told The Post he watched as his real Facebook page was flooded with hateful messages.
One said Kashmiris were “traitors who deserved to be shot.” wapo.st/3GkowoM
Relief flowed through Facebook in the days after the 2020 presidential election.
The company had cracked down on misinformation, foreign interference and hate speech. wapo.st/3b4YIi6
Employees believed they had largely succeeded in limiting problems that, four years earlier, had brought on perhaps the most serious crisis in Facebook’s scandal-plagued history.
But the high fives, it soon became clear, were premature. wapo.st/3b4YIi6