Most political parties in Poland have complaints about Facebook’s algorithms, the obscure formulas that decide which posts pop up on a user’s news feed and which fade into the ether.

The far-right Confederation party does not. wapo.st/3GvqiDC
The Confederation’s content generally does well, including a slew of anti-lockdown, anti-immigration, vaccine-skeptic posts often punctuated with large red exclamation marks.

It’s a “hate algorithm,” said the head of the party’s social media team. wapo.st/3GvqiDC
That Facebook might be amplifying outrage, driving polarization and elevating more extreme parties around the world, has been debated inside the company for years, according to the internal documents known as the Facebook Papers. wapo.st/3GvqiDC
In one April 2019 document detailing a research trip to the European Union, a Facebook team reported feedback from European politicians that an algorithm change the previous year had changed politics “for the worse.” wapo.st/3GvqiDC
That change was billed by Facebook’s chief executive Mark Zuckerberg as an effort to foster more “meaningful” interactions on the platform.

The team reported back specific concerns from Poland, where parties had described a “social civil war” online. wapo.st/3GvqiDC

• • •

More from @washingtonpost

28 Oct
Tens of thousands of diplomats, researchers, protesters and presidents from around the globe are scheduled to descend on Glasgow, Scotland, starting next week for a critical United Nations climate summit. wapo.st/3BiAeNa
The overarching goal of COP26 is to get countries to commit to more ambitious, detailed plans to cut their planet-warming emissions and collectively slow climate change.

That lofty aim can sometimes seem abstract. wapo.st/3BiAeNa
The Post spoke to people around the world — including youth activists, scientists, government leaders and people whose livelihoods are threatened by climate change — to hear in their own words why COP26 matters and what is at stake if countries fail. wapo.st/3BiAeNa
28 Oct
Facebook researchers had deep knowledge of how coronavirus and vaccine misinformation moved through the company’s apps, according to documents disclosed by Facebook whistleblower Frances Haugen. wapo.st/3mmERBx
But even as academics, lawmakers and the White House urged Facebook for months to be more transparent about the misinformation and its effects on the behavior of its users, the company refused to share much of this information publicly.

From August: washingtonpost.com/technology/202…
Internally, Facebook employees showed that coronavirus misinformation was dominating small sections of its platform.

Other researchers documented how posts by medical authorities, like the WHO, were often swarmed by anti-vaccine commenters. wapo.st/3mmERBx
26 Oct
Facebook’s news feed algorithm has been blamed for fanning sectarian hatred, steering users toward extremism and conspiracy theories, and incentivizing politicians to take more divisive stands. wapo.st/3bdEHpG
It’s in the spotlight thanks to waves of revelations from The Facebook Papers and testimony from whistleblower Frances Haugen, who argues it’s at the core of the company’s problems. wapo.st/3bdEHpG
Since 2018, the algorithm has elevated posts that encourage interaction, such as ones popular with friends. This broadly prioritizes posts by friends and family and viral memes, but also divisive content. wapo.st/3bdEHpG
26 Oct
Starting in 2017, Facebook’s ranking algorithm treated emoji reactions as five times more valuable than “likes,” internal documents reveal.

The theory was simple: Posts that prompted lots of reaction emoji tended to keep users more engaged. wapo.st/3GniSCx
Facebook’s own researchers were quick to suspect a critical flaw.

Favoring “controversial” posts could open “the door to more spam/abuse/clickbait inadvertently,” a staffer, whose name was redacted, wrote in one of the internal documents. wapo.st/3GniSCx
The company’s data scientists confirmed that posts that sparked angry reaction emoji were disproportionately likely to include misinformation, toxicity, and low-quality news.

For three years, Facebook had given special significance to the angry emoji. wapo.st/3GniSCx
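
The documents describe a points-based ranking model in which a reaction emoji counted five times as much as a “like.” As a rough, purely illustrative sketch of how such a weighted engagement score can push reaction-provoking posts above quieter ones, assuming hypothetical names and weights everywhere except the reported 5x reaction multiplier:

```python
# Illustrative sketch only; not Facebook's code. The 5x reaction weight
# reflects the reporting above; all other weights, names and fields are
# hypothetical.
from dataclasses import dataclass

@dataclass
class PostSignals:
    likes: int
    reactions: int  # love, haha, wow, sad, angry
    comments: int
    shares: int

def engagement_score(p: PostSignals) -> float:
    """Score a post under a simple points model for feed ranking."""
    return (
        1.0 * p.likes
        + 5.0 * p.reactions   # reactions counted 5x a "like"
        + 5.0 * p.comments    # hypothetical weight
        + 5.0 * p.shares      # hypothetical weight
    )

# A post that provokes strong reactions (including "angry") outranks one
# that merely collects likes, the dynamic Facebook's researchers flagged.
posts = {
    "calm_update": PostSignals(likes=120, reactions=4, comments=10, shares=2),
    "outrage_bait": PostSignals(likes=30, reactions=60, comments=40, shares=25),
}
print(sorted(posts, key=lambda k: engagement_score(posts[k]), reverse=True))
# -> ['outrage_bait', 'calm_update']
```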
25 Oct
Late last year, Mark Zuckerberg faced a choice: Comply with demands from Vietnam’s ruling Communist Party to censor anti-government dissidents or risk getting knocked offline in one of Facebook’s most lucrative Asian markets. washingtonpost.com/technology/202…
In Vietnam, upholding the free-speech rights of people who question government leaders could have come with a significant cost in a country where Facebook earns more than $1 billion in annual revenue, according to a 2018 estimate by Amnesty International. washingtonpost.com/technology/202…
Zuckerberg personally decided that Facebook would comply with Hanoi’s demands, according to three people familiar with the decision, speaking on condition of anonymity to describe internal company discussions. washingtonpost.com/technology/202…
25 Oct
Internal documents reveal that Facebook has privately tracked real-world harms exacerbated by its platforms, ignored warnings from employees about the risks of their design decisions and exposed vulnerable communities around the world to dangerous content. wapo.st/3Et94VZ
Disclosed to the SEC by whistleblower Frances Haugen, the Facebook Papers were provided to Congress in redacted form by Haugen’s legal counsel.

Here are key takeaways from The Post’s investigation: wapo.st/3Et94VZ
Zuckerberg testified last year before Congress that the company removes 94 percent of the hate speech it finds.

But in internal documents, researchers estimated that the company was removing less than 5 percent of all hate speech on Facebook. wapo.st/3CeS6d9