Why Facebook Can't Fix Itself: excerpts from a New Yorker article by Andrew Marantz, with input from Real Facebook Oversight Board member, Rashad Robinson, and The Citizens member, Cori Crider.
1/ "Facebook’s stated mission is to “bring the world closer together.” It considers itself a neutral platform, not a publisher, and so has resisted censoring its users’ speech, even when that speech is ugly or unpopular."
2/ "In its early years, Facebook weathered periodic waves of bad press, usually occasioned by incidents of bullying or violence on the platform. Yet none of this seemed to cause lasting damage to the company’s reputation, or to its valuation."
3/ "Again and again, Facebook has erred on the side of allowing politicians to post whatever they want, even when this has led the company to weaken its own rules, to apply them selectively, to creatively reinterpret them, or to ignore them altogether."
4/ "Some of Facebook's detractors argue that, given the public’s widespread skepticism of the company, it should have less power over users’ speech, not more. “That’s a false choice,” Rashad Robinson said. “Facebook already has all the power. They’re just using it poorly.”"
5/ "He pointed out that Facebook consistently removes recruitment propaganda by ISIS and other Islamist groups, but that it has been far less aggressive in cracking down on white-supremacist groups."
6/ "It seems that the company’s strategy has never been to manage the problem of dangerous content, but rather to manage the public’s perception of the problem. Nick Clegg wrote that “with so much content posted, rooting out the hate is like looking for a needle in a haystack.”"
7/ "A more honest metaphor would posit a powerful set of magnets at the center of the haystack—Facebook’s algorithms, which attract and elevate whatever content is most highly charged. If there are needles anywhere nearby the magnets will pull them in."
8/ "Facebook moderators have scant workplace protections and little job security. The closest they have to a labor organizer is Cori Crider, a lawyer and an activist based in London."
9/ "In July, 2019, Crider was introduced to a Facebook moderator in Europe who was able to map out how the whole system worked. This moderator put her in touch with other moderators, who put her in touch with still others."
10/ "The content moderators were not yet ready to form a union—“not even close,” Crider told me—but she hoped to inculcate in them a kind of latent class consciousness, an awareness of themselves as a collective workforce."
11/ "Last October, Crider met Chris Gray at a conference in London. She started introducing him to journalists and activists, helping to spread his story."
12/ "Two months later, Gray hired a local law firm and sued Facebook in Irish High Court, alleging that his “repeated and unrelenting exposure to extremely disturbing, graphic and violent content” had caused him lasting psychological trauma."
13/ "Shortly thereafter, about twenty more former Facebook moderators in Dublin contacted the law firm representing Gray to ask about possible lawsuits against the company."
14/ "Last October, the Trump campaign made an ad featuring blatantly false allegations about Joe Biden. CNN and other networks refused to run it; YouTube, Twitter, and Facebook did not."
15/ "“Our approach is grounded in Facebook’s fundamental belief in free expression,” Katie Harbath, the company’s public-policy director for global elections, wrote. “Thus, when a politician speaks or makes an ad, we do not send it to third party fact-checkers.”"
16/ "On May 29th, on Twitter and Facebook, Trump mused about sending the National Guard to quell protests in response to George Floyd’s death. “When the looting starts, the shooting starts,” Trump wrote, a phrase that was widely seen as an incitement to violence."
17/ "Prominent segregationists had used these words, in the nineteen-sixties, to justify vicious attacks against Black people, including civil-rights protesters. Twitter didn’t remove Trump’s tweets but did append warning labels to them. Facebook, by contrast, did nothing."
18/ "On August 19th, Facebook announced changes to its guidelines. Chief among them was a new policy restricting the activities of “organizations and movements that have demonstrated significant risks to public safety,” including “US-based militia organizations.”"
19/ "Four days later, in Kenosha, a police officer shot a Black man named Jacob Blake in the back, in front of his children. The Kenosha Guard, a self-described militia, put up a “call to arms” on its Facebook page, expressing an intention to commit vigilante violence."
20/ "Within a day, according to BuzzFeed, more than 400 people had reported the page to Facebook’s content moderators, but the moderators decided that it did not violate any of Facebook’s standards and they left it up. Mark Zuckerberg later called this “an operational mistake.”"
21/ "Last week, Facebook banned content relating to QAnon, the far-right conspiracy theory. It also took down a post by Trump that contained misinformation about the coronavirus, and announced plans to ban all political ads for an indefinite period starting on Election Night."
22/ "The restrictions are likely to feed into the notion that social media discriminates against conservatives. As Trump tweeted in May, “The Radical Left is in total command & control of Facebook, Instagram, Twitter and Google.” The bulk of the evidence suggests the opposite."
Two years ago, Zuck told @karaswisher: “I’m Jewish, and there’s a set of people who deny that the Holocaust happened. I find that deeply offensive. But at the end of the day, I don’t believe that our platform should take that down because I think there are things that different..."
On Monday, #Zuckerberg said his thoughts on Holocaust denial had changed: "With rising anti-Semitism, we're expanding our policy to prohibit any content that denies or distorts the Holocaust."
#Zuck says "I've struggled with the tension between standing for free expression and the harm caused by minimizing or denying the horror of the Holocaust."
"It wasn't until early 2018 that Facebook published even basic information on Facebook ads." - @wrklsshrd
"We do see ads that feel like they are over the edge, even in terms of Facebook’s policies, particularly during the pandemic about health misinfo. Trump is quite happy to say that 'I am immune to Covid' and 'I’ve been cured,' potentially putting people at risk." - @wrklsshrd
"We're certainly seeing ads running for a long time after Facebook's own policy says they should be taken down. Facebook says they've got a policy. They're not enforcing that policy. I find that really unclear." - @wrklsshrd