This is a strong, important & timely decision, esp in light of Facebook's enforcement mistakes around #ResignModi yesterday.

There are some really important recommendations here. Facebook has 30 days to respond, and that'll be worth watching.

A few things worth highlighting /1
The case concerned a video that was critical of Modi & the BJP. Facebook removed it under its Dangerous Orgs policy. It was a mistake, and after the Board selected the case but before it decided, fb admitted it got it wrong and reversed the decision. /2
This is not the first time the Board has done this. It's telling Facebook "you can't moot a case by simply reversing decisions when we pick a case. You wouldn't have found the mistake otherwise." /3
This was a case of human error. It was flagged for terrorism, a human looked at it and removed it. Facebook said the length of the video (17 mins) and the complexity of the content could have contributed to the mistake—reviewers don't always have time to watch videos in full. /4
The user's account was automatically locked, and they did not get a review. Facebook said this was because of reduced appeal capacity during COVID. The Board says "yeah, we get that, unusual times etc. but you really need to get back to full capacity asap" /5
The Board was very conscious of the political context. It expressed concern about mistakes that especially impact minority language speakers or religious minorities, and noted that the political context in India right now "underscores the importance of getting decisions right" /6
The Board asked for, and Facebook refused to provide (on the basis that it was not "reasonably required" or for legal reasons), answers to the Board's qs re: possible communications from Indian authorities. It's possible fb is legally restricted from doing so, but still (!) /7
The Board acknowledges that "mistakes are inevitable when moderating content at scale" but that without knowing error rates it's impossible to tell from one case whether this is a systemic problem or a one-off. /8
This is real progress from the Board!! Its previous decisions were WAY too case-specific, & didn't reckon with the way CoMo is different from offline speech cases.

I talk about why that's important here (columbialawreview.org/content/govern…) & am stoked to see the Board talking about this /9
The recommendations then.

1. Fb should translate its Community Standards into Punjabi because, you know, 30 million ppl speak it in India and more around the world.

Uh... INSANE this had to come from the Board, but here we are. Geez, fb. /10
2. Fb should restore human review to pre-pandemic levels asap, while protecting health of staff.

Yes. And (this is me now) fb in its response really should disclose how far off those levels it is, and its timeline and plan for getting back to full capacity. /11
(me still) This has been going on for a while now, and looks set to continue in certain areas for even longer. There's only so long a company with the resources of fb should be able to keep pleading "pandemic" when the "pandemic" also makes adequate CoMo even more important /12
3. Finally, the Board says fb should increase public info on "error rates"

😊😊😊 this recommendation is like Christmas come early for me, and is just obviously the way the conversation about CoMo needs to go

(Again, sorry, can't resist: columbialawreview.org/content/govern…) /13
The Board says fb shd do this by "making this info viewable by country and language" & "underscores that more detailed transparency will help the public spot areas where errors are more common, incl potential specific impacts on minority groups, & alert fb to correct them." /14
These recommendations are targeted, strong, and important. They are not binding. How fb responds in 30 days is the most critical part of this process (& also the part ppl tend to pay least attention to). /15
There's a lot of (in many respects, justified!) skepticism of the Board. It can't fix everything. But these recommendations show why I'm still hopeful it can make some meaningful impact. If fb plays ball. /16
As always, we're tracking this on the @lawfareblog FOBblog, and will be watching next steps.

The team @tia_sewell, @jacob_r_schulz and @qjurecic get this stuff up before you can say FOBblog 3 times fast. /17

lawfareblog.com/welcome-fob-bl…
Okay, I think that's it from me. The decision is worth reading.

Your move, Facebook. /18
Actually, one more thing. This mistake happened bc the human reviewer did not have time to watch the whole video & make a considered decision.

Yesterday, the EU Parliament approved a reg requiring platforms to remove terrorist content w/in 1 hour.

edri.org/our-work/europ…
