I want Facebook to refer its suspension of Trump's account to the @OversightBoard. You should too.

lawfareblog.com/facebook-overs…
absolutely nailed the spelling of "too" this time ☺️
@OversightBoard Checks and balances shouldn't exist only for decisions taken against the winds of public opinion. Facebook should allow oversight of its most high-profile and controversial content moderation decision yet.
In one sense, it's amazing to me that there haven't been more calls for this, or more discussion of it, given that this is kind of *precisely* where many envisioned the Board experiment heading
It shows how much of content moderation is still responsive to the Discourse, which I think is a problematic model of speech governance
Platforms should be forced to live up to the sentiments expressed in their fig-leaf rationales for the Great Deplatforming.

A Board referral is a good place to start.

theatlantic.com/ideas/archive/…

More from @evelyndouek

10 Jan
After thinking about it, I think Facebook should refer its Trump suspension to the @OversightBoard, and it should use the expedited process to do so.

You should to.
This is one of the most consequential and high-profile decisions in content moderation, and we can't trust that it was made on principle rather than out of business expediency.

This is *exactly* what the @OversightBoard and its expedited process are for. If not now, when?
The usual calls for @OversightBoard intervention have gone quiet, but we should want checks and balances even for decisions we agree with.
9 Jan
In two weeks, Trump will be out of power, but platforms won’t be. They should be forced to live up to the sentiments in their fig-leaf rationales.

My thoughts on the Great Deplatforming.

theatlantic.com/ideas/archive/…
The title suggests I'm calling for Mass Deplatformings, which is not my point at all. What I want is for platforms to live up to the myth they tell about content moderation: that their decisions are Principled and In The Public Interest, and that they will be consistent and contextual.
I believe there are speech interests at stake in the decisions platforms make. I don't buy the argument that, because these are private companies, we should just let them do whatever, whenever. We deserve better than that.

theatlantic.com/ideas/archive/…
7 Oct 20
I have literally no idea what Facebook's new policy is on QAnon or what it will apply to in future, and so I would like you to please read this post but replace "Twitter" with "Facebook"

lawfareblog.com/twitter-brings…
It seems pretty sui generis, which makes sense because no one is really sure what QAnon is. Not even many of its adherents

wired.com/story/qanon-su…
Hard not to think that the House condemnation played a role here, given the timing. I hope so: that seems a more accountable and democratic way for this to work. I wish that had been made explicit.

washingtonpost.com/powerpost/elec…
23 Aug 20
Watching Trump continually test platforms' voter suppression policies, instinctively trying to find ambiguities and loopholes, I'm always reminded of this @kevinroose piece, which to me will be a classic of this era: The President vs. The Mods

nytimes.com/2020/05/29/tec…
"if the mods are afraid to hold them accountable when they break the rules, they will keep pushing the limits again and again — until ultimately, the board is theirs to run."
As an Australian, as in all things, I'm in favor of a purposive interpretation of platform voter suppression and election misinformation policies, rather than a purely textualist one.
25 May 20
Lots of important pushback about misconceptions re: "bots" here yesterday, so I'm going to collect it in a thread (mainly so I can find it later)

Bot myths, like echo chambers & backfire effects, are an area where myth & reality don't match up, & will cause policy misfires /1
What caused this was an NPR story claiming *wild* bot figures that confirmed a lot of people's priors about social media, and so spread far and wide.

Here's a good thread breaking down some of the things wrong with that story /2

Twitter then (I mean, there's no other word for it) subtweeted the story here /3

13 Dec 19
This is an important and careful review of the Oversight Board, with robust recommendations that provide important markers now for how Facebook responds and develops the institution. THREAD on some of the things we should watch coming out of this: /1
One of the most important, echoing a constant theme in my own work, is that the "subject matter jurisdiction" (as I call it here ssrn.com/abstract=33653…) needs to expand over time beyond mere take-down/leave-up decisions if the Board is to be a meaningful check /2
A huge and welcome theme of the report is the need to focus on vulnerable groups, noting the difficulty, intersectionality and contextuality of determining this. Important to note that this (and other recommendations) is a recommendation for the Board, which Facebook cannot and should not control /3
