This is one of the most consequential and high-profile decisions in content moderation, and we don't trust that it was made on principle rather than out of business expediency.
This is *exactly* what the @OversightBoard and its expedited process is for. If not now, when?
The usual calls for @OversightBoard intervention have gone quiet, but we should want checks and balances even for decisions we agree with.
The strongest argument against referral is that this is a sui generis decision with no general applicability, so why bother.
I don't think that's true.
First, there will always be the question of how to balance the public interest in holding leaders to account against safety interests. Sadly, this will not be the last time the issue comes up.
Next, this decision raises a very important question: how far should Facebook look beyond its platform, to the surrounding social context, in making its decisions?
Then there's the issue of remedy: the "indefinite," open-ended suspension is strange. It creates extraordinary uncertainty, and we're left waiting to see how Zuck feels when Trump's term is over. That's a pretty shitty way for this to play out!
The @OversightBoard was created to be a check and balance on Facebook's decision-making processes. It was created to give Facebook's decisions legitimacy, and make them more than just "Mark decides".
And so Mark should decide to refer this decision to them.
It would be a way for Facebook to show that this was exactly the kind of principled decision-making they said it was in their fig-leaf rationale. We should make them live up to those sentiments, not only when it suits us.
@OversightBoard Checks and balances shouldn't exist only for decisions taken against the winds of public opinion. Facebook should allow oversight of its most high-profile and controversial content moderation decision yet.
The title suggests I'm calling for Mass Deplatformings, which is not my point at all. What I want is for platforms to live up to the myth they tell about content moderation: that their decisions are Principled and In The Public Interest, that they will be consistent and contextual.
I believe there are speech interests at stake in the decisions platforms make. I don't buy the argument that because these are companies, we should just let them do whatever, whenever. We deserve better than that.
I have literally no idea what Facebook's new policy on QAnon is or what it will apply to in the future, so please read this post but replace "Twitter" with "Facebook".
Hard not to think that the House condemnation played a role here, given the timing. I hope so: that seems a more accountable and democratic way for this to work. I wish that had been made explicit.
Watching Trump continually test platforms' voter suppression policies, instinctively trying to find ambiguities and loopholes, I'm always reminded of this @kevinroose piece, which to me will be a classic of this era: The President vs. The Mods
"if the mods are afraid to hold them accountable when they break the rules, they will keep pushing the limits again and again — until ultimately, the board is theirs to run."
As an Australian, as in all things, I'm in favor of a purposive interpretation of platform voter suppression and election misinformation policies, rather than a purely textualist one.
This is an important and careful review of the Oversight Board, with robust recommendations that provide important markers now for how Facebook responds and develops the institution. THREAD on some of the things we should watch coming out of this: /1
One of the most important, echoing a constant theme in my own work, is that the Board's "subject matter jurisdiction" (as I call it here ssrn.com/abstract=33653…) needs to expand over time beyond mere take-down/leave-up decisions if the Board is to be a meaningful check /2
A huge and welcome theme of the report is the need to focus on vulnerable groups, noting the difficulty, intersectionality and contextuality of determining who they are. Important to note that this (& other recommendations) is a recommendation for the Board, which Facebook cannot and should not control /3