@OversightBoard Checks and balances shouldn't exist only for decisions taken against the winds of public opinion. Facebook should allow oversight of its most high-profile and controversial content moderation decision yet.
In one sense, it's amazing to me that there haven't been more calls for this, or more discussion of it, given that this is kind of *precisely* where many envisioned the Board experiment was heading
It shows how much of content moderation is still responsive to the Discourse, which I think is a problematic model of speech governance
Platforms should be forced to live up to the sentiments expressed in their fig-leaf rationales for the Great Deplatforming.
This is one of the most consequential and high-profile decisions in content moderation, and we can't trust that it was made on principle rather than business expediency.
This is *exactly* what the @OversightBoard and its expedited process is for. If not now, when?
The usual voices calling for @OversightBoard intervention are quiet, but we should not want checks and balances only for decisions we agree with.
The title suggests I'm calling for Mass Deplatformings, which is not my point at all. What I want is for platforms to live up to the myth of content moderation they tell, that their decisions are Principled and In The Public Interest; that they will be consistent and contextual.
I believe there are speech interests at stake in the decisions platforms make. I don't buy that these are companies so just let them do whatever, whenever. We deserve better than that.
I have literally no idea what Facebook's new policy is on QAnon or what it will apply to in future, and so I would like you to please read this post but replace "Twitter" with "Facebook"
Hard not to think that the House condemnation played a role here, given timing. I hope so: that seems a more accountable and democratic way for this to work. I wish that had been made explicit.
Watching Trump continually test platforms' voter suppression policies, instinctively trying to find ambiguities and loopholes, I'm always reminded of this @kevinroose piece, which to me will be a classic of this era: The President vs. The Mods
"if the mods are afraid to hold them accountable when they break the rules, they will keep pushing the limits again and again — until ultimately, the board is theirs to run."
As an Australian, as in all things, I'm in favor of a purposive interpretation of platform voter suppression and election misinformation policies, rather than a purely textualist one.
This is an important and careful review of the Oversight Board, with robust recommendations that provide important markers now for how Facebook responds and develops the institution. THREAD on some of the things we should watch coming out of this: /1
One of the most important, echoing a constant theme in my own work, is that the "subject matter jurisdiction" (as I call it here ssrn.com/abstract=33653…) needs to expand over time beyond mere take-down/leave up decisions, if the Board is to be a meaningful check /2
A huge and welcome theme of the report is the need to focus on vulnerable groups, noting the difficulty, intersectionality and contextuality of determining this. Important to note that this (and other recommendations) is addressed to the Board, which Facebook cannot and should not control /3