"Content-moderation decisions are momentous but they are as momentous as they are bc of fb’s engineering decisions & other choices that determine which speech proliferates... & in what context [users] see it"
I think it's unlikely the @OversightBoard will take up the recommendation to refuse to answer the question abt Trump's account until fb commissions & publishes a study abt the lead-up to Jan. 6
(Altho I think it should, and likely will, recommend such a study in its decision)
But what if they do? A fun hypothetical for this wannabe law professor to imagine.
Things I'm curious about: 1. If the decision goes against the public comments, will that discourage future participation? 2. The level of international engagement. 3. Whether future overseas cases can also garner such considered engagement (I sure hope so!)
Really starting to regret not putting my comment on letterhead...
The US has a rich tradition of seeing the 1A as existing to facilitate democracy and self-government. Australia drew on that thinking in implying a freedom of political communication into its Constitution, which, famously, has no right to free speech.
During the same period (as Emily documents, drawing on @glakier's work), the US itself moved away from that tradition, adopting an increasingly libertarian view of the 1A instead.
absolutely nailed the spelling of "too" this time ☺️
@OversightBoard Checks and balances shouldn't exist only for decisions taken against the winds of public opinion. Facebook should allow oversight of its most high-profile and controversial content moderation decision yet.
This is one of the most consequential and high-profile decisions in content moderation, and we don't trust that it was made on principle rather than out of business expediency.
This is *exactly* what the @OversightBoard and its expedited process is for. If not now, when?
The usual calls for @OversightBoard intervention are quiet this time, but we should want checks and balances even for decisions we agree with.
The title suggests I'm calling for Mass Deplatformings, which is not my point at all. What I want is for platforms to live up to the myth they tell about content moderation: that their decisions are Principled and In The Public Interest, that they will be consistent and contextual.
I believe there are speech interests at stake in the decisions platforms make. I don't buy the argument that, because these are private companies, we should just let them do whatever, whenever. We deserve better than that.