Uh... Doesn't Facebook permanently ban people all the time? I would have thought that was normal.
And does this kind of vindicate YouTube?
OMG wait you guys. Once that Florida law is in effect, will the FB Oversight Board decision be nullified (bc the law requires leaving up posts by candidates)?
Or... does the Board have to decide if that law is constitutional in order to determine its own authority??
I LOVE this.
Continuing with this anti-Marbury moment... I feel like Facebook's implicit question to the Board was, "if the suspension isn't indefinite, then how long should it be, or should it be permanent?"
Remanding to make Facebook decide that very thing feels like a major punt.
To be clear I HAVE NOT READ THE WHOLE DECISION. Snarking about it as I go is more fun. I'll read it eventually, probably.
Do they always opine on ten tangentially related issues like this?
(Example: Demanding better public disclosures about FB's policies for disclosing user data to law enforcement. I'm all for that but... it's not exactly at issue here.)
With great reluctance, I decided to save my copy of this Oversight Board decision in the "Cases and Statutes" folder. But then, in an apparent Freudian file-saving slip, I put it in "Secondary Lit." Where I think it will stay.
An oddity: while the FB Oversight Board's job, as I understand it, is to apply Facebook's actual policies (inflected by human rights norms), the referred question was instead about FB's "values."
This isn't a dig about values, just textual analysis.
Does the Board have authority to order Facebook to do things like this? Serious question, I am not an expert in the Board the way @Klonick or @evelyndouek are.
How about a parole hearing every two years before bored and hostile officials? Would that fix this?
OK update on this "values" question. Apparently under the Board's Charter, it has to apply three sets of standards. One is FB's "content policies," and a separate one is FB's "values." "Values" are listed in its Community Standards. Which... I think are its content policies.
The third is human rights standards. There's this interesting reference to FB's "corporate human rights policies," which muddies the water a little about what the source of authority is, but I *think* this is saying IHR law itself is the authority.
Oh, man. It's like this from the Trump team's "brief" was calculated to antagonize most of the Board.
Questions the Board asked, but Facebook declined to answer. You could make a whole law school exam -- or a panel -- out of these.
(If I were the Board I'd absolutely go fishing like this, too.)
Again with the confusion about the source of authority for applying international human rights standards.
Sorry, here's the list of questions FB declined to answer. I had the wrong image before.
And... the Board formally endorses the idea that for some speech, the appropriate remedy is limiting how many people see it.
I feel like that should have been briefed before they weighed in. Like, extensively briefed.
Oooh, a name check for the Berkeley Protocol for researcher access to data! @hrcberkeley @KAlexaKoenig
Back on the point about the Board endorsing demotion for certain speech... It seems pretty inconsistent to say that out of the blue, but then at the same time insist that the question actually presented cannot be answered unless Facebook first sets a clear policy to be reviewed.
• • •
The meme that “platforms algorithmically amplify polarizing content because engagement drives ad revenue” has gotten seriously out of hand. Someone needed to burst that bubble. It’s such a bummer that the someone is Facebook VP Nick Clegg. 1/ nickclegg.medium.com/you-and-the-al…
People will give his points less credence because of who they’re coming from. (And posting them on Medium, as if they were independent musings and not a message crafted by Facebook’s Comms and Policy teams, is an interesting branding gesture, but it isn’t going to fool anyone.) 2/
He’s saying so many things that are right, though. 3/
This part of Zuckerberg’s testimony is a feat of geopolitical dexterity. 18 months ago, Facebook lost a major case about global content filtering in the EU. So now it’s telling Congress that *every* platform should be held to the standard imposed on FB by European courts. 1/
Platforms are geopolitical vectors. They take laws, including speech laws, from one country and impose them everywhere else. 2/
Historically that meant exporting U.S. 1st Amendment values, to the dismay of countries with different constitutional systems. That’s reversed now. Platforms are net importers of more restrictive speech rules from other countries to the U.S. 3/
What are we talking about, when we talk about transparency? It's time for civil society, researchers, technologists, and others to figure that out -- before an unprecedented moment of political opportunity passes us by. A new blog post from me. cyberlaw.stanford.edu/blog/2021/03/s…
By the way, here was my list of questions I wanted content moderation transparency data to answer as of a couple of years ago.
Poland’s case arguing that the “upload filter” provisions of DSM/Copyright Directive Article 17 violate Internet users’ fundamental rights, which has been quietly sitting before the CJEU for months, is a ticking time bomb. 1/
One day the ruling is going to come out and change everything. Or maybe nothing. 2/
This thread is about that case, and how it relates to some other looming political and legal questions about filters. 3/
I respect the Facebook Oversight Board and wish them the best. But “sucked into the private power vortex” is a good description of what’s going on. Do we want to support and reinforce pseudo-governance with pseudo-rights protections designed by Facebook?
US lawmakers may be stuck accepting this kind of private governance (to go with our private prisons and private military contractors etc.) bc the 1st Am prevents Congress from setting the rules Internet users mostly want. Lawmakers’ hands are tied on this side of the Atlantic.