I've been telling people for months that the Oversight Board may surprise them. Well, surprise!
Facebook's Oversight Board invokes the Rabat Plan of Action. In case you're wondering what that is, you can start here. article19.org/resources/arti…
The Rabat Plan of Action requires that *state* decisions to restrict incitement to hatred, while still respecting freedom of expression and freedom of religion, look at: (1) context, (2) status of the speaker, (3) intent, (4) content and form, (5) extent of reach, and (6) imminence of harm.
(One may note at this point that Facebook is not a "state." But it's not crazy that the Oversight Board "drew upon" the Rabat Six Factors to evaluate not-a-state Facebook's actions with regard to Trump. Implicit acknowledgement that FB's decisions have real-world speech impacts.)
It's not surprising that Trump's speech, both on Jan. 6 and generally in the post-election period, flunks the Rabat Six Factors test. Trump defenders have already claimed that Trump's intent (one of the factors) can't be known. But the OB rightly says this isn't relevant.
(Not to put too fine a point on it, but there's no serious disputing that Trump "should have known" his public statements probably would incite violence.)
The Oversight Board frequently notes that "a minority" of the Board sees more than one path to support restrictions on Trump's speech and account access on the platform, but forbears from deciding whether those other paths are valid because it's not necessary to the decision.
Strictly speaking, a U.S. legal tribunal may, at its own discretion, outline alternative reasoning for its results, but the Oversight Board is hyperconscious that it's institution-building here. So the OB doesn't want to seem to be overplaying its hand. tl;dr, the OB wants to earn your trust.
(U.S. law allows, of course, that Facebook is within its rights to ban Trump "indefinitely," but the Oversight Board correctly notes more than once that the standards FB applies to its decisions to restrict or ban are unclear or inconsistently applied. FB could do better.)
The Oversight Board tastes a little bit of Marbury v. Madison independence and kind of likes it. "We have our job," the OB effectively says (I'm paraphrasing), "but you need to do your job, Facebook."
Some of us Oversight Board-watchers had wondered whether the OB would restore Trump to the platform but also require FB to set standards that, if Trump violates them in the future, could lead to further restrictions and/or a permanent ban. The OB picks up on that here.
Many members of the Oversight Board are profoundly aware that Marbury v. Madison established the Supreme Court's independence *and* its power of judicial review *by refusing to do--and finding unconstitutional--what Congress had expressly asked it and authorized it to do*.
(For Supreme Court students, the Judiciary Act of 1789 expanded the Supreme Court's "original jurisdiction"--to hear cases directly rather than on appeal. But the SC noted that the Constitution didn't give Congress the power to do that through mere legislation.)
In contrast to the Supreme Court, however, the Oversight Board is empowered to offer what lawyers would call "advisory opinions," and FB asked the OB to do just that. The Oversight Board has some thoughts.
(In short, if you ask the OB for advice, they may give you advice you don't necessarily want to hear, and they may even point out that you asked the wrong questions to start with. This is institution-building behavior too, incidentally.)
And here's the Oversight Board giving FB not-too-subtle instruction on how it might go about setting policy standards that lead to more consistent and predictable (and less assailable) results.
"Since you asked for our advice ... well, have a seat." Among other things, the Oversight Board says the inflammatory speech of political leaders (and ev en people not holding office--hi, Donald!) should be rapidly escalated for top-level review and quick disposition.
Note that this is not saying all users should be held to the same process--instead, it's saying that powerful political voices posting inflammatory content should get *accelerated* process. Doing so both recognizes the importance of political speech and the need to limit harm.
This is a different approach from the idea that political figures should face the same process as all the little folks who get banned/restricted routinely. It's acknowledging that officeholders (and ex-officeholders) have more potential to do harm, but their speech is important.
If you've cleaned your plate so far in devouring the Oversight Board's opinion today--and made sure to eat your greens--here's your dessert. tl;dr, FB has a lot of work to do, including review of policies, processes, and, yes, programming that got us here.
'This case highlights further deficiencies in Facebook’s policies that it should address. In particular, the Board finds that Facebook’s penalty system is not sufficiently clear to users and does not provide adequate guidance to regulate Facebook’s exercise of discretion.' -30-
I hope someone threadreaderapps me.
@jonibrennan @dangillmor @dak3 @mmasnick This is fairly straightforward. Zuckerberg would like bright-line rules that define what Facebook and other social-media platforms should and shouldn't be doing. Whatever that consensus is, Facebook knows it can afford to pay to meet the standard. 1/x
@jonibrennan @dangillmor @dak3 @mmasnick If the standard is difficult for current competitors to meet, or new startups to meet, that's not Zuck's problem. He just wants the reflexive and frequently contradictory complaints about Facebook to go away on the one hand, and to preserve market dominance on the other. 2/x
@jonibrennan @dangillmor @dak3 @mmasnick But what Zuckerberg wants (a consensus about the rules so he can just meet the consensus standards and then relax) can be difficult for multinational companies like Facebook. Harmonizing content standards between the U.S. and the EU is hard; adding, say, Thailand and India is harder. 3/x