In light of the Gonzalez v. Google oral argument, I thought I'd re-up some of my views on the ways the Court could go. Put simply, the Court is evaluating what the text of Section 230(c)(1) actually says and does. A loaded question, indeed.
Let's start with the basics. Section 230(c)(1) says that "[n]o provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider."
Notice that there is no mention of the word "immunity" in the text. All the statute says is that we cannot treat an "interactive computer service" provider, in this case Google's YouTube, as the publisher or speaker of a third-party post, such as a YouTube video. That's it.
Warped interpretations from various courts, starting with the 1997 Fourth Circuit case Zeran v. AOL, have radically read Section 230(c)(1) as providing tech companies with almost total immunity in civil litigation.
Current interpretations of Section 230 protect tech companies even when they commit torts, violate various civil statutes, or fail to adhere to their own user agreements. See law.justia.com/cases/californ…
These immunities aren't written in the statute; they are court-created. Worse, there isn't a credible textual argument that would support these judicial interpretations. The Court in Gonzalez can rectify this by sticking to the statute.
Here are some options. One option, advanced by several senators, is that tech companies should only be protected from causes of action that target a speaker or publisher, such as defamation suits—as opposed to protecting them from enforcement actions via federal civil statutes.
Another option would be to shield companies from liability for hosting and displaying content, but hold them responsible when they take actions beyond those of a traditional publisher, such as when they algorithmically push certain content to users.
Yet another possibility would be to allow this case to proceed and hold Google liable as a distributor of the illegal content, the same way mail couriers or newspaper stands would be. Neither the text nor the structure of Section 230 suggests that the Court can't do so.
If the Court finds that Google can be held liable under a distributor theory, it should remand the case for the parties to argue whether Google knew, or should have known, that ISIS was using its platform as a recruiting tool.
But in any case, the statute does not support the current predominant reading of Section 230(c)(1), which shields tech companies from practically all civil liability when third-party content is at all implicated. That reading is detached from basic statutory construction.
Courts need to refrain from following a standard formula in #Section230 analysis: unsure of what tech companies actually do, and seeing only that a case involves a website's or app's interaction with third-party content, they simply grant sweeping immunity.
Former Chief Judge Robert Katzmann of the Second Circuit admitted as much in his opinion concurring in part and dissenting in part in Force v. Facebook, critiquing the majority when his court granted full immunity to Facebook on claims and circumstances similar to those in this case. law.justia.com/cases/federal/…
In sum, if we're all equal under the law and our courts are the great equalizers of our democracy, then justice demands no less than a faithful reading of Section 230(c)(1).
In Gonzalez v. Google, SCOTUS has a chance to clarify #Section230's meaning. Courts interpret Section 230 as shielding #BigTech from practically all civil liability when 3rd party content is at issue. I argue that nothing in the text supports that reading. newsweek.com/gonzalez-v-goo…