Super interesting: the German District Court Frankfurt (2-03 O 188/21) rules on filter obligations for social networks in defamation cases (Künast v. Facebook).
In a nutshell:
Facebook loses and must take proactive measures to prevent (identical + equivalent) defamation.
1/X
Court basically argues: After the claimant (MP Künast) notified Facebook about one instance of a defamatory post, the law didn’t require her to search, Sisyphus-like, for every re-share or re-upload. Instead, Facebook was obliged to proactively prevent copies and re-appearances.
2/X
Court argues that this also includes prevention of “similar content” (different wording, different layout, hidden pixels). Even if this might require human review, the court holds, that doesn’t make the (de facto filter) obligation disproportionate.
3/X
Procedural argument: Facebook hadn’t explained substantively enough that such efforts could be disproportionately burdensome (the burden of proof here lies with the provider, as the German Federal Court of Justice clarified years ago for hosts).
4/X
The ruling is important. Filter obligations for defamation were the subject of the ECJ ruling in Glawischnig v. Facebook. The ECJ was a bit vague; its ruling could be interpreted as not allowing traditional proportionate obligations (instead only allowing what automated means could do fool-proof).
5/X
The Frankfurt court now reaffirms that Glawischnig v. Facebook should not be interpreted to restrict existing understanding of obligations of intermediaries.
This is what can be drawn from the press statement of the Court (reasons not yet published).
6/X
Definitely a good outcome for victims of digital violence. And Facebook’s management can be happy too: it can now better explain at CEO level why moderation and trust & safety need more resources.
For those worried about too strict provider liability:
7/X
Yes, the ruling implies a flexible reading of Art. 15 ECD. That does not mean all services have to fear strong obligations. On a case-by-case basis, weighing fundamental rights, courts can be more provider-friendly (e.g. for platforms with fewer resources, less involved in curation than FB).
8/X
FB will probably appeal; the case may end up at the ECJ.
All the more important, however, that the #DSA does not change the underlying (as you see: still evolving) European law in a way that makes things harder for victims of digital violence.
See here Nr. 2: hateaid.org/wp-content/upl…
9/9
Claimant @RenateKuenast, her attorney @Anwalt_Jun and the NGO #HateAid (which supported the claimant here) will probably soon describe in more detail what the Frankfurt court explained today.
One more aspect of the decision, which is also (nearly) groundbreaking: the court also ordered FB to pay EUR 10,000 in monetary damages (a kind of immaterial damages for not acting sufficiently against the defamation). This is also a hot topic (to my knowledge, only one court had ruled similarly before).
And as a last insight: the ruling implies that the Frankfurt court did not treat a court order as a (constitutive) precondition for filter obligations ... this is not surprising from a continental-law perspective, but some scholars had interpreted the ECJ in Glawischnig that way.
Thread add-on: this landmark decision didn’t happen by accident. Claimant’s attorney @Anwalt_Jun has been fighting for this direction for a while; he represented a claimant in a similar case (the first such defamation case against FB) in 2017, which was lost for procedural reasons ...
#HateAid is an innovative NGO fighting digital violence. Claimant MP Künast, who has herself been heavily targeted by hate speech (it’s a pattern: female + outspoken = exponentially attacked online), is a leading political voice in this field, ...
and, as we see now, also takes personal action. (One reason this decision comes as a surprise in 2022, while IP rightsholders secured the same kind of litigation victories 20 years ago, is that in hate speech / defamation only few victims have the resources to sue.)
Thesis: the decision may astonish now, but in some years it might be remembered as the beginning of a long way back to "normal": platforms have to internalize the externalities they cause (here: FB’s amplification and anonymous use as risk factors for personality-rights infringements).
Concerns for freedom of expression are of course legitimate, but I think it’s important: the case law we see now allows the flexibility to rule less strictly against other kinds of platforms (less amplifying, less engagement-driven, smaller, with more efforts against repeat infringers, and so on).
Art. 15(1) of the E-Commerce Directive is THE decisive rule for the extent to which you might successfully sue Facebook & Co. to filter/prevent infringing content.
The topic is relevant these days for two reasons. First, we might see another landmark ruling: on 8 April 2022 ...
🧵1/X
.. a German district court will rule on filter obligations in the prominent proceeding Künast v. Facebook (important since in this area we have one ECJ ruling that is somewhat vague, and few national courts get the chance to move the interpretation forward).
2/X
The other reason we should pay attention is the ongoing #DSA negotiations. Lawmakers are split on whether to change the rules here, with the EP wanting to make them more industry-friendly (not good!).
Sidenote: Art. 7 is way more important than many other things in the DSA.
3/X
Thread: Politicians want to ask #Google and #Apple to remove #Telegram from their app stores (#BMI #Faeser wants to appeal to their “societal responsibility”; Lower Saxony minister #Pistorius wants to “urgently talk to them and convince them …”). 1/X
2/X:
Informal administrative action is a good topic for several doctoral theses. In any case, it sometimes leaves more, sometimes less of an aftertaste. Here I don’t find it convincing:
3/X:
(1.) Under current law, Apple and Google are under no obligation to delete the app.
(2.) Arrangements with Big Tech come with a particular caveat: Big Tech is often “cooperative”, even though it could also dig in its heels ...