The CJEU's ruling in Glawischnig-Piesczek is out, and it is basically the worst case scenario. curia.europa.eu/juris/document… 1/
EU Member State courts can order platforms to use automated filters to block “identical” or “equivalent” uploads. Orders can have global effect and suppress legal expression in other countries, unless Austrian courts find some other reason in national law to stay their hand. 2/
TL;DR: It’s open season for courts to mandate made-up technology without knowing what that technology will do. And those mandates can apply to the whole world. 3/
I wrote about the issues in the case here cyberlaw.stanford.edu/blog/2019/09/f…. 4/
Aside from substance issues, there is a major process issue: this ruling affects billions of Facebook users, but they were not represented in court. Important legal arguments about their rights were simply not raised or considered. 5/
I thought the ruling would provide some wordy but unclear guidance about when a court can order filters or global takedown. But it doesn’t even do that. 6/
The section that OKs global takedowns (or at least says the law at issue here doesn’t bar them) is almost comically short. Here's a screenshot. That's it!
@DanSvantesson talks about that part here: linkedin.com/pulse/bad-news… 7/
@DanSvantesson The filtering discussion though… Wow. Where to start? I’ll go with fundamental rights. 8/
@DanSvantesson The CJEU has repeatedly said in the past that filtering or monitoring by platforms can harm users’ rights to privacy and to free expression/information. So have int'l human rights bodies. So has the ECtHR (re expression rights and strict platform liability). 9/
@DanSvantesson Here, the Court does not even mention or consider those things. Its analysis only considers whether the filtering order might burden *Facebook*. (44, 46) Not a word about how the filter might affect the rights or interests of Facebook’s billions of users. 10/
@DanSvantesson How might this affect other Facebook users? NO ONE KNOWS. We can’t, because it is entirely unclear what technology the court thinks Facebook is being ordered to build. I talk about the different technologies this order might cover at pp 19-23 here cyberlaw.stanford.edu/files/Dolphins… 11/
@DanSvantesson Not knowing what technology this mandates is a huge problem. It makes assessing fundamental rights impact impossible. 12/
@DanSvantesson Is this a filter that will block too much expression (hurting other users’ rights)? Too little (hurting the plaintiff’s)? Will the filter specifically impact data protection or privacy interests? The court lacks any information to address those questions. 13/
@DanSvantesson The court could have said "filters affect fundamental rights, and we can't assess that or balance the interests at stake without better facts, so here are factual questions for Member State courts to consider in filtering cases." Instead, it just left rights out of its analysis. 14/
@DanSvantesson This filter might, per the lower court, detect images of the plaintiff paired with specific words. Ordering FB to do that means ordering FB to run biometric facial recognition scans on people who have nothing to do with this case, across a reported 300 million photo uploads daily. 15/
@DanSvantesson It’s hard to overstate the privacy / data protection issues the filtering order will raise if it requires use of facial recognition technology. But apparently no one raised that point to the court. 16/
@DanSvantesson Or maybe this is just a filter for words. The specific words deemed defamatory in this case were “lousy traitor” (miese Volksverräterin), a “corrupt oaf” (korrupter Trampel), and “fascist party” (Faschistenpartei). 16/
@DanSvantesson A filter that blocked those words would shut down a vast range of legal expression. en.wikipedia.org/wiki/Scunthorp… 17/
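A hypothetical sketch of why a word filter overblocks (the Scunthorpe problem the linked article describes): naive substring matching on the terms at issue here would also catch news reporting and commentary *about* the case. The `naive_filter` function and the sample posts are illustrative assumptions, not anything from the ruling.

```python
# Hypothetical illustration of the Scunthorpe problem: a naive substring
# filter built from the terms deemed defamatory in this case.
BLOCKED_TERMS = ["volksverräterin", "korrupter trampel", "faschistenpartei"]

def naive_filter(post: str) -> bool:
    """Return True if the post would be blocked by a substring match."""
    text = post.lower()
    return any(term in text for term in BLOCKED_TERMS)

# The original defamatory post is caught...
assert naive_filter("Sie ist eine miese Volksverräterin")
# ...but so is a news report quoting the ruling:
assert naive_filter(
    'Court holds that calling a politician a "miese Volksverräterin" is defamatory'
)
```

The same match fires regardless of context, so quotation, reporting, parody, and criticism are all suppressed along with the original statement.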
@DanSvantesson You can imagine other variants on the filter's technical design, but the court (and AG) didn't talk about those. That's the whole problem. The court doesn't consider what technology the order covers, and what unintended consequences come with that particular technology. 18/
@DanSvantesson Might this be a filter just for re-shares of the specific original post? It would be nice to think so. Unfortunately the Court – going beyond what was even asked – says FB must block any content with the same “message”. 19/
@DanSvantesson Then it says FB can’t be required “to carry out an independent assessment of that content.” That is some serious magical thinking about what technology can do – detect the *meaning* of newly worded speech, without anyone at FB assessing it. 20/
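To make the gap concrete, here is a hypothetical sketch (my own illustration, not anything in the ruling) of the one filtering technique that genuinely needs no human assessment: exact-match fingerprinting. It catches identical re-shares, and nothing else – any rewording of the same “message” sails through, which is exactly the part the Court hand-waves.

```python
import hashlib

def fingerprint(post: str) -> str:
    # Exact-match filtering: hash the lightly normalized text.
    # This is the only kind of matching that requires no judgment call.
    return hashlib.sha256(post.strip().lower().encode("utf-8")).hexdigest()

original = "X is a lousy traitor."
reshare  = "X is a lousy traitor."          # byte-identical re-share
reworded = "X betrays everyone, the lout."  # same "message", new words

assert fingerprint(reshare) == fingerprint(original)   # caught
assert fingerprint(reworded) != fingerprint(original)  # escapes
```

Anything beyond this – catching the reworded post – means classifying the *meaning* of new speech, which is precisely the “independent assessment” the Court says Facebook can’t be required to carry out.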
@DanSvantesson We should be clear: while filters will block lawful speech, including re-use of old content in new contexts for news reporting, parody, etc., this is not just a free expression issue. It is also a fair process, privacy, and equality issue. 21/
@DanSvantesson Filters that purport to detect "sentiment" or "meaning" of expression hurt members of racial and linguistic minorities in particular. The studies keep rolling in on this. homes.cs.washington.edu/~msap/pdfs/sap… cdt.org/files/2017/11/… 22/
@DanSvantesson Big picture: this ruling puts more pressure on lawmakers in Brussels to do the right thing. The court made a big mess. But there are opportunities to fix it - and opportunities to make it worse. EU policymakers have a big job ahead of them.
Over and out. 23/
@DanSvantesson For someone who tweets long threads a lot, I am remarkably bad at making the threading work. Tweets 17-23 in the thread are here
@DanSvantesson A few other interesting points from comment exchanges:

1. This creates massive business uncertainty for small platforms. Being told "don't worry, you may never face such an order and it will be proportionate" will not reassure investors.
@DanSvantesson 2. The deafening silence on expression and information rights -- which were squarely raised in the case and are core to the court's precedent -- suggests they just couldn't get to consensus. So now the legislature gets to do it.
@DanSvantesson 3. One saving grace here -- and it comes from the eCommerce Directive -- is that at least a court has to issue the order. Private actors can't just demand them. Though they can go to court and get them very fast with very little analysis, as this case shows.
@DanSvantesson 4. I keep saying this but it bears repeating. The lack of real data protection analysis in this case is appalling. My own publication on it only scratches the surface. I hope to see work on this from more people.
Thread by Daphne Keller.