Poland’s case arguing that the “upload filter” provisions of DSM/Copyright Directive Article 17 violate Internet users’ fundamental rights, which has been quietly sitting before the CJEU for months, is a ticking time bomb. 1/
One day the ruling is going to come out and change everything. Or maybe nothing. 2/
This thread is about that case, and how it relates to some other looming political and legal questions about filters. 3/
Like almost all intermediary liability cases, the Polish CJEU case about upload filters has the wrong parties before the court. That affects the information and arguments that courts hear, and often distorts outcomes – especially when it comes to fundamental rights. 4/
Usually the absent parties in platform liability cases are the *users* whose rights to privacy/data protection, expression/information, and more will be affected by the ruling. (Or, their rights are affected by how the ruling causes platforms to change their practices.) 5/
The fundamental rights issues are hard. It is not realistic to expect courts to grasp them without being briefed. And no one has reason to do that well in cases involving only (1) the person harmed by online content and (2) Facebook, YouTube, or some other platform company. 6/
I wrote up the user fundamental rights issues with upload filters, as I see them, in this article about the CJEU’s last filtering case, Glawischnig-Piesczek v. Facebook Ireland -- academic.oup.com/grurint/articl… 7/
The Polish case has really unusual participants, though. The Polish government (nominally a stand-in for affected Internet users) is the claimant. The Court is also hearing from the EU Commission, the Parliament, and other govt entities. 8/
So everyone is in some sense speaking for the public interest. But there is no concrete dispute, no victim talking about real harms or platform talking about real technical capabilities, burdens, etc. The absence of those flesh & blood (or paper & money) parties is a loss, too. 9/
There’s a fascinating write-up of what Poland, the Commission, and others said during oral argument here: communia-association.org/2020/11/12/cje… 10/
Meanwhile, the debate about upload filters between © holders and platforms is playing out in slow motion, as a political matter, all over Europe. The deadline for nat'l implementation of Article 17 is in June. Countries must pass laws before they know what the CJEU will say. 11/
Here is Article 17, BTW: eur-lex.europa.eu/legal-content/… 12/
The countries that take their jobs seriously are having an incredibly hard time. Article 17 gave them an impossible mandate: make platforms prevent infringement and simultaneously protect lawful uses, at massive scale. 13/
@paul_keller has been doing amazing work describing all this on the Communia and Kluwer blogs. Here's his write-up of the current state of play. copyrightblog.kluweriplaw.com/2021/01/21/div… 14/
France, by the way, is not having a hard time. It solved the Article 17 problem by ignoring the part about lawful use: it literally did not implement it. copyrightblog.kluweriplaw.com/2021/01/28/art… 15/
Germany and Finland have come up with fairly intricate mechanisms to try to achieve both these goals at once. They are having the conversations about real world implementation and platform operations that should have happened *before* EU lawmakers passed Article 17. 16/
And @Senficon and Joschka Selinger recently wrote two fascinating posts about Art 17 objections based on fundamental rights of businesses, which have spillover effects for users' rights. copyrightblog.kluweriplaw.com/2021/01/18/art… 17/
The Commission has held ongoing sessions to try to find a way to thread this needle. The release date for their report keeps slipping. But they, too, will have to speak before knowing how the Court rules in the Polish case. 18/
Whatever the CJEU says about upload filters and fundamental rights will matter for some other, major areas of European law and fundamental rights, as well. 19/
The Polish case will matter for the Digital Services Act, which as of now does little to change the status quo under eCommerce Directive Article 15 and the case law interpreting it, including Glawischnig-Piesczek. 20/
The Polish case will matter for the Terrorist Content Regulation. That’s basically finalized and formally does not require upload filters. But both authorities’ ability to de facto pressure for filters, and platforms’ ability to use filters voluntarily, could be affected. 21/
The Polish case will matter for the pending EU law on platforms and child sexual abuse material (CSAM). Filters for CSAM are widely seen as the ones least likely to raise serious fundamental rights issues, bc of the unique harms and lack of contextually-lawful uses. 22/
Somewhere, in an office in Luxembourg or maybe an apartment out there in the pandemic diaspora, someone is thinking hard about the Polish case. The next step is the AG’s opinion – it is due April 22. 23/
No pressure, guys. But this one's a really big deal for all of us. 24/24
Thread by Daphne Keller