Daphne Keller
Sep 16, 2020 · 16 tweets
In platform content regulation, operational logistics are everything. That’s cropping up, inevitably, as the EU copyright filter wars heat up again this fall. 1/
You may recall that Copyright Directive Article 17 “resolved” the tension between rightsholders’ and users’ rights by simply mandating the impossible: platforms must “prevent” infringing uploads while simultaneously “in no way affect[ing]” legitimate uses. 2/
No filter in the world can actually achieve both those things. Even filter vendors told the Commission that. infojustice.org/archives/41930 3/
And PS filters + human review won’t do a good job either. Content moderators, even with great training, won’t know licensing history or nuances of every national law. And requiring lots and lots of human review = entrenching the incumbents who can afford that. 4/
EU lawmakers passed the buck to the Commission to issue Guidelines clarifying Article 17’s impossible “block infringement and don’t block lawful use” task. That’s what the fight is about now. 5/
Since filters CAN’T perfectly enforce the law, the dispute has inevitably moved to “who wins by default when filters fail?” The operational logistics of real-world content moderation are entering the policy discussion about three years too late. 6/
If it’s not clear whether the user is lawfully sharing expression and information, but a filter catches their post, does the post (a) come down or (b) stay up? Are users guilty until proven innocent, or innocent until proven guilty? 7/
Defaults are everything. 8/
DEFAULTS ARE EVERYTHING. 9/
There is a pretense that the default answer will be temporary, because later on users or rightsholders will appeal and we’ll all arrive at the right answer. That’s a fantasy. 10/
Research suggests users don't really use counternotice. (Note the DMCA has special barriers, so things might be slightly better in Europe, but these numbers are *really* not encouraging.) 11/ cyberlaw.stanford.edu/blog/2017/10/c…
Anecdata suggests that the users who do appeal are often not the ones with legit claims. (Ask people who work in Trust and Safety.) 12/
Human rights advocates say appeals for *uploaders* can't address the right to receive information for listeners, like orgs that gather evidence of human rights abuses on the Internet. 13/ blog.witness.org/2019/01/witnes…
Anyhow, here are the latest sallies in the fight over who wins by default when the filters fail. Rightsholders say: if it *might* be infringing, take it down and let the users appeal. 14/ politico.eu/wp-content/upl…
User rights groups say no: if it *might* be infringing, that’s not reason enough to take it down. 15/ politico.eu/wp-content/upl…
Pretending that automation could solve these problems got us into this mess. EU lawmakers considering filter mandates in the Terrorist Content Regulation trilogues, please take note! 16/16

