Assorted observations on the #DSA. These track my own interests, so if you want an overview, look elsewhere. (If I see one, I’ll add it to the thread though.) 1/
I’d like to see a chart showing which obligations apply to which platforms, particularly at the smaller end of the range. It looks to me like some very burdensome obligations are going to fall on some very small entities. 2/
I know there are some Commission slides showing the obligations at the giant (VLOP!) end of the range. But that small end needs some serious attention IMO. 3/
One of the toughest Qs: How can the law encourage platforms to voluntarily moderate, when plaintiffs will want to use those very moderation efforts against them in litigation to say they have too much knowledge/control to be immunized under Art 5 (fka Art 14)? 4/
I’m not sure this problem is actually solvable. I wrote about it in detail, with a European focus, here: cyberlaw.stanford.edu/blog/2020/05/s… 5/
Part of the answer is that the “due diligence” obligations in the DSA are mostly things like transparency and process improvements to notice and takedown – not things that should affect immunity under The Immunity Formerly Known as Article 14. Good choice there, drafters! 6/
But when it comes right down to it, IMO the DSA chooses to help plaintiffs in intermediary liability cases, at the cost of discouraging content moderation. It does that with one key word in Art 6, the provision protecting voluntary own-initiative measures: “solely” 7/
So plaintiffs can still use platforms' voluntary efforts to say "see, they knew/should have known!" or "this specific monitoring injunction is OK and not burdensome, because they already built filters for this other purpose," etc., as long as those are not the SOLE argument. 8/
Recital 25 tries to walk this back, in a way that to me seems inconsistent with Art 6. But... trying to have it both ways and leaving things confusing might be the best-case scenario on this issue. 8/
Moving on from that issue...
Oooh, Art 20! Loss of privileges for repeat offenses BOTH by users posting illegal content AND by notifiers sending abusive/false notices!
(Weird that the latter is "shall," but I'll take it!) 8/
Oooof, Art 21. Mandatory reporting of users to law enforcement for suspicion of certain crimes, including handing over "all relevant information." Calling surveillance people (including ECPA people) @Riana_Crypto @granick @agidari ... 9/
Ooof, Art 19. Does someone hate @article19org (or just international human rights free expression and information law generally) so much that they had to stick it to them by making that be the DSA Article about *trusted flaggers* of all things? 10/
Ooof, crisis protocols. Letting govts use the global power of platforms for their own ends – just in emergencies, of course. The legitimate uses are real... And yet this is such a Chekhov’s gun for some dystopian plot that is yet to unfold. 11/
Transparency fans, note that transparency obligations are spread across several different Articles. So if you don't like the first one, keep reading. 12/
Researcher access folks, check out Art 31, which spends 7 paragraphs describing a new access regime while kicking all GDPR questions right down the road. @mathver @PJLeerssen @persily 13/
Art 22 sets out fairly crisp and concise Know Your Customer rules for marketplaces. US folks interested in 230 and Amazon/AirBnB/etc., this would be interesting to compare. 14/
And... let's see, there is a system of national regulators + umbrella EU level coordination, with primary jurisdiction in the country of establishment. Making, oh yes, the IRISH regulator (presumably the media reg) once again the least appreciated new power on the scene. 15/
I think the only thing in here about ranking, recommendation, and amplification is in the rules for Very Large Online Platforms. Did I miss anything else on that? 16/
Gotta run, that's it for now, looking forward to seeing other hot takes. 17/17
OK, picking this back up. It is weird that there is not more in the #DSA about recommendations and ranking. Unless the longstanding ECD language "disabling access," re-used here, can now be read to cover things like demoting content... 18/
Also, @JoanBarata convinced me that @ellanso is right about Art. 14. As written, it seems to strip immunity as soon as platforms receive notices that meet *formal* requirements, even if the notices make legally dubious allegations. I hope that was not the Commission's intent! 19/
People unfamiliar with real-world notice and takedown often assume platforms should honor all notices, and users will counter-notice to fix bad takedowns. That is 100% not how things really work. The Commission knows that. So I will optimistically assume this was not intended. 20/
OK, also: this out-of-court dispute system at Art 18 is wild! I can hear the sound of incorporation paperwork being put together already, by everyone who wants to be the new arbiter of content takedowns! 21/
Apparently platforms pay the fees if users win, and platforms always pay their own fees. It's like a perpetual motion machine for moving platform money to dispute resolution providers. Which would be OK if they dispensed something like due process and fair outcomes. But... 22/
I'm suspicious.
For one thing, it appears that users can go to court if they don't like this private dispute resolution, but platforms can't? Is that... consistent with the EU Charter? 23/
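To make that fee mechanic from a couple of tweets up concrete, here's a minimal sketch of the Art 18 cost allocation as I read the draft. The function and field names are mine, not the DSA's:

```python
def allocate_fees(user_won: bool, user_fee: float, platform_fee: float) -> dict:
    """Rough model of the draft Art 18 fee allocation. Illustrative only."""
    # The platform bears its own fees regardless of outcome.
    platform_pays = platform_fee
    if user_won:
        # If the user prevails, the platform reimburses the user's fees too.
        platform_pays += user_fee
        user_pays = 0.0
    else:
        # If the user loses, each side still bears its own fees.
        user_pays = user_fee
    return {"platform_pays": platform_pays, "user_pays": user_pays}

# Either way, both fees end up with the dispute resolution body:
print(allocate_fees(True, 50.0, 50.0))   # {'platform_pays': 100.0, 'user_pays': 0.0}
print(allocate_fees(False, 50.0, 50.0))  # {'platform_pays': 50.0, 'user_pays': 50.0}
```

Run it both ways and you see the perpetual motion machine: the dispute body collects the full fee pot no matter who wins.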
There are like 12 more tweets in this thread, here:
I seriously don't understand Twitter threading. (It's embarrassing.) When I look at #9 in this thread, it appears to come right after #8. But when I look at the thread starting with #1, it ends at #8.
Also, just to flag some things that are REALLY BIG DEALS but that we're all jaded about at this point:
- Extraterritorial jurisdiction over non-EU services. Confusing rules, but it seems to boil down to: mere accessibility ≠ enough, but a "significant number of users" = enough. 24/
- Fines of up to 6% of annual turnover (quick arithmetic after this list)! Because we have to show the GDPR what regulation is REALLY important, and which enforcers REALLY have the big swinging sticks. 25/
- Local representatives in the EU who face personal liability for non-compliance, and also have the "necessary powers and resources" to do what EU authorities require. Since that requirement is TBD, it seems the local rep has to have quite sweeping power to alter a service... 26/
- And, just like in the TERREG, those local authorities with the sweeping TBD powers don't have to be courts.
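That promised back-of-envelope on the 6% cap, with a hypothetical turnover figure just to show the scale against the GDPR's 4%:

```python
def max_fine(annual_turnover_eur: float, cap: float = 0.06) -> float:
    # The draft DSA caps fines at 6% of annual turnover; the GDPR caps at 4%.
    return cap * annual_turnover_eur

turnover = 50e9  # hypothetical platform with EUR 50bn annual turnover
print(max_fine(turnover))            # 3000000000.0 -> up to EUR 3bn under the DSA
print(max_fine(turnover, cap=0.04))  # 2000000000.0 -> vs. EUR 2bn under the GDPR
```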

Keller out for now, 27/27
