Are there Democrats in Congress who simultaneously
(1) want platforms to act against things like electoral and Covid disinformation and
(2) support Rep. @davidcicilline's antitrust bill with Sect. 2(a)(3) intact?

I see a serious conflict there.
As I read it, that part of the Cicilline bill opens the door to Infowars, Breitbart, The Daily Stormer et al bringing must-carry claims against platforms, or demanding higher ranking.
Here's what that part of the bill prohibits. The first two are about self-dealing by platforms, which is totally appropriate for antitrust/competition law. The third one opens the floodgates to litigation about speech and content moderation -- and bad outcomes.
I haven't followed these bills closely, so if someone has an explanation that makes this problem go away, I'm all ears. But if the problem is there, it's a big and gnarly one. (Also it opens the law to much stronger 1st Am challenges by platforms.)
I feel like I should flag here, for those who don't know, that until 2015 I was an Associate General Counsel at Google.

This part of the bill is a bomb waiting to go off, all the same.

More from @daphnehk

22 Jun
OK here's my quick and dirty gloss on the Peterson v. YouTube case from the CJEU today. 1/
curia.europa.eu/juris/document…
I see this case as having three big questions:
(1) When can a platform be liable for (c) infringement under substantive (c) law?
(2) When is platform immunized from such infringement by eCommerce safe harbors?
(3) What injunctions can issue even if platform is not liable?

2/
Question (1) -- when can a platform be liable for (c) infringement under substantive (c) law -- turns on evolving case law about communication to the public.
Bottom line: There are a lot of ways to wind up liable here, so the safe harbor issue in Question (2) matters a ton.
3/
21 Jun
It’s 2027. You’re a growing US platform startup, considering international growth. UK’s Online Safety bill has become law, and so has the EU’s DSA. So you know initial compliance costs are steep in both markets, and both have future regulatory risk.
Do you launch in:
If you responded, is your answer based on knowing something about:

(Wish I could cross-reference w answers to first poll...)
OK last one. The EU DSA has serious extraterritorial reach (think GDPR) and fines up to 6% of annual turnover. UK's Online Safety law has even broader territorial reach, and 10% fines.
For the region where you aren't yet in compliance, do you:
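For a rough sense of scale (a hedged illustration, not from the thread; the €50M turnover figure is purely hypothetical), applying the fine caps mentioned above looks like this:

# Rough sense of scale only. The turnover figure is hypothetical; the 6% (EU DSA)
# and 10% (UK Online Safety) caps are the maximums cited in the thread.
annual_turnover_eur = 50_000_000  # hypothetical startup turnover
max_dsa_fine = 0.06 * annual_turnover_eur   # up to 6% under the EU DSA
max_uk_fine = 0.10 * annual_turnover_eur    # up to 10% under the UK regime
print(f"DSA exposure: up to €{max_dsa_fine:,.0f}")   # €3,000,000
print(f"UK exposure:  up to €{max_uk_fine:,.0f}")    # €5,000,000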
21 May
I finally hit on the perfect term for the interoperability issues around sharing friends' data: Other People's Privacy (OPP).
So bummed I hadn't thought of that in time for this event.
You better believe I'd be telling you about OPP™ right now if a few people hadn't beaten me to it. Surprisingly few, though.
And before you ask, it's for a different class of goods and services than that other OPP.
19 May
Oh good lord. Now they've co-opted the Post Office and turned it into a government surveillance platform.

@glakier, we must reclaim it and restore it to its former glory as an institution!
(Here's a thread on that glory, and on the pieces of a Post Office Platform article I will probably never write. And @glakier knows way more.)
Per the article, the USPS used every stupid, creepy, irresponsible surveillance tool to do pseudo-police work that should never have been their job. And they sold it to Republicans as a way to keep tabs on BLM protestors, and said something else to Democrats.
18 May
The UK Online Harms draft captures contradictions of the platform speech debate in perfect microcosm.

Platforms must take down one legally undefined kind of content ("harmful") while leaving up another ("democratically important").

Have fun with that, guys.
If we could agree on what's "harmful" and what's "democratically important," we would be in a much different place as a society.
But I'm sure Facebook can sort it out.
And if they don't, Ofcom can sort it out and fine them.
It's good to have the inherent contradictions of the last few years' debate forced to the surface like that. Dialectics move fast these days.
12 May
Welcome to the future, where the government reaches out and takes user posts down from platforms directly. No more pretense that the platform is considering their request, exercising judgment, or trying to protect users.
bbc.com/news/technolog…
Removals like these should be tombstoned with state branding. Anyone trying to access the content should see exactly which govt agency took it down.
(As @alexfeerst and I discussed long ago re compliance with the rapid takedown requirements of the Terrorist Content Regulation.)
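To make the tombstoning idea concrete, here is a minimal sketch of what a state-branded removal notice could look like, assuming a platform serves HTTP 451 ("Unavailable For Legal Reasons", RFC 7725) for state-ordered removals. The agency name, order ID, and transparency URL are hypothetical placeholders, not details from the thread.

# Minimal sketch (not Keller's proposal): serve HTTP 451 for state-ordered
# removals and name the agency behind the order instead of a generic 404.
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical record of a state-ordered removal for one path.
STATE_REMOVALS = {
    "/posts/12345": {
        "agency": "Example National Internet Referral Unit",  # placeholder
        "order_id": "TCO-2021-0042",                          # placeholder
        "notice_url": "https://transparency.example.com/removals/TCO-2021-0042",
    }
}

class TombstoneHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        removal = STATE_REMOVALS.get(self.path)
        if removal is None:
            self.send_response(404)
            self.end_headers()
            return
        # RFC 7725: 451 with a Link header pointing at a page describing the block.
        self.send_response(451, "Unavailable For Legal Reasons")
        self.send_header("Content-Type", "text/plain; charset=utf-8")
        self.send_header("Link", f'<{removal["notice_url"]}>; rel="blocked-by"')
        self.end_headers()
        body = (
            "This post was removed pursuant to a legal order.\n"
            f"Ordering agency: {removal['agency']}\n"
            f"Order reference: {removal['order_id']}\n"
            f"Details: {removal['notice_url']}\n"
        )
        self.wfile.write(body.encode("utf-8"))

if __name__ == "__main__":
    HTTPServer(("localhost", 8451), TombstoneHandler).serve_forever()

Serving 451 rather than a generic 404 is the design choice that matters here: anyone requesting the URL sees the legal basis for the removal and which agency demanded it.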
Direct state-initiated removal from a marketplace is arguably different from such removal for "pure speech" platforms.
Letting state agencies require speech suppression without prior judicial review would be a prior restraint problem in some constitutional / human rights systems.