1/ What might AG #BillBarr say at tomorrow's DOJ workshop on “#Section230—Nurturing Innovation or Fostering Unaccountability?”

Expect him to call for banning strong #encryption, as he did last July—but this time, via amending 230 and focused on "protecting the children"
2/ The whole point of the half-day workshop appears to be for Barr to make the case for a bill DOJ has no doubt helped @LindseyGrahamSC & @SenBlumenthal draft—that would empower a commission (stacked with DOJ allies) to effectively ban encryption

My take: techdirt.com/articles/20200…
3/ Last July, Barr gave a LONG speech denouncing encryption
justice.gov/opa/speech/att…
4/ Barr painted a very dark picture of the "going dark" problem
5/ Barr claimed (as DOJ did under Obama) that there MUST be a way to build a backdoor into communications services without compromising their security—and blamed tech companies for simply not nerding hard enough on the problem
6/ Technical experts have debunked this "nerd harder!" argument

And... "Damn it, Jim, I'm a lawyer, not a cryptographer!"

So let me walk you through how Barr's going to tie this into #Section230...
7/ Barr called "illegitimate" firms that "sell encryption that assures that law enforcement will not be able to gain lawful access," insisting they should settle for "encryption that provides the best protection against unauthorized intrusion by bad actors" (but with backdoors)
8/ Barr clearly wants to force companies that use strong #encryption (i.e., letting users alone hold keys so the companies can't see their communications, and thus can't turn them over to the government) to have to justify their decision to do so
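Sidebar for the technically curious: here's a minimal sketch, in Python with the PyNaCl library and purely as an illustration (not any company's actual implementation), of why "users alone hold the keys" leaves the provider with nothing to turn over:

from nacl.public import PrivateKey, Box

# Each user generates a keypair on their own device; the service never sees the private keys
alice_key = PrivateKey.generate()
bob_key = PrivateKey.generate()

# Alice encrypts to Bob using her private key and Bob's public key
ciphertext = Box(alice_key, bob_key.public_key).encrypt(b"meet at noon")

# The service only relays `ciphertext`; without a private key it cannot decrypt it,
# so there is nothing readable to hand over in response to a warrant
assert Box(bob_key, alice_key.public_key).decrypt(ciphertext) == b"meet at noon"

The design point: any "backdoor" means someone other than the two users holds a key, and that key becomes a target for exactly the "bad actors" Barr says companies should protect against.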
9/ Congress has considered extending such backdoor requirements to Internet services for years, but never done so

But Barr would get exactly what he proposed in July from the #EARNIT Act, which @LindseyGrahamSC and @SenBlumenthal are expected to release shortly
10/ Specifically, under the #EARNIT Act, a company that offered strong #encryption to its users could be charged with recklessly facilitating the distribution of child sexual abuse material (CSAM)—under precisely the "A/B" theory Barr laid out
11/ The #EARNIT Act thus threatens tech companies with stiff criminal penalties and potentially crushing civil liability—but offers them a safe harbor: they could "earn back" their #Section230 protection IF they do whatever the Commission "recommends"

i.e., abandon strong encryption
12/ It's a kind of manufacturer's liability theory: if you build a system that facilitates something bad in a way you can't see, you're on the hook for it

That's essentially what @cagoldberglaw & @annietmcadams have argued. It's not an accident that they're on panel #1
13/ So there it is. Whatever's said on the panels, the point of the event will be to (a) make DOJ's ask to Congress even more clear than it was last July and (b) to frame the entire thing around child exploitation
@ericgoldman @Riana_Crypto 15/ Of course, there's a non-zero chance that Barr might quit or be fired before tomorrow morning's speech

We'll see what happens in the morning!
@ericgoldman @Riana_Crypto 16/ Maybe Barr is just posturing again, but he's apparently been talking seriously today about resigning after Trump threatened (this morning) to sue the prosecutors in the Roger Stone case and called for Stone to have a new trial...

washingtonpost.com/politics/trump…
@ericgoldman @Riana_Crypto 17/ PS Since my @TechDirt piece, a revised version of the #EARNIT Act has been leaked. The main change is that the AG would no longer be able to rewrite the "best practices" (effectively mandates) issued by the commission, but it hardly matters, given how stacked the commission would be
18/ you can watch the livestream here, 9-12:45: justice.gov/live
19/ Bill Barr is using child sexual abuse material (CSAM) as a pretext for compromising the security of all Internet services & the #privacy of law-abiding users:
morningconsult.com/opinions/bill-…

Meanwhile, the distribution of CSAM goes unpunished—because of underfunding, NOT encryption
nytimes.com/2020/02/19/pod…
20/ the @NYtimes' @MHKeller and @GabrielDance found easy access to CSAM on *public* websites that law enforcement just didn't have the resources to go after

WHY AREN'T WE DEALING WITH *THAT* PROBLEM?

Also problematic: the law makes research on CSAM nearly impossible...
21/ You'll hear Barr claim #Section230 is the problem, but it has NEVER protected websites from prosecution for failing to cooperate with law enforcement to stop the spread of CSAM

Amending #Section230 may soothe populist rage over "Big Tech" but it's not the fix children need
22/ Instead of blaming #Section230, what would ACTUALLY help curb the flow of CSAM online?

1) Actually spending the funds Congress allocated to law enforcement in 2008
2) Not diverting CSAM funding to fight Trump's Captain-Ahab-level war on immigration

nytimes.com/interactive/20…
"no longer are tech companies underdogs. They are now today's titans."
- Barr

But #Section230 protects ALL online services, not just the big ones. Making websites liable for user speech would hurt small sites and startups while entrenching the market power of Big Tech
Terrible signal inside the FBI, so don't expect much livetweeting, but follow the #Section230 hashtag for updates

Barr just started. He immediately turns to Section 230, tying the issue to competition policy. "Not all concerns about online platforms fall within antitrust..."
What's the scope of #Section230?

Basically limitless!
- @cagoldberglaw

FALSE! Section 230 doesn't protect websites from:
- Federal criminal prosecution
- IP law
- Privacy law
- OR content that the platform is responsible for creating, even in part -- as in the Roommates case
@cagoldberglaw insists we must change #Section230 to protect people from online abuse

Crocodile tears from a plaintiff's lawyer

Here's how she responded when @EricGoldman, the leading 230 law professor, merely blogged about the Supreme Court's decision not to take her case
So @cagoldberglaw incited her followers to create fake Grindr profiles so that strangers would show up at a law professor's home to RAPE him—merely because he disagreed with her

She could be sued. And under her view, so could Twitter

So, should Twitter delete her tweet?
Making sites like Twitter liable for user speech => some combination of (a) over-censoring user content and (b) sites simply going out of business

MAYBE Twitter survives (or more likely just Facebook), but EVERY WEBSITE WITH USER COMMENTS would likely no longer host user content
As @PJCarome notes:
- if Congress hadn't intended #Section230 protections to be broader than defamation, they wouldn't have included exceptions for criminal law, IP & privacy
- The lawsuits @cagoldberglaw has filed would have failed under normal law anyway
...
...
- limiting #Section230 protections will expose websites to "death by 10,000 duckbites" (Judge Kozinski's term in the Roommates case)
- this will simply cause websites to do (even) less to moderate harmful user content
I don't know why the legal system didn't punish the guy who used fake Grindr profiles to send 1000 men to try to rape his ex-boyfriend, but #Section230 [didn't bar enforcement of federal criminal harassment law]
- @PJCarome
@pjcarome .@annietmcadams: these claims that it's impossible for tech companies to police their services... there's just no evidence for that

The REAL issue is that making websites liable for imperfect moderation of user content creates a perverse incentive not to try!

#Section230
@pjcarome @annietmcadams Again, @cagoldberglaw invokes the "1200 men coming to rape you example" and complains about platforms doing nothing

Again, her proposed revision to #Section230 could force Twitter to take down HER tweet urging her followers to do the EXACT SAME THING to
@ericgoldman

#Hypocrisy
@pjcarome @annietmcadams @cagoldberglaw @ericgoldman #Section230 isn't just for Big Tech. It protects ALL websites

Wanna lock in the incumbents? Sure, kill Section 230. Upstarts couldn't get off the ground.

The First Amendment protects a LOT of awful speech. But 230 allows responsible platforms to TRY to moderate

-@PJCarome
@pjcarome @annietmcadams @cagoldberglaw @ericgoldman .@jkosseff is right: it's not enough to complain about #Section230. We have to evaluate what the world would look like after SPECIFIC changes to 230--and "I don't know what those would look like or how we get consensus on them"
@pjcarome @annietmcadams @cagoldberglaw @ericgoldman @jkosseff Yiota Souras (of @NCMEC) reminds us all that the spread of child sexual abuse material is a monstrous problem

But:
a) #Section230 has NEVER hindered enforcement of federal law and
b) the real problem is underfunding enforcement, as the NYTimes notes today
@pjcarome @annietmcadams @cagoldberglaw @ericgoldman @jkosseff @ncmec Prof @ma_franks blames civil libertarians who argue that "non-consensual pornography" is a form of free speech but
(a) revenge porn is an entirely separate problem from CSAM, which NO ONE defends as "free speech"
(b) yes, #Section230 means we need a consistent national law on NCP
And rightly so! Whatever your feelings about any form of online abuse, like revenge porn, we can't have states trying to regulate the entire Internet. We need a single body of uniform federal law

So let's talk about that, NOT rolling back #Section230
Side note: best dressed at today's #Section230 workshop is definitely Kate @Klonick

♥️ that jacket!
@Klonick .@MA_Franks: No, prosecutors didn't need #SESTA-#FOSTA to charge Backpage executives, but the only reason other sites haven't rushed to fill the gap is SESTA-FOSTA

What utter bullshit. SESTA-FOSTA hasn't actually been used. CONVICTING Backpage executives had a HUGE deterrent effect
@Klonick @ma_franks .@MSchruers: just in our sector of the tech industry, there are 100,000+ people working to take down bad content [including child sexual abuse material] and companies have developed a range of technological tools to help

#Section230 makes all this moderation/innovation possible!
@Klonick @ma_franks @MSchruers .@MSchruers: There were only 1500 CSAM prosecutions brought last year

Yup. The problem isn't #Section230. It's that Congress has spent only HALF what it committed to spend on policing CSAM in 2008 and Trump has diverted some of that money to immigration
@Klonick @ma_franks @MSchruers Let me reiterate that @TechFreedom not only supported #FOSTA, the stand-alone bill drafted by Chairman Goodlatte, but also helped to draft it

We're quite open to updating criminal law

But we don't need to amend #Section230, as SESTA did, to do that

It's a total red herring
@Klonick @ma_franks @MSchruers @TechFreedom There's a huge difference between the top 3-4 sites (Facebook, Twitter) and sites like Backpage. They have strong incentives to remove harmful content, like CSAM & revenge porn. Users don't want that, nor do advertisers. The Techlash is obscuring this...
- @Klonick

#Section230
@Klonick @ma_franks @MSchruers @TechFreedom So let's get rid of the false dichotomy of ALL tech or ALL apps versus law enforcement
- @Klonick

Exactly right. Backpage DESERVED to be prosecuted under criminal law and was BEFORE SESTA amended #Section230

Again, 230 has NEVER hindered enforcement of federal criminal law
@Klonick @ma_franks @MSchruers @TechFreedom ALL incentives any platform might have to be a Good Samaritan are taken away by #Section230 (c)(1)
-@ma_franks

Is she even listening to her fellow panelists?!? They have STRONG economic incentives to take down harmful content AND 230 doesn't protect sites from criminal liability
@Klonick @ma_franks @MSchruers @TechFreedom More importantly, @MA_Franks is ignoring the fundamental reason for #Section230: making websites liable for user content creates a perverse incentive for them NOT to moderate or even to be aware of what content is on their sites

Nothing she's said addresses THAT problem
@Klonick @ma_franks @MSchruers @TechFreedom The Internet is fantastic for white supremacists, misogynists, child abusers. Who else is the Internet supposed to be for?
- @MA_Franks

Um, #MeToo? #BlackLivesMatter?

Hello? Have you not noticed ANY of the great things the Internet has enabled?!?

#Section230 isn't the problem
@Klonick @ma_franks @MSchruers @TechFreedom Yes, more could be done about online harms but we're in a norm-setting period. We do NOT understand the new technologies and what they're being used for, and once we do, the technology has moved on. We should be concerned about acting too quickly.
- @Klonick

#Section230
@Klonick @ma_franks @MSchruers @TechFreedom Yes, of course, there are some bad actors who claim protections they're not entitled to (cough, cough, Backpage). But #Section230 creates the right response: It allows industry to set up content policing programs without fear of increased liability
- @MSchruers
If we got rid of #Section230 (c)(1), we'd see broad over-censorship: anything people complained about would be taken down, which would hurt unpopular viewpoints and marginalized voices most. We see this clearly in foreign countries that lack 230 protections
- @MSchruers
@MSchruers Moderator: NYTimes notes a 50% increase in reports of CSAM to NCMEC

Yes, but that's largely because companies are becoming more cautious in what they report (e.g., normal baby pics), so it's impossible to tell whether the problem is really growing or we're just measuring it better
@MSchruers And... the DOJ livestream has cut out

which merely serves to illustrate the imperfection of technology -- precisely the reason we have #Section230 in the first place

Content moderation at scale is impossible to do perfectly, and making companies liable will cause them to do LESS, not more
@MSchruers .@MA_Franks: these are human judgments that have to be made. The idea that AI is ever going to save us is ridiculous

OK, so... how the heck are websites supposed to do content moderation perfectly?

#Section230 removes the disincentive to even try to moderate content
The question is NOT whether we're living in the "best of all possible worlds" but whether making websites liable for imperfect content moderation would result in more/better or less/worse content moderation

The answer is clearly the latter

THAT is why we have #Section230
We're open to rethinking #Section230 but anyone proposing amendments bears the burden of explaining why their proposal would not result in LESS/WORSE content moderation

@MA_Franks just isn't doing that

Read the principles on 230 we drafted last summer
digitalcommons.law.scu.edu/cgi/viewconten…
@ma_franks Everything that you see on the Internet that makes you mad is everything that makes you mad about humanity
- @Klonick quoting @jackbalkin

And (re FB livestream being used for shootings): FB didn't kill anyone. They merely made it transparent.

#Section230
@ma_franks @Klonick @jackbalkin The problems created by the Internet are caused by people and will come from people
- @MSchruers

Right. #Section230 isn't the problem
@ma_franks @Klonick @jackbalkin @MSchruers Nebraska AG: we just want to amend #Section230 so states can enforce their criminal laws

That's insane. States criminalize all kinds of things, from defamation to panhandling

We need a consistent body of federal criminal laws to govern the Internet. Let the states enforce THAT
@ma_franks @Klonick @jackbalkin @MSchruers Unlawful and problematic content of the kind we've been talking about today aren't the only threats we need to protect against; there's also fraud, foreign actors, etc. For that, we need strong encryption to protect us all.
-@MSchruers

So don't let #Section230 be used to ban encryption!
@ma_franks @Klonick @jackbalkin @MSchruers .@NCMEC's Yiota Souras: we received 17 million CSAM reports last year. If FB messenger goes to encryption, we'd lose 72% of those reports

Yet law enforcement lacks the resources to go after PUBLIC CSAM on Bing, as NYTimes reports today nytimes.com/2020/02/19/pod…

#Section230
@ma_franks @Klonick @jackbalkin @MSchruers @ncmec #Section230 protects libraries, schools, telecoms, big tech, small tech, startups... everyone!

Also, it's crazy to claim that 230 didn't foresee encryption, an issue Congress had already hotly debated for years by 1996

@MSchruers
@ma_franks @Klonick @jackbalkin @MSchruers @ncmec #Section230 doesn't prevent law enforcement action by the states—only when a website is acting as an "interactive computer service." States CAN punish services that help create content

But federalism demands that we not let one state/locality write laws for the entire US
- @MSchruers
The alternative to the AG's demands is obvious:
1) fund federal enforcement
2) rather than allowing states to enforce a crazy-quilt of varying criminal laws, deputize them to enforce a consistent body of FEDERAL criminal law--a very surgical amendment to #Section230
Indeed, deputizing state and local law enforcement doesn't actually require any amendment to #Section230 whatsoever

Federal criminal law ALREADY authorizes such deputization, as we pointed out in the #SESTA debate. We were completely ignored
law.cornell.edu/uscode/text/28…
If state AGs feel federal laws aren't being enforced adequately against CSAM (they certainly aren't!) or any other problem, all they have to do is ask DOJ to designate them as special attorneys under 28 U.S.C. § 543 so they can enforce federal law

#Section230 won't stop them!
Remember, @NewsCEO is on today's #Section230 panel for one reason only: to push Congress to exempt traditional media firms from the #antitrust laws so they can try to cartelize the media market

It's pure corporate-on-corporate warfare. It has nothing to do with victims or abuse
@NewsCEO Many proposals to amend #Section230 aren't REALLY about 230 at all, they just use the law as a bargaining chip to get tech companies to do something that politicians want, like being "politically neutral" (however 2 FTC Commissioners interpret that)
- @neil_chilson
@NewsCEO @neil_chilson #Section230 effectively hurts traditional media companies that DO try to police online content
- @NewsCEO

Bullshit. Take away Section 230 and every newspaper and broadcaster would shut down user comments on their websites because they can't possibly police comments at scale
@NewsCEO @neil_chilson #Section230 wasn't a special favor for the nascent tech industry. It was a recognition that content moderation at the scale of the Internet is inherently imperfect and, unlike newspapers screening letters to the editor, holding websites liable would DISCOURAGE content moderation
@NewsCEO @neil_chilson I'm not super impressed by the argument that tech companies couldn't possibly police all the content on their sites. That's THEIR problem, not my problem.
- @NewsCEO

But without #Section230, it would be IMPOSSIBLE for today's user-centered Internet to exist. This is key...
The "moderator's dilemma" is real: if companies become liable for trying to moderate, and failing to do so perfectly, they'll stick their heads in the sand and avoid content moderation at all

THAT is why we have #Section230

It's NOT a special subsidy. It's essential to the web
.@JuliePSamuels: the Digital Revolution allowed a shift from one-to-many communication to many-to-many

THAT just wouldn't have happened without #Section230
@juliepsamuels Last summer, @EricGoldman and I led the drafting of 7 principles for how to think about #Section230, which 28 groups & 53 experts signed

I suppose my calling #BillBarr a mob lawyer and calling for his impeachment might explain my not being on a panel 🤔

digitalcommons.law.scu.edu/cgi/viewconten…
@juliepsamuels @ericgoldman #Section230's benefits haven't fundamentally changed since 1996. The law is still necessary to avoid the Moderator's Dilemma. And it still keeps markets open, allowing new startups to enter the market.

We can't eliminate ALL harms, nor do we expect to offline...
- @EricGoldman
@juliepsamuels @ericgoldman #Section230 holds individuals responsible for their actions online, not the tools they use. That's the normal way we do things in the US. That's how tort law generally works, too. We hold newspapers, not newsstands, liable for their content... That's only fair
- @neil_chilson
@juliepsamuels @ericgoldman @neil_chilson Google is liable for the content they produce today, just as newspapers always have been. #Section230 DOES. NOT. CHANGE. THAT.
- @neil_chilson
@juliepsamuels @ericgoldman @neil_chilson I worry about #Section230 carveouts based on size. It's tough to draw a line in the sand. And just because you're small doesn't mean you're good
- @JuliePSamuels

Moreover, a size cap presents a huge problem for startups trying to grow, and would create a moat around Big Tech
@juliepsamuels @ericgoldman @neil_chilson We're the only business mentioned in the First Amendment
- @NewsCEO

Yes, the "press" is mentioned but the Supreme Court has regularly extended full First Amendment protection to the Internet, not just media that the old industry's trade association CEO thinks "matter more"
@juliepsamuels @ericgoldman @neil_chilson @NewsCEO .@NewsCEO IS a lawyer. He should know better. Yet he keeps misrepresenting the law on both #Section230 and the First Amendment
@juliepsamuels @ericgoldman @neil_chilson @NewsCEO We've heard less about #encryption at today's #Section230 workshop than I expected but #BillBarr made the connection clear, all but calling for an amendment to deny 230 protection to sites that use strong encryption

ICYMI: justice.gov/opa/speech/att…
Barr's clearly pushing Congress to enact the #EARNIT Act

I explained the #Section230-#encryption connection in an op-ed this morning: morningconsult.com/opinions/bill-…

And in a @TechDirt essay last week: techdirt.com/articles/20200…
@techdirt The experience of being on the Internet without #Section230 would be fundamentally different. Without the content moderation that 230 makes possible (by removing the disincentive to moderate), the experience would be total chaos. Social media would be unusable.
- @JuliePSamuels
@techdirt @juliepsamuels We need to talk not just about the "responsibilities of platforms" but about the experience of users. How would they be affected by a loss of #Section230?
- @JuliePSamuels
@techdirt @juliepsamuels The source of many of the problems we've discussed today is how we interact with other humans. #Section230 isn't the problem; it's actually the solution because it enables innovation in ways for us to converse with each other that DON'T look like traditional media
- @EricGoldman
Thank you for your support!