The EU’s “chat control” legislation is the most alarming proposal I’ve ever read. Taken in context, it is essentially a design for the most powerful text and image-based mass surveillance system the free world has ever seen.
This legislation, initially targeted at child abuse, creates the infrastructure for building mandatory automated scanning tools into services: tools that will search for *known* media, for *unknown* media matching certain descriptions, and through textual conversations.
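In practice, detecting *known* media means computing a perceptual fingerprint of each image and comparing it against a database of fingerprints of previously identified material; this is how PhotoDNA-style systems work. Below is a minimal sketch of that idea, assuming the open-source Python imagehash package; the hash list, threshold, and file path are my own illustrative placeholders, not anything specified in the proposal.

```python
# Toy sketch of "known media" scanning via perceptual-hash matching.
# The hash database, threshold, and file path are invented for illustration.
from PIL import Image
import imagehash

# Hypothetical database of perceptual hashes of previously identified images.
KNOWN_HASHES = [imagehash.hex_to_hash("d1c4f0e2b3a59687")]

MATCH_THRESHOLD = 8  # max Hamming distance counted as a "match" (arbitrary choice)

def scan_image(path: str) -> bool:
    """Return True if the image is close enough to any known fingerprint."""
    h = imagehash.phash(Image.open(path))
    return any(h - known <= MATCH_THRESHOLD for known in KNOWN_HASHES)

print(scan_image("photo.jpg"))
```

The scanning step itself is mechanically trivial; the hard questions are what goes into the fingerprint database and where the scan has to run, which for end-to-end encrypted services effectively means on your device, before encryption.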
The legislation is vague about how this will be accomplished, but the “impact assessment” it cites is not. The assessment makes clear that mandatory scanning of images & text, especially in encrypted data, is the only solution the Commission will consider.
eur-lex.europa.eu/legal-content/…
If you’re wondering what detecting “grooming behavior” means, here is a brief description. Roughly, it means developing new AI tools that can understand the content of textual conversations and automatically report you to the police based on them.
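Mechanically, a grooming-detection tool boils down to a text classifier run over your private conversations, with reporting logic bolted on top. Here is a deliberately toy sketch in Python with scikit-learn; the training examples, labels, and reporting threshold are all invented for illustration and have nothing to do with any real system.

```python
# Toy stand-in for a "grooming detection" model: a text classifier plus a
# reporting threshold. All data and numbers here are invented.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

train_texts = [
    "hi, how was school today",
    "send me a photo and don't tell your parents",
    "see you at practice tomorrow",
    "this is our secret, ok?",
]
train_labels = [0, 1, 0, 1]  # 0 = benign, 1 = "suspicious" (invented labels)

model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(train_texts, train_labels)

REPORT_THRESHOLD = 0.9  # arbitrary: above this score, the message gets flagged

def suspicion_score(message: str) -> float:
    """Probability the model assigns to the 'suspicious' class."""
    return model.predict_proba([message])[0][1]

msg = "don't tell your parents about the surprise party"
score = suspicion_score(msg)
print(f"{score:.2f}", "-> flag for report" if score > REPORT_THRESHOLD else "-> ignore")
```

The example message is deliberately ambiguous; that ambiguity, evaluated over everyone’s private chats, is the core of the problem.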
You might ask how the EU, famous for its focus on privacy, justifies the development of automated text-analysis tools that scan your private chats. The Impact Assessment has an analysis. To say that this analysis is deficient is really much too kind.
As a technologist, I have to point out that the technological solutions to do this *safely* don’t exist; they are at best at the research stage. ML text-analysis schemes do exist, and they often misfire. These systems would need to accomplish this task both perfectly and privately.
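To see why “often misfire” matters at this scale, do the base-rate arithmetic. Every number below is an assumption chosen for illustration, not a figure from the proposal or the impact assessment.

```python
# Illustrative base-rate arithmetic; all numbers are assumptions.
messages_per_day = 10_000_000_000  # assume ~10 billion private EU messages scanned daily
prevalence = 1e-6                  # assume 1 in a million messages is actually abusive
false_positive_rate = 0.001        # assume an optimistic 0.1% false-positive rate

true_positives = messages_per_day * prevalence
false_positives = messages_per_day * (1 - prevalence) * false_positive_rate

print(f"true positives/day:  {true_positives:,.0f}")   # ~10,000
print(f"false positives/day: {false_positives:,.0f}")  # ~10,000,000
print(f"flags that are wrong: {false_positives / (false_positives + true_positives):.1%}")
```

Even with a classifier far more accurate than anything that exists today, roughly 99.9% of the flags forwarded for review under these assumptions would be false alarms about innocent conversations.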
The idea that we can deploy AI systems to read your private conversations and report crimes is frankly dystopian. Even if such systems existed, no reasonable democracy would vote for this. But this is what the EU is proposing to mandate and *build* in the next couple of years.
If you take comfort from the fact that these systems are aimed at “awful crimes” or “will be fully transparent”, please don’t. The nature of these proposals is that they will be easy to reprogram, either by law or by technical accident.
For those who want to read for themselves, here is the very dry text: eur-lex.europa.eu/legal-content/…

And here is the impact assessment, which is much more descriptive: eur-lex.europa.eu/legal-content/…

More from @matthew_d_green

Feb 21
One of the things I’m trying to explain to my blockchains class this week is that “algorithmic stablecoins” and “backed stablecoins” are fundamentally different, and the fact that they both have “stable” in their name is confusing.
One type of system creates a direct bridge to the traditional banking system. If you have that bridge (and it works, that is: allows someone to deposit and withdraw) you don’t need any other infrastructure, like centralized exchanges etc.
The other doesn’t create a bridge to anywhere. It’s trying to extract a feature (stability) from the fact that other bridges to traditional banking exist elsewhere. The result may look the same, and obviously the tech is different, but the “system” and its implications are very different.
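One way to see the difference is to strip each design down to its state. Here is a heavily simplified sketch; the class names, numbers, and mechanisms are illustrative only, loosely modeled on reserve-backed and seigniorage-style designs respectively.

```python
# Heavily simplified contrast between a reserve-backed and an algorithmic stablecoin.
# Mechanisms and numbers are illustrative; real systems are far more complex.

class BackedStablecoin:
    """The 'bridge' design: every token corresponds to a dollar held at a bank."""

    def __init__(self):
        self.bank_reserve_usd = 0.0
        self.supply = 0.0

    def deposit(self, usd: float) -> float:
        """Wire dollars in, mint an equal number of tokens."""
        self.bank_reserve_usd += usd
        self.supply += usd
        return usd

    def redeem(self, tokens: float) -> float:
        """Burn tokens, wire an equal number of dollars out."""
        assert tokens <= self.supply and tokens <= self.bank_reserve_usd
        self.supply -= tokens
        self.bank_reserve_usd -= tokens
        return tokens


class AlgorithmicStablecoin:
    """No bridge: the protocol mints/burns against a volatile 'share' token at an
    oracle price, relying on arbitrageurs who have banking access elsewhere."""

    def __init__(self, share_supply: float):
        self.stable_supply = 0.0
        self.share_supply = share_supply

    def mint_stable(self, shares_burned: float, share_price_usd: float) -> float:
        """Burn $1 worth of shares (at the oracle price) per stablecoin minted."""
        assert shares_burned <= self.share_supply
        self.share_supply -= shares_burned
        minted = shares_burned * share_price_usd
        self.stable_supply += minted
        return minted
```

The first class has a bank_reserve_usd field and the second doesn’t; that one missing field is the whole difference being pointed at here.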
Feb 18
Per @RachelTobac: 75% of Twitter 2FA users are using SMS-based authentication. In theory those users could switch to authenticator apps (or pay 😂) but they probably won’t.
People keep saying things like “but authenticator apps will still be free and those won’t require you to pay, plus they’re more secure.” That’s true! But also completely misunderstands what’s about to happen.
What sets SMS 2FA apart is that it’s almost “free” from a user-effort perspective. If you own a phone, the feature is already built-in and enabled. Setup is nearly effortless. Backup is taken care of. Unfortunately none of the same things are true for HOTP/authenticator apps.
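For contrast, here is roughly what an authenticator app computes under the hood: standard RFC 6238 TOTP over a shared secret, sketched with nothing but the Python standard library. The base32 secret below is a made-up example value.

```python
# Minimal TOTP (RFC 6238) sketch: what an authenticator app computes.
# The base32 secret is an example value, not a real credential.
import base64, hashlib, hmac, struct, time

def totp(secret_b32: str, interval: int = 30, digits: int = 6) -> str:
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int(time.time()) // interval          # time-based counter
    msg = struct.pack(">Q", counter)
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                      # dynamic truncation
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % (10 ** digits)).zfill(digits)

print(totp("JBSWY3DPEHPK3PXP"))  # same 6-digit code your phone app would show
```

The cryptography is trivial; the user-facing cost is everything around it: provisioning that shared secret per site, backing it up, and migrating it when you change phones. That friction is exactly what SMS hides from the user.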
Dec 30, 2022
I got into a discussion with @OrinKerr the other day about the recent NYT op-ed on Signal and whether “metadata-resistant communications” is good or bad for policing and if it should be opposed. And I guess I wanted to talk about how those arguments should be approached. 1/
It seems to me there are four ways to argue about this:

1. Based on preferences of the debating parties.
2. Based on (empirical) historical analysis of police capabilities and expectations of privacy.
3. Based on verifiable police need.
4. Based on democratic preferences. 2/
It seems like most public policy debate has focused on (1). I don’t think this is worth very much.

I’m interested in (2) and think we need more data, but I’m pretty confident it will tell a story of (exponentially) increasing police capability and individual loss of privacy. 3/
Dec 17, 2022
This letter is pretty amazing. It’s from a member of the Senate Armed Services Committee explaining how they’re going to build the infrastructure to monitor most Internet users, network-wide, using private DNS metadata.
For people who don’t know what DNS (Domain Name System) is: it’s basically the telephone directory for the Internet. Anytime one computer talks to another, it asks a DNS resolver to look up the other computer’s Internet address. These lookup requests are “metadata”…
… such that if you can buy the records of these lookups, you have effectively a “god’s eye view” over the entire Internet. Naturally the private companies who collect this data are happy to sell it off like it’s frozen orange juice.
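To make the metadata concrete: every lookup tells the resolver operator which client asked about which name, and when. A minimal illustration using only Python’s standard library; the domain and the log format are just examples.

```python
# What one DNS lookup looks like from the client, and what the resolver learns.
# Domain name and log format are illustrative.
import socket
import time

domain = "example.org"
ip = socket.gethostbyname(domain)  # triggers a DNS query through your configured resolver
print(f"{domain} resolves to {ip}")

# The resolver operator (or anyone buying its logs) effectively holds a record like:
#   timestamp              client IP          queried name
print(f"{time.strftime('%Y-%m-%dT%H:%M:%S')}   <your IP address>   {domain}")
```

Multiply that one record by every connection every one of your devices makes, and you get the god’s-eye view described above.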
Dec 7, 2022
Why Apple’s announcements today are a big deal, a thread. 1/
First: Apple has spent years building the infrastructure for end-to-end encrypted backup for iCloud. This means backup where only you, and not Apple, hackers, or the government, can access your own data. 2/
However: despite deploying the infrastructure to do this as far back as 2016, Apple limited the set of end-to-end encrypted data to things like passwords and your web history. Your text messages, photos, notes etc. were all accessible to someone who could get into iCloud. 3/
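Conceptually, end-to-end encrypted backup means the client encrypts your data under a key derived on your device before anything is uploaded, so the server only ever stores ciphertext. Here is a rough sketch of that idea using the Python cryptography package; this is not Apple’s actual design, which uses device-bound keys and a much more elaborate key hierarchy.

```python
# Sketch of client-side (end-to-end) backup encryption. NOT Apple's real scheme;
# it only illustrates "the server never holds the key".
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def derive_key(passcode: str, salt: bytes) -> bytes:
    """Derive the backup key on-device from a user secret the server never sees."""
    kdf = PBKDF2HMAC(algorithm=hashes.SHA256(), length=32, salt=salt, iterations=600_000)
    return kdf.derive(passcode.encode())

def encrypt_backup(passcode: str, backup_blob: bytes) -> dict:
    salt, nonce = os.urandom(16), os.urandom(12)
    key = derive_key(passcode, salt)
    ciphertext = AESGCM(key).encrypt(nonce, backup_blob, None)
    # Only these values go to the server; without the passcode they are useless.
    return {"salt": salt, "nonce": nonce, "ciphertext": ciphertext}

uploaded = encrypt_backup("user-passcode", b"messages, photos, notes ...")
print(len(uploaded["ciphertext"]), "bytes of ciphertext are uploaded")
```

The flip side is the obvious one: if the user loses that secret, nobody, including the provider, can recover the data.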
Dec 7, 2022
Looks like Apple is rolling out opt-in end-to-end encryption for iCloud backups. apple.com/newsroom/2022/…
This will cover every kind of data except for iCloud Mail, Contacts, Calendars: features that require server access.
I spoke with Apple earlier this morning about this proposal, and I was pretty impressed by what they’ve done. Unfortunately I’m about to have a dentist look in my mouth so it will have to wait.
