After quite a few months working on the #CSAM proposal, and as the EC's consultation is due to finish today at midnight, here are my (very personal) thoughts on a draft text that comes from a good place, but could have disastrous consequences: a thread 👇
First and foremost: we are in dire need of action against the systemic sexual abuse of children (CSA). The refit of the 2011 directive is more than urgent, and cooperation between intermediaries, NGOs and Law Enforcement Authorities (LEAs) is needed. No question about that.
(1/18)
However, what I fail to comprehend is how a text supposed to "fight and prevent" CSA will achieve either without any ambitious measures when it comes to
a) scientific research on the topic
b) real, IRL prevention policies
(2/18)
c) an actual system to empower victims to speak up
d) actual consequences for the perpetrators
e) ... the list goes on.
Instead, what we are offered is a total shift in the way due process has always been understood to work in an online setting
(3/18)
For instance, this text shifts from targeted surveillance (i.e. I know that Z is most probably a bad guy, so I'll monitor them for a bit) to generalized surveillance (i.e. let's just scan everything, everywhere in Europe, all at once, and hope to catch the baddies)
(4/18)
This so-called "needle in the haystack" approach is not only contrary to standing ECJ decisions, but also inefficient: since no technology is 100% accurate (or even 99.9%), the number of false positives that LEAs will have to deal with will be staggering, ...
(5/18)
... and will prevent them from focusing on actual leads. And this, of course, with no budget increase planned for them so far.
From a personal perspective, it also means that everybody becomes a suspect, meaning that...
(6/18)
... if you send a nude, as an adult, there's a risk that it'll be flagged as CSAM. If you send a picture of your naked baby to your pediatrician to investigate a suspicious mole, it can be flagged as CSAM. If the system wrongly IDs your new fling as a kid...
(7/18)
... the conversation can be flagged as grooming. And in the end, all of your private messages are reviewed by someone, even when they are perfectly legitimate.
(8/18)
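To make the scale of that false-positive problem concrete, here is a back-of-the-envelope sketch in Python. Every number below is hypothetical, chosen purely for illustration; the point is the base-rate effect, not the exact figures.

```python
# Base-rate sketch (illustrative numbers only): even a highly accurate
# scanner drowns investigators in false positives when the overwhelming
# majority of scanned messages are perfectly innocent.

def flagged_messages(total, prevalence, tpr, fpr):
    """Return (true_positives, false_positives) for a hypothetical scanner."""
    illegal = total * prevalence      # messages that really are abusive
    innocent = total - illegal        # everything else
    return illegal * tpr, innocent * fpr

# Assume 10 billion messages/day scanned EU-wide, 1 in 1,000,000 illegal,
# and a scanner with a 99.9% detection rate and a 0.1% false-positive rate.
tp, fp = flagged_messages(10_000_000_000, 1e-6, 0.999, 0.001)

print(f"true positives:  {tp:,.0f}")   # roughly 9,990
print(f"false positives: {fp:,.0f}")   # roughly 10,000,000
```

Under these (made-up but not implausible) assumptions, about a thousand innocent reports land on reviewers' desks for every genuine one, which is exactly the "needle in the haystack" burden described above.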
All of this also means breaking the encryption of your messaging apps, and on that point you can read what @ellajakubowska1 has to say, as she puts it far better than I ever could.
(9/18)
So now you see why some are calling this proposal #Chatcontrol. But honestly, I find that expression erroneous, as the text goes well beyond that.
(10/18)
See, this is not just a proposal for the generalised scanning of all your conversations. It is also a proposal that calls for generalised scanning of the European internet as a whole.
(11/18)
As the text calls for the implementation of URL blocking (as opposed to the more traditional DNS blocking, though still pretty useless at preventing perpetrators from accessing the content, imho), it effectively requires access providers to probe the whole network.
(12/18)
Because, whereas with DNS blocking an ISP only acts at the level of resolving a query, URL blocking means opening every. packet. transiting. through. the. networks.
That's a lot of packets.
(13/18)
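A small sketch of the difference between the two approaches, for readers who want it spelled out. The hostnames, block lists and function names here are all invented for the example; the key point is which part of a URL each approach needs to see.

```python
# Illustrative sketch, not a real filter.
# DNS blocking: the resolver sees only the hostname being looked up.
# URL blocking: the full path lives inside the (TLS-encrypted) HTTP
# request, so an ISP cannot apply it without looking inside the packets.

from urllib.parse import urlparse

BLOCKED_HOSTS = {"bad.example"}                     # hypothetical DNS-level list
BLOCKED_URLS = {"https://mixed.example/bad/page"}   # hypothetical URL-level list

def dns_block(url: str) -> bool:
    """Resolver-level decision: only the hostname is available."""
    return urlparse(url).hostname in BLOCKED_HOSTS

def url_block(url: str) -> bool:
    """Needs the full URL, which HTTPS hides from the ISP in transit."""
    return url in BLOCKED_URLS

# DNS blocking is coarse: the whole host goes, legitimate pages included.
print(dns_block("https://bad.example/legit/page"))    # True (over-blocking)
# URL blocking is precise, but only if you can read inside the traffic.
print(url_block("https://mixed.example/bad/page"))    # True
print(url_block("https://mixed.example/good/page"))   # False
```

That is the crux: the precision of URL blocking is only achievable by inspecting the application-layer payload of every connection, which is what the next tweet is about.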
And since, of course, everything is encrypted (for instance to allow safe banking through the HTTPS protocol), your ISP will have no choice but to break through these security layers in order to comply with the law - or face liability.
(14/18)
I'm not even going into the hidden gems of this text, such as the one in Article 16.4, which basically calls for retaining content data for a period of 12 months. You get the idea by now.
(15/18)
So, bottom line, what are you left with?
Well, an internet that is less secure than in the 90s, with certain repercussions for your privacy, and most likely for the safety of the digital economy, because who wants to buy something on Etsy without secure banking?
(16/18)
You're also left with the possibility of having some of your content ID'd as CSAM, even though it's not.
And how does it help victims? Well, I'm really not sure that it does: LEAs will be overburdened, victims themselves could be treated as producers of CSAM, and so on.
(17/18)
On that note, you still have more than a few hours to tell the Commission how you feel at this link: ec.europa.eu/info/law/bette…
- because I genuinely believe that we all (victims, parents, internet users) deserve something better for the protection of our children.
(18/18)
Thread by AlexSedLex 🇺🇦