I was going to write about how Apple's new detector for child sexual abuse material (CSAM), promoted under the umbrella of child protection and privacy, is a firm step towards pervasive surveillance and control, but many others have already written such good stuff. Some of it 👇
First, don't be fooled by the fancy crypto and the flashy privacy claims. This is a backdoor to encryption. Yes, folks, a privacy-preserving backdoor is no less of a backdoor. We are back to the crypto wars. @EFF goes into this at length in their analysis eff.org/deeplinks/2021…
It can be said louder, but not more clearly.
Why do we care about backdoors? Because they essentially break encryption: once there is a backdoor, there is no more "secure storage", "secure messaging", "secure anything". It breaks all guarantees.
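To make the "breaks all guarantees" point concrete, here is a minimal sketch in Python. All names are hypothetical and this is deliberately *not* Apple's actual protocol (which uses NeuralHash and private set intersection); it only illustrates why scanning content on the device before encryption voids the end-to-end promise: whoever controls the watchlist learns about content that encryption was supposed to protect.

```python
# Minimal conceptual sketch, NOT Apple's actual protocol.
# It shows why client-side scanning before encryption is a backdoor:
# the check runs on the plaintext, so "only the recipient can read it"
# no longer holds for anything on the (unauditable) watchlist.
import hashlib

WATCHLIST = {"<opaque hash supplied by a third party>"}  # the client cannot audit this set


def send_securely(plaintext: bytes, encrypt, report) -> bytes:
    digest = hashlib.sha256(plaintext).hexdigest()
    if digest in WATCHLIST:
        report(digest)           # content is flagged BEFORE it is ever encrypted
    return encrypt(plaintext)    # encryption only happens after the check
```

Whatever crypto wraps the message afterwards, the guarantee has already been bypassed one line earlier; and nothing technically prevents the watchlist from growing beyond its original purpose.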
Second, backdoors create an expandable infrastructure that cannot be monitored or technically limited.
I don't want to re-explain what others have already said so eloquently. For instance: mitpress.mit.edu/blog/keys-unde…
But the problem with this system is that, compared to traditional backdoors, it opens an even bigger Pandora's box. Automated scanning is far from perfect, and that can result in illegitimate accusations or enable targeting.
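A hedged illustration of why automated scanning misfires: perceptual hashes compress an image into a short fingerprint, so distinct images can share the same fingerprint. The toy "average hash" below is not NeuralHash, but it shows the generic collision problem that makes false matches, and hence illegitimate accusations, possible.

```python
# Toy perceptual hash (average hash) on 8x8 grayscale "images".
# Many distinct images map to the same 64-bit fingerprint, so a match
# does not prove the content is actually on the list.
import numpy as np


def average_hash(img: np.ndarray) -> int:
    """64-bit fingerprint: 1 where a pixel is above the image mean."""
    bits = (img > img.mean()).astype(int).flatten()
    return int("".join(map(str, bits)), 2)


rng = np.random.default_rng(0)
pattern = rng.integers(0, 2, size=(8, 8))            # shared above/below-mean layout

img_a = np.where(pattern, 255, 0).astype(float)      # high-contrast image
img_b = np.where(pattern, 140, 100).astype(float)    # different pixels, low contrast

assert not np.array_equal(img_a, img_b)
print(average_hash(img_a) == average_hash(img_b))    # True: identical fingerprints
```

Real systems use far more robust hashes, but robustness to benign transformations is exactly the property that keeps collisions, accidental or deliberately crafted, on the table.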
Matt reiterates here that privacy is not the problem; the problem is the lack of accountability, the absence of technical barriers to expansion, and the fact that there is no analysis (barely any acknowledgment) of the potential errors.
Another hidden gem in this story is Apple deciding it is OK to have artificial intelligence mediate the relationship between children and their parents. What can go wrong when an AI decides which of a kid's messages should be revealed to their parents? Some examples
The above from @KendraSerra is just a taste of how little thought has been given to how this technology may affect minorities, and of the damage it can do to social relationships.
I have not seen anything about misuse, but like any other child-monitoring technology this has great potential to become yet another tool for abusive partners. See the work by @TomRistenpart and his team:
ipvtechresearch.org/research
More problems: Apple is the first, but it won't be the last. If we accept such a solution, these controls will soon be all over the Internet, and privacy-preserving surveillance and control will become the norm, expanding in scope until we all live in 1984.
Bottom line: Apple's deployment must not happen. Everyone should raise their voice against it.
At stake are all the privacy and freedom gains of the last decades.
Let's not lose them now, in the name of privacy.

