It’s gradually dawning on me how badly Apple screwed up with this content scanning announcement.
If Apple had announced that they were scanning text messages sent through their systems, or photo libraries shared with outside users — well, I wouldn’t have been happy with that. But I think the public would have accepted it.
But they didn’t do that. They announced that they’re going to do real-time scanning of individuals’ *private photo libraries* on their own phones.

That’s… something different. And new. And uncomfortable.
I think the error here is that engineers and law enforcement got lazy. I understand their thought process and where it went wrong. It looks like this…
1) We can scan photos that users share with other people, since nobody wants child exploitation media shared.

2) Backups of private data *aren’t* sharing, but people have to upload the data and we *can* scan it.

3) Since we can do (2), why not also scan data on users’ devices?
The problem is that engineers got so busy boiling this particular frog that they failed to realize something. The general public might have bought into step (1). But nobody bought into step (2). It was just a wonky technical accident that backups could even be scanned.
So when you tell the general public that you’re going to start rifling through their personal photos *on their own device*, they don’t like it. It is, to be blunt and US-centric, un-American.
Maybe Apple will reverse course and maybe they won’t. But I don’t think they have come to terms with the deep well of antipathy and mistrust this move is going to create. It’s going to be with them for years, undoing billions of dollars’ worth of customer trust and marketing.

• • •


More from @matthew_d_green

7 Aug
Someone pointed out that Apple’s Intel Macs probably can’t run their client-side scanning software because they don’t possess a neural engine coprocessor. Real-time scanning on Macs is going to require an upgrade to newer M1 hardware (or beyond).
It’s sure a weird thing to pay a ton of money for Apple’s latest hardware, and the first thing they do with it is scan your personal files.
Some other folks have asked whether corporate and enterprise-managed devices will be subject to scanning. What I’ve heard is that enterprise customers are *very* surprised and upset. Apple hasn’t announced if there will be an MDM setting to disable it.
5 Aug
Yesterday we were gradually headed towards a future where less and less of our information had to be under the control and review of anyone but ourselves. For the first time since the 1990s we were taking our privacy back. Today we’re on a different path.
I know the people who did this have good intentions. They think this was inevitable, that we can control it. That it’ll be used only for good, and if it isn’t used for good then that would have happened anyway.
I was alive in the 1990s. I remember we had things like computers that weren’t connected to the Internet, and photo albums that weren’t subject to continuous real-time scanning. Society seemed… stable?
5 Aug
Reading through the analysis. This is not… a security review.
“If we assume there is no adversarial behavior in the security system, then the system will almost never malfunction. Since confidentiality is only broken when this system malfunctions, the system is secure.”
Don’t worry though. There is absolutely no way you can learn which photos the system is scanning for. Why is this good? Doesn’t this mean the system can literally scan for anything with no accountability? Not addressed.
5 Aug
A small update from last night. I described Apple’s matching procedure as a perceptual hash function. Actually it’s a “neural matching function”. I don’t know if that means it will also find *new* content on your device or just known content.
Also, it will use a 2-party process where your phone interacts with Apple’s server (which has the unencrypted database) and will only trigger an alert to Apple if multiple photos match its reporting criteria.
I don’t know anything about Apple’s neural matching system so I’m hopeful it’s just designed to find known content and not new content!

But knowing this uses a neural net raises all kinds of concerns about adversarial ML, concerns that will need to be evaluated.
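To make the threshold behavior concrete, here is a minimal sketch of the “alert only when multiple photos match” logic. Everything in it (the 64-bit hash type, the distance cutoff, the threshold, the function names) is an illustrative assumption, not Apple’s API. In particular, Apple’s actual protocol reportedly keeps the hash database on the server behind a cryptographic two-party computation, which this plain local lookup deliberately ignores.

```swift
// Hypothetical sketch only: a 64-bit perceptual/neural hash compared by
// Hamming distance. All names and parameters here are illustrative.
typealias ImageHash = UInt64

/// Number of differing bits between two hashes; a small distance means
/// the underlying images are perceptually similar.
func hammingDistance(_ a: ImageHash, _ b: ImageHash) -> Int {
    (a ^ b).nonzeroBitCount
}

/// Returns true only once at least `threshold` photos fall within
/// `maxDistance` of some known hash, mirroring the "alert Apple only if
/// multiple photos match" behavior described above.
func shouldAlert(photoHashes: [ImageHash],
                 knownHashes: [ImageHash],
                 maxDistance: Int = 4,
                 threshold: Int = 10) -> Bool {
    var matches = 0
    for photo in photoHashes {
        if knownHashes.contains(where: { hammingDistance(photo, $0) <= maxDistance }) {
            matches += 1
            if matches >= threshold { return true }
        }
    }
    return false
}
```

The threshold is the privacy-relevant design choice: a single chance collision stays on the device, and only an accumulation of matches produces a report.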
5 Aug
I wrote the previous thread in a hurry and didn’t take time to spell out what it means or what the background is. So let me try again.
For the past decade, providers like Apple, WhatsApp/Facebook, Snapchat, and others have been adding end-to-end encryption to their text messaging and video services. This has been a huge boon for privacy. But governments have been opposed to it.
Encryption is great for privacy, but also makes (lawful) surveillance hard. For years national security agencies and law enforcement have been asking for “back doors” so that police can wiretap specific users. This hasn’t been very successful.
4 Aug
I’ve had independent confirmation from multiple people that Apple is releasing a client-side tool for CSAM scanning tomorrow. This is a really bad idea.
These tools will allow Apple to scan your iPhone photos for photos that match a specific perceptual hash, and report them to Apple servers if too many appear.
Initially I understand this will be used to perform client side scanning for cloud-stored photos. Eventually it could be a key ingredient in adding surveillance to encrypted messaging systems.
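For context on “perceptual hash”: unlike a cryptographic hash, a perceptual hash is designed so that visually similar images produce similar hash values. Below is a toy sketch of one classic construction, the “average hash” (aHash); it is purely illustrative and nothing like the neural system Apple actually ships.

```swift
// Toy "average hash" (aHash), a classic perceptual hash, shown purely for
// intuition. It assumes an 8x8 grayscale thumbnail, one brightness value
// per pixel; real systems, including Apple's neural variant, are far more
// elaborate.
func averageHash(pixels: [UInt8]) -> UInt64 {
    precondition(pixels.count == 64, "expects an 8x8 grayscale thumbnail")
    let mean = pixels.reduce(0) { $0 + Int($1) } / 64
    var hash: UInt64 = 0
    // Set bit i when pixel i is brighter than the image's average brightness.
    for (i, p) in pixels.enumerated() where Int(p) > mean {
        hash |= UInt64(1) << i
    }
    return hash
}
```

Because small edits (recompression, resizing, light cropping) flip only a few of the 64 bits, candidates are matched by Hamming distance rather than exact equality, which is also what makes adversarial collisions a live concern.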
