I’ve had independent confirmation from multiple people that Apple is releasing a client-side tool for CSAM scanning tomorrow. This is a really bad idea.
These tools will allow Apple to scan your iPhone photo library for images that match a specific perceptual hash, and report them to Apple’s servers if too many matches appear.
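To make the matching idea concrete, here is a minimal sketch. It uses a deliberately simple “average hash” as a stand-in, since Apple’s actual NeuralHash algorithm is proprietary and not described here; the function names are mine.

```python
# Toy client-side matcher: compute a perceptual hash of each photo and
# check it against a database of flagged hashes. The "average hash"
# below is a simple stand-in for Apple's proprietary NeuralHash.

def average_hash(pixels):
    """pixels: flat list of grayscale values from a downscaled image."""
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        # One bit per pixel: 1 if brighter than the image mean, else 0.
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def matches_database(photo_pixels, flagged_hashes):
    """True if this photo's perceptual hash is in the flagged set."""
    return average_hash(photo_pixels) in flagged_hashes
```

The key property of any perceptual hash is that visually similar images hash to the same (or nearby) values, unlike a cryptographic hash where any change flips the output.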
Initially, I understand, this will be used to perform client-side scanning for cloud-stored photos. Eventually it could be a key ingredient in adding surveillance to encrypted messaging systems.
The ability to add scanning systems like this to E2E messaging systems has been a major “ask” by law enforcement the world over. Here’s an open letter signed by former AG William Barr and other western governments. justice.gov/opa/pr/attorne…
This sort of tool can be a boon for finding child pornography on people’s phones. But imagine what it could do in the hands of an authoritarian government. google.com/amp/s/www.nyti…
The way Apple is doing this launch, they’re going to start with non-E2E photos that people have already shared with the cloud. So it doesn’t “hurt” anyone’s privacy.

But you have to ask why anyone would develop a system like this if scanning E2E photos wasn’t the goal.
But even if you believe Apple won’t allow these tools to be misused 🤞 there’s still a lot to be concerned about. These systems rely on a database of “problematic media hashes” that you, as a consumer, can’t review.
The hashes are generated using a new and proprietary neural hashing algorithm that Apple has developed, and gotten NCMEC to agree to use.

We don’t know much about this algorithm. What if someone can make collisions?
Imagine someone sends you a perfectly harmless political media file that you share with a friend. What if that file shares a hash with a known CSAM file?
These images are from an investigation using a much simpler hash function than the new one Apple’s developing. They show how machine learning can be used to find such collisions. towardsdatascience.com/black-box-atta…
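Here is a toy collision against the same weak “average hash” used above, just to show why collisions in perceptual hashes are plausible at all. This is not Apple’s NeuralHash; the point is that any hash built on coarse visual features admits very different inputs with identical outputs.

```python
# Toy collision against a weak "average hash" (a stand-in, not Apple's
# NeuralHash): two images with very different pixel values collide
# because only the above/below-mean pattern contributes to the hash.

def average_hash(pixels):
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

img_a = [0] * 32 + [255] * 32    # hard black/white split
img_b = [100] * 32 + [140] * 32  # faint gray contrast, same pattern

assert img_a != img_b
assert average_hash(img_a) == average_hash(img_b)  # same hash anyway
```

Finding collisions against a neural hash is harder than this, but the linked article shows that black-box adversarial ML attacks can do it against real perceptual hashes.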
The idea that Apple is a “privacy” company has bought them a lot of good press. But it’s important to remember that this is the same company that won’t encrypt your iCloud backups because the FBI put pressure on them. google.com/amp/s/mobile.r…

More from @matthew_d_green

5 Aug
Yesterday we were gradually headed towards a future where less and less of our information had to be under the control and review of anyone but ourselves. For the first time since the 1990s we were taking our privacy back. Today we’re on a different path.
I know the people who did this have good intentions. They think this was inevitable, that we can control it. That it’ll be used only for good, and if it isn’t used for good then that would have happened anyway.
I was alive in the 1990s. I remember we had things like computers that weren’t connected to the Internet, and photo albums that weren’t subject to continuous real-time scanning. Society seemed… stable?
5 Aug
Reading through the analysis. This is not… a security review.
“If we assume there is no adversarial behavior in the security system, then the system will almost never malfunction. Since confidentiality is only broken when this system malfunctions, the system is secure.”
Don’t worry though. There is absolutely no way you can learn which photos the system is scanning for. Why is this good? Doesn’t this mean the system can literally scan for anything with no accountability? Not addressed.
5 Aug
A small update from last night. I described Apple’s matching procedure as a perceptual hash function. Actually it’s a “neural matching function”. I don’t know if that means it will also find *new* content on your device or just known content.
Also, it will use a 2-party process where your phone interacts with Apple’s server (which has the unencrypted database) and will only trigger an alert to Apple if multiple photos match its reporting criteria.
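The threshold behavior described above can be modeled very roughly as follows. In the real design the server reportedly cannot decrypt anything below the threshold (via threshold cryptography); this toy only models *when* an alert fires, not the cryptography, and all names and the threshold value are my invention.

```python
# Toy model of threshold reporting: the client uploads one voucher per
# photo; the server only acts once enough vouchers match the database.
# Real vouchers would be encrypted; this sketch counts in the clear.

THRESHOLD = 5  # assumed value for illustration; Apple's is not stated here

def server_check(vouchers, flagged_hashes, threshold=THRESHOLD):
    """Return 'alert' only if at least `threshold` photos match."""
    matching = [v for v in vouchers if v["hash"] in flagged_hashes]
    if len(matching) >= threshold:
        return "alert"
    return "no-action"
```

The design rationale is that a single false-positive match (e.g., a hash collision) should not by itself expose a user to review.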
I don’t know anything about Apple’s neural matching system so I’m hopeful it’s just designed to find known content and not new content!

But knowing this uses a neural net raises all kinds of concerns about adversarial ML, concerns that will need to be evaluated.
5 Aug
So I wrote this previous thread in a hurry and didn’t take time to spell out what it means, and what the background is. So let me try again.
For the past decade, providers like Apple, WhatsApp/Facebook, Snapchat, and others have been adding end-to-end encryption to their text messaging and video services. This has been a huge boon for privacy. But governments have been opposed to it.
Encryption is great for privacy, but also makes (lawful) surveillance hard. For years national security agencies and law enforcement have been asking for “back doors” so that police can wiretap specific users. This hasn’t been very successful.
20 Jul
There is a take that companies like Apple are never going to be able to stop well-resourced attackers like NSO from launching targeted attacks. At the extremes this take is probably correct. But adopting cynicism as a strategy is a bad approach. 1/
First, look at how Pegasus and other targeted exploits get onto your phone. Most approaches require some user interaction: a compromised website or a phishing link that users have to click.

iMessage, on the other hand, is an avenue for 0-click targeted infection. 2/
While we can’t have “perfect security”, closing down avenues for interactionless targeted infection sure seems like a thing we can make some progress on. 3/
9 Jul
Every article I read on (ZK) rollups almost gets to the real problem, and then misses it. The real problem is the need for storage. ZK proofs won’t solve this.
I keep reading these articles that talk about the problems with rollups. And they’re good articles! E.g.: medium.com/dragonfly-rese…
But they always reach a point where they realize that the problem is state storage, and then they handwave that the solution is going to be something like Mina or zkSync, which don’t fully solve the state storage problem.
