Although Facebook is the primary target of this pressure campaign, it’s hard not to notice how closely Apple’s client-side scanning announcement fits with the UK government’s desires.
Don’t listen to anyone who tells you “they’ll never give in to government pressure” when it’s obvious they already are.
The serious purpose of this campaign (which started in 2019, by the way) is to make it difficult to deploy new encrypted services. The UK doesn’t actually care whether anyone figures out how to scan end-to-end encrypted content.
Some people think they’re going to be clever by deploying client-side scanning, thus giving the UK “what it’s asking for.” In the short term this might work, but only because the UK government would be happy to let competitors throw Facebook under the bus. It won’t work in the long term.
The main law enforcement ask (and the UK is just the part of the global law enforcement community with the most invasive laws) is retrospective access to data. They also want these real-time scanning systems, but they’re terrified that encryption means losing access to that data.
If somehow it looked like we were headed to a world with end-to-end encryption and client-side CSAM scanning, law enforcement wouldn’t be mollified. They’d just change their strategy.
• • •
The major surprise to me in reviewing this code is how immature the JS/Node/browser crypto ecosystem still is in 2022. I wanted to say “just use <standard library>,” but: what should that library be?
So instead of having proper, well-maintained crypto libraries for securing all these assets, we have libraries from individual contributors. This is where OpenSSL was in 2000.
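One candidate answer, at least for the basic primitives, is the WebCrypto API that ships with browsers and recent versions of Node. Here’s a minimal sketch of password-based encryption with it; the iteration count, parameters, and the `encryptSecret` helper are illustrative assumptions of mine, not MetaMask’s (or anyone else’s) actual scheme:

```ts
// A minimal sketch of password-based encryption using the built-in WebCrypto
// API (globalThis.crypto in browsers and recent Node). The PBKDF2 iteration
// count and AES-GCM parameters are illustrative defaults, not any wallet's
// actual scheme.
async function encryptSecret(password: string, plaintext: Uint8Array) {
  const enc = new TextEncoder();
  const salt = crypto.getRandomValues(new Uint8Array(16)); // per-encryption salt
  const iv = crypto.getRandomValues(new Uint8Array(12));   // 96-bit GCM nonce

  // Stretch the password into an AES-256-GCM key with PBKDF2.
  const baseKey = await crypto.subtle.importKey(
    "raw", enc.encode(password), "PBKDF2", false, ["deriveKey"]);
  const key = await crypto.subtle.deriveKey(
    { name: "PBKDF2", salt, iterations: 600_000, hash: "SHA-256" },
    baseKey,
    { name: "AES-GCM", length: 256 },
    false,
    ["encrypt"]);

  // AES-GCM provides confidentiality and integrity in a single call.
  const ciphertext = await crypto.subtle.encrypt(
    { name: "AES-GCM", iv }, key, plaintext);

  // Salt and IV are not secret; store them alongside the ciphertext.
  return { salt, iv, ciphertext: new Uint8Array(ciphertext) };
}
```

Even then, WebCrypto doesn’t cover things like secp256k1 signing or mnemonic handling, so wallets still end up pulling in third-party packages, which is exactly where the ecosystem problem shows up.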
I decided to look at MetaMask’s crypto, and oh wow, I wish I could unsee it.
To be clear, I haven’t even made it to a lot of the core routines yet. I’m just hunting through piles of poorly commented JS and *hoping* the particular GitHub repo I’m looking at is actually the right one.
I reached a point where I was in someone’s personal GitHub repo thinking “I think this is the right code,” but honestly I don’t know.
I am of the opinion that NFTs are going to be important. But I am also sympathetic to the take below. Don’t mistake *believing in the significance* of a technology for accepting and supporting all of its downsides.
One of the dumbest lessons I’ve learned in my career is that you should never disregard something that has hype behind it, even if you don’t think the tech makes sense.
Most “tech adoption” problems are really human coordination problems. Hype solves those. It doesn’t matter if you have a better solution, or that you think the proposed solution is stupid.
Facebook (ugh, must we call them Meta?) is deploying an image-scanning system to detect revenge porn. The novelty is that the people reporting the images never have to show the originals to Facebook. about.fb.com/news/2021/12/s…
I’m sure this has been carefully thought out. I hope it has, because as described in the post it seems fairly ripe for abuse.
In any case, it’s worth flagging this just in case you thought this image-scanning tech would stop with child sexual abuse media. There is a whole library of content that people want to censor and surveil, often for perfectly benign reasons.
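For context on the mechanism, here’s a hedged sketch of the general idea of hash-based reporting as described in the post: the device computes a fingerprint of the image locally and submits only that fingerprint. Real systems use perceptual hashes so re-encoded or resized copies still match; the SHA-256 stand-in and the endpoint URL below are purely illustrative, not Facebook’s actual implementation.

```ts
// A hedged sketch of hash-based reporting: only a locally computed
// fingerprint of the image leaves the device, never the image itself.
// SHA-256 matches exact bytes only; production systems use perceptual
// hashes so near-duplicates match. The endpoint URL is hypothetical.
async function reportImage(imageBytes: Uint8Array): Promise<void> {
  // Compute the fingerprint on-device.
  const digest = await crypto.subtle.digest("SHA-256", imageBytes);
  const hashHex = Array.from(new Uint8Array(digest))
    .map((b) => b.toString(16).padStart(2, "0"))
    .join("");

  // Upload the fingerprint alone; the service can later compare it
  // against fingerprints of newly uploaded content.
  await fetch("https://example.org/report-hash", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ hash: hashHex }),
  });
}
```

The worry flagged above follows from the shape of the mechanism: the matching list is just a set of opaque fingerprints, so nothing in the design itself limits what kinds of content end up on it.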
I’ve belatedly come to believe that we blew it by focusing on secure messaging, while Silicon Valley quietly built their unencrypted backup infrastructure and doomed most of our efforts.
I think people at Apple knew this back in ~2014, which is why they put so much energy into an (ultimately doomed) effort to deploy end-to-end encrypted iCloud backup. But they were too late.
By the time they got close to deploying it, governments had realized the value of what Apple (and Google) had built. There was no way they were going to let that resource be taken from them.
I think this is a very worrying idea, and one I’ve seen from many thoughtful policy experts: namely, that laws are the only solution to the surveillance nightmare we’ve created with newer technologies.
The problem today is that we’ve opened up a virtually infinite feed of potential surveillance data, then concentrated that data in a few poorly secured, centralized repositories. Governments can’t resist accessing this data; don’t ask them to try.
There is this idea among policymakers that if we just “democracy harder,” we can somehow bring digital privacy expectations back to where they were in the 1980s (or even the early 2000s). But even if we had a working democracy, we can’t. It’s too late.