I’ve belatedly come to believe that we blew it by focusing on secure messaging, while Silicon Valley quietly built their unencrypted backup infrastructure and doomed most of our efforts.
I think people at Apple knew this back in ~2014, which is why they poured so much effort into an (ultimately doomed) attempt to deploy end-to-end encrypted iCloud backup. But they were too late.
By the time they got close to deploying it, governments had realized the value of what Apple (and Google) had built. There was no way they were going to let that resource be taken from them.
Apple’s photo scanning plan last summer was the first push to mobilize that backup repository for mass surveillance (of genuinely bad people), and perhaps Apple’s attempt to strike a limited bargain.
I’m still curious what the future holds for backup. It’s encouraging that Google has deployed some end-to-end encryption in Android Backup, but it’s not enough. WhatsApp has done the same for its backups. Apple’s content-scanning plans make me doubt they’ll ever do encrypted backup now.
• • •
I think this is a very worrying idea, and one I’ve seen from many thoughtful policy experts: namely, that laws are the only solution to the surveillance nightmare we’ve created with newer technologies.
The problem today is that we’ve opened up a virtually infinite feed of potential surveillance data, then centralized it in a few poorly-secured repositories. Governments can’t resist accessing this data; don’t ask them to try.
There is this idea among policymakers that if we just democracy harder, we can somehow bring digital privacy expectations back to where they were in the 1980s (or even the early 2000s). But even if we had a working democracy, we can’t. It’s too late.
A friend recently lost his wallet with a Tile inside it, and rather than get the thing back, we’ve been watching it wander around Baltimore for the last three weeks.
It turns out that knocking on random doors in Fells Point is a very bad strategy for recovering lost things.
So the remote client picks the DH parameters (why!) and sends them to you, and you have to carefully check that they’re constructed correctly (pretty sure these checks were added later).
Then there’s this weird thing where the server picks the randomness, which used to be a total security vulnerability since it let the server choose the DH secret (now fixed, I think), plus still more complicated group-membership checks.
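For flavor, here’s a minimal sketch of the kind of checks I mean, with hypothetical function names (not the library’s actual API) and assuming a classic mod-p group with a safe prime p = 2q + 1; the real checks will differ in the details:

    from sympy import isprime  # sympy's primality test; any primality test works

    def validate_dh_params(p: int, g: int) -> None:
        q = (p - 1) // 2
        # p must be a safe prime, or the group can hide small subgroups.
        if not (isprime(p) and isprime(q)):
            raise ValueError("p is not a safe prime")
        # g must generate the order-q subgroup: 1 < g < p-1 and g^q = 1 mod p.
        if not (1 < g < p - 1) or pow(g, q, p) != 1:
            raise ValueError("g does not generate the prime-order subgroup")

    def check_group_membership(p: int, y: int) -> None:
        q = (p - 1) // 2
        # Reject 0, 1, p-1 and anything outside the order-q subgroup; otherwise
        # a malicious peer can confine the shared secret to a tiny subgroup.
        if not (1 < y < p - 1) or pow(y, q, p) != 1:
            raise ValueError("peer value is not in the prime-order subgroup")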
I’m uncomfortable using “1” as an x-coordinate for Shamir, since my stupid intuition tells me “it’s so close to 0” :) But Binance is just letting you blast away at q, 2q, etc.
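To see why that’s fatal: a Shamir share is just an evaluation of a polynomial f over GF(q) with f(0) equal to the secret, so any x-coordinate like q or 2q reduces to 0 mod q and hands back the secret directly. A toy illustration (mine, not Binance’s code):

    import secrets

    # Toy Shamir sketch. Shares are f(x) mod q with f(0) = secret, so any
    # x-coordinate that reduces to 0 mod q leaks the secret itself.
    q = 2**127 - 1  # a Mersenne prime, standing in for the field modulus

    def make_share(secret: int, coeffs: list[int], x: int) -> int:
        # Evaluate f(x) = secret + c1*x + c2*x^2 + ... mod q.
        # Note there is no check that x % q != 0: that's the bug illustrated.
        acc = secret
        for i, c in enumerate(coeffs, start=1):
            acc = (acc + c * pow(x, i, q)) % q
        return acc

    coeffs = [secrets.randbelow(q) for _ in range(2)]  # degree-2 polynomial
    print(make_share(42, coeffs, q))      # prints 42: x = q aliases to x = 0
    print(make_share(42, coeffs, 2 * q))  # prints 42 again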
One of my favorite things about cryptocurrency is you never have to say “well the impact can’t be that bad” because you know someone secured like $100m with it.
Does anyone know how Apple’s privacy protocol for AirTags combines with their anti-tracking features? Seems like the two should be incompatible. (I have some guesses but I’m wondering if it’s officially documented.)
My guess is that the AirTag cycles its broadcast pseudonym at a slower rate than your phone can scan, so if your phone detects many broadcasts of the same pseudonym within some time window it says “ah, an AirTag is near me.” And it links this over many windows.
In other words, Apple is taking advantage of a weakness in their anti-tracking protocol to do tracking, in the service of preventing a different kind of tracking.
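If that guess is right, the detection logic might look something like this sketch (entirely speculative on my part, with made-up window lengths and thresholds, and not Apple’s documented algorithm):

    from collections import defaultdict

    # Speculative sketch of the guessed heuristic: if a rotating BLE pseudonym
    # repeats across enough scan windows, flag it as a tag traveling with us.
    WINDOW_SECONDS = 15 * 60  # assumed scan-window length
    MIN_WINDOWS = 4           # assumed number of windows before alerting

    sightings = defaultdict(set)  # pseudonym -> set of scan windows seen in

    def observe(pseudonym: bytes, timestamp: float) -> bool:
        """Record a broadcast; return True if it looks like a tag following us."""
        window = int(timestamp // WINDOW_SECONDS)
        sightings[pseudonym].add(window)
        # A pseudonym that persists across many scan windows is rotating slower
        # than the phone scans: likely an AirTag traveling with us, not a passerby.
        return len(sightings[pseudonym]) >= MIN_WINDOWS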