With this latest news about Apple scanning photos on everyone's devices, I just want to take a moment to say
GEE, THANKS BUDDY
to everyone who helped make it difficult to publish academic work on safer, limited access to encrypted data.
We've known that this was coming for a long time. I personally got worried when presidential nominee Hillary Clinton talked about needing access to encrypted data.
But the "Crypto Wars" go back many years before that.
In all those years of relative peace and calm, academic researchers could have been talking about how to build systems with real limits built in.
Not like "We pinky swear these hashes are only for bad files". Real hard, technical limits.
But nope, you guys were all dead set against even *thinking* about this stuff. Because it might give bad people ideas.
So, when Apple decided they needed *some* way to scan for CSAM, *something* to show that they care about a problem that plays extremely strong with the public, what did they find?
Well, if they looked in the academic literature, they found a paper here and there, but not much.
I wrote one of those - two actually - together with @mvaria in 2018. And you would not believe the hoops we had to jump through to get that first paper published.
We got all positive reviews, and STILL had to fight through a rebuttal process to get in.
They made us write a section on "ethics". Like, "You've been bad boys, now go think about what you've done until you change your mind."
Ha, but @mvaria knocked it out of the park with an awesome discussion of the challenges and tradeoffs.
After publication, we later had a phone call with someone in DC who was somewhat- or mostly-supportive, but who complained that we were making it harder to claim that "The science is settled."
Sorry bro, that's not how science works.
We got called a few bad names on Twitter, but so what.
None of the stuff after publication really bothered us. I was up for tenure and DGAF which way it went, and my coauthor was in a safe job at the time.
But the message to younger researchers was crystal clear: Don't go there, or you're gonna have a bad time.
And for the most part, other people have avoided looking into the whole "exceptional access" / "crypto wars" problem. Can you blame them?
A few neat works have popped up here and there, but nothing like what you might expect given the cultural relevance of the problem.
So, if you've been doing your part to make this research difficult, we've now got the world's largest corporation scanning people's photos, using an opaque blocklist from a secretive quasi-government org with zero accountability.
I hope you're f***ing happy.
On the other hand, if you were one of the reviewers or organizers etc who said "I hate this stuff, and I hope we never need it, but I won't block it from being published" --
Then, for whatever it's worth, you have my deepest respect and my heartfelt thanks.
Unfortunately it looks like we *are* going to need more work in this space, and soon.
Hopefully the next generation can do better.