With this latest news about Apple scanning photos on everyone's devices, I just want to take a moment to say

GEE, THANKS BUDDY 👏👏👏

to everyone who helped make it difficult to publish academic work on safer, limited access to encrypted data.
We've known that this was coming for a long time. I personally got worried when presidential nominee Hillary Clinton talked about needing access to encrypted data.

But the "Crypto Wars" go back many years before that.
In all those years of relative peace and calm, academic researchers could have been talking about how to build systems with real limits built in.

Not like "We pinky swear these hashes are only for bad files." Real, hard technical limits.
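To make that distinction concrete, here is a minimal, purely illustrative Python sketch of the "pinky swear" design: a plain scan of local photos against an opaque blocklist of hashes. Nothing in the mechanism itself limits what the list can contain; the only safeguard is trust in whoever curates it. The hash function, file layout, and names below are placeholder assumptions, not anyone's actual system.

```python
# Illustrative sketch only (NOT Apple's system): a naive scan-against-blocklist
# design. The blocklist is an opaque set of hashes supplied by some authority;
# nothing technical stops it from containing hashes of anything at all.
# That is the "pinky swear" limit.
import hashlib
from pathlib import Path

def file_hash(path: Path) -> str:
    """Cryptographic hash of a file's bytes. (Real deployments use perceptual
    hashes so near-duplicate images still match; this is a simplification.)"""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def scan(photo_dir: Path, blocklist: set[str]) -> list[Path]:
    """Report every photo whose hash appears on the blocklist."""
    return [p for p in photo_dir.glob("*.jpg") if file_hash(p) in blocklist]

# The scanner cannot tell whether `blocklist` holds CSAM hashes or anything
# else. A real, hard technical limit would have to live in the protocol itself
# (thresholds, multi-party checks, verifiable list properties), not in a
# promise about how the set was assembled.
```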
But nope, you guys were all dead set against even *thinking* about this stuff. Because it might give bad people ideas.
So, when Apple decided they needed *some* way to scan for CSAM, *something* to show that they care about a problem that plays extremely well with the public, what did they find?

Well, if they looked in the academic literature, they found a paper here and there, but not much.
I wrote one of those - two, actually - together with @mvaria in 2018. And you would not believe the hoops we had to jump through to get that first paper published.

We got all positive reviews, and STILL had to fight through a rebuttal process to get in.
They made us write a section on "ethics". Like, "You've been bad boys, now go think about what you've done until you change your mind."

Ha, but @mvaria knocked it out of the park with an awesome discussion of the challenges and tradeoffs.
After publication, we had a phone call with someone in DC who was somewhat- or mostly-supportive, but who complained that we were making it harder to claim that "the science is settled."

Sorry bro, that's not how science works.
We got called a few bad names on Twitter, but so what.
None of the stuff after publication really bothered us. I was up for tenure and DGAF which way it went, and my coauthor was in a safe job at the time.

But the message to younger researchers was crystal clear: Don't go there, or you're gonna have a bad time.
And for the most part, other people have avoided looking into the whole "exceptional access" / "crypto wars" problem. Can you blame them?

A few neat works have popped up here and there, but nothing like you might expect given the cultural relevance of the problem.
So, if you've been doing your part to make this research difficult, we've now got the world's largest corporation scanning people's photos, using an opaque blocklist from a secretive quasi-government org with zero accountability.

I hope you're f***ing happy.
On the other hand, if you were one of the reviewers or organizers, etc., who said "I hate this stuff, and I hope we never need it, but I won't block it from being published" --

Then, for whatever it's worth, you have my deepest respect and my heartfelt thanks. 🙏
Unfortunately it looks like we *are* going to need more work in this space, and soon.

Hopefully the next generation can do better.


