Dear researchers: the hard part of problems like “traceability” is not the part where you build a mass surveillance system. Building mass surveillance systems is *easy*.
The hard part is building systems that don’t utterly shatter the security guarantees that the private system offered, and don’t have caveats like “obviously this can be abused, stopping that is future work.”
When I go out to see what our research community has been doing in this area, I expect them to understand what makes this research problem hard. Not to find slides like this one.
I don’t usually like to call out fellow researchers for solving only part of the problem, because research is incremental. But there are certain kinds of research where solving only “the easy part” actually makes the world worse.
Don’t 👏 Give 👏 Authoritarian 👏 Regimes 👏 The 👏 Tools 👏 To 👏 Stifle 👏 Dissent

And then claim that preventing them from doing so is “future work”.

Your conference publication is not that valuable.
I’m just going to add that I’m *not* telling people they shouldn’t write papers I disagree with.

Go ahead and write them.

But if the hard part of the problem is “how do we prevent abuse”, that can’t be left to Future Work.
If what you’re looking for is Least Publishable Units, go find a different area to work in. There are lots of areas that don’t affect people’s lives at all.

• • •

Thread by Matthew Green (@matthew_d_green)


