Matthew Green
I teach cryptography at Johns Hopkins. Mastodon at matthew_d_green@ioc.exchange and BlueSky at https://t.co/GI4QlxYTdk.
Mar 31 6 tweets 1 min read
This thing Facebook did — running an MITM on Snapchat and other competitors’ TLS connections via their Onavo VPN — is so deeply messed up and evil that it completely changes my perspective on what that company is willing to do to its users. I don’t come from a place of deep trust in big tech corporations. But this stuff seems like it crosses a pretty clear red line, maybe even a criminal one.
Mar 12 12 tweets 3 min read
Google has a blog up discussing their threat modeling when deploying “post-quantum” (PQC) cryptographic algorithms. It’s an interesting read. bughunters.google.com/blog/510874798… To elaborate a bit on what’s in the blog post, we know that quantum algorithms exist, in principle, that can break many of the cryptographic algorithms we routinely use. All we’re waiting for now is a capable enough quantum computer to run them. (And this seems hard.) 1/
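A common mitigation while we wait is hybrid deployment: run a classical key exchange and a post-quantum KEM side by side and feed both shared secrets into one KDF, so an attacker has to break both. Here's a minimal sketch of that combining step, using X25519 from the `cryptography` package and a placeholder function standing in for a real PQ KEM (the placeholder and names are my assumptions, not code from the blog post):

```python
# Hybrid key derivation sketch: combine a classical (X25519) shared secret
# with a post-quantum KEM shared secret so that breaking either one alone
# is not enough to recover the session key.
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.hkdf import HKDF
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey

def pq_kem_shared_secret() -> bytes:
    # Placeholder for a real post-quantum KEM (e.g. ML-KEM/Kyber).
    # Returns random bytes so the sketch runs end to end.
    return os.urandom(32)

# Classical ECDH between two parties.
alice_priv = X25519PrivateKey.generate()
bob_priv = X25519PrivateKey.generate()
classical_secret = alice_priv.exchange(bob_priv.public_key())

# Post-quantum shared secret (placeholder).
pq_secret = pq_kem_shared_secret()

# Concatenate both secrets and run them through HKDF to get the session key.
session_key = HKDF(
    algorithm=hashes.SHA256(),
    length=32,
    salt=None,
    info=b"hybrid-handshake-demo",
).derive(classical_secret + pq_secret)
print(session_key.hex())
```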
Mar 5 11 tweets 2 min read
A thing I worry about in the (academic) privacy field is that our work isn’t really improving privacy globally. If anything it would be more accurate to say we’re finding ways to encourage the collection and synthesis of more data, by applying a thin veneer of local “privacy.” I’m referring to the rise of “private” federated machine learning and model-building work, where the end result is to give corporations new ways to build models from confidential user data. This data was previously inaccessible (by law or customer revulsion) but now is fair game.
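For readers outside the field, the "local privacy" being sold here is usually something like randomized response or local differential privacy: each user perturbs their own data before it leaves the device, and the collector de-biases the aggregate. A toy sketch of the idea (illustrative only, not any particular company's system):

```python
# Randomized response: the classic "local privacy" mechanism.  Each user
# flips their true bit with some probability before reporting it, and the
# collector can still estimate the population average after de-biasing.
import random

def randomize(bit: int, p_truth: float = 0.75) -> int:
    # With probability p_truth report the true bit, otherwise a fair coin.
    if random.random() < p_truth:
        return bit
    return random.randint(0, 1)

# Simulate 100k users, 30% of whom have the sensitive attribute.
true_bits = [1 if random.random() < 0.30 else 0 for _ in range(100_000)]
reports = [randomize(b) for b in true_bits]

# De-bias: E[report] = p * true_mean + (1 - p) * 0.5
p = 0.75
observed = sum(reports) / len(reports)
estimate = (observed - (1 - p) * 0.5) / p
print(f"true mean ~0.30, estimated mean {estimate:.3f}")
```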
Feb 21 18 tweets 4 min read
So Apple has gone and updated the iMessage protocol to incorporate both forward security (very good!) and post-quantum cryptography. security.apple.com/blog/imessage-… This is a big deal because iMessage (which gets no real attention from anyone) is one of the most widely-adopted secure communications protocols in the world. At least 1 billion people use it, all over the world. It’s the only widely-available encrypted messaging app in China.
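Quick primer on the forward-security half: the idea is that compromising your device today shouldn't expose messages you already deleted, and the standard way to get that is to ratchet key material forward through a one-way function and erase the old state. A toy symmetric-ratchet sketch of the concept (an illustration, not Apple's actual PQ3 construction):

```python
# Toy symmetric ratchet: each message key is derived from the current chain
# key, then the chain key is advanced with a one-way function and the old
# value is discarded.  Compromising the current state does not reveal
# previously deleted chain keys -- that's what "forward security" buys you.
import hashlib
import hmac

def kdf(chain_key: bytes, label: bytes) -> bytes:
    return hmac.new(chain_key, label, hashlib.sha256).digest()

chain_key = hashlib.sha256(b"initial shared secret").digest()
for i in range(3):
    message_key = kdf(chain_key, b"message")   # key used to encrypt message i
    chain_key = kdf(chain_key, b"ratchet")     # advance; old chain key is dropped
    print(f"msg {i}: {message_key.hex()[:16]}...")
```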
Dec 27, 2023 18 tweets 5 min read
Article on some new research that finds ways to balance privacy and stalker detection for AirTags and other location trackers. This is a collaboration with my students @gabrie_beck, Harry Eldridge and colleagues Abhishek Jain and Nadia Heninger. wired.com/story/apple-ai… TL;DR thread. When Apple launched their “Find My” system for lost devices in 2019, they designed a clever solution to keep bad actors (including Apple) from tracking users. This works by making devices change their broadcast identifier every 15 minutes. blog.cryptographyengineering.com/2019/06/05/how…
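The rotation idea, in sketch form: derive a short-lived broadcast identifier from a per-device secret and the current time window, so broadcasts from different windows can't be linked without the secret. This is a simplified stand-in (the real Find My design rotates elliptic-curve public keys, not HMAC outputs):

```python
# Rotating broadcast identifiers: derive a short-lived identifier from a
# per-device secret and the current 15-minute window, so observers who see
# consecutive broadcasts cannot link them without the secret.
import hashlib
import hmac
import time

DEVICE_SECRET = b"per-device secret provisioned at setup"  # hypothetical
ROTATION_SECONDS = 15 * 60

def current_identifier(now: float) -> bytes:
    window = int(now // ROTATION_SECONDS)
    return hmac.new(DEVICE_SECRET, window.to_bytes(8, "big"),
                    hashlib.sha256).digest()[:6]

print(current_identifier(time.time()).hex())
```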
Nov 12, 2023 12 tweets 3 min read
I’m a sucker for crypto papers that do insane things like build ciphertexts out of garbled circuits, and then use the garbled circuit to do stuff that only shows up in the security reduction. E.g.: eprint.iacr.org/2023/1058 So what’s fun about this paper is that it’s trying to do something weirdly hard: build cryptosystems that allow you to encrypt (functions of) secret keys. This can be encrypting your own secret key, or, e.g., I can encrypt your secret key and you can encrypt mine to form a “cycle”.
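The simplest instance of the problem is a two-key "cycle": encrypt key A under key B and key B under key A. Constructing the ciphertexts is trivial, as the sketch below shows; proving anything about their security under standard assumptions is the hard part, which is why key-dependent-message (KDM) security is its own research area. (AES-GCM here is just for illustration, not the paper's scheme.)

```python
# A two-key encryption cycle: Enc_B(A) and Enc_A(B).  Building it is easy;
# arguing it's safe under standard assumptions is what KDM-security papers
# are about.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key_a = AESGCM.generate_key(bit_length=128)
key_b = AESGCM.generate_key(bit_length=128)

# Fresh random nonces; in a real system you'd keep them alongside the
# ciphertexts in order to decrypt later.
ct_a_under_b = AESGCM(key_b).encrypt(os.urandom(12), key_a, None)
ct_b_under_a = AESGCM(key_a).encrypt(os.urandom(12), key_b, None)
print(len(ct_a_under_b), len(ct_b_under_a))
```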
Oct 29, 2023 4 tweets 1 min read
So Apple deployed an entire key transparency thing for iMessage and it literally seems to be documented in a blog post. What the heck is the point of key transparency if you don’t document things, and (critically) provide open source ID verification tools? Key transparency is about deterring attacks. But it doesn’t deter them if you keep it all secret, Apple!
Oct 13, 2023 10 tweets 2 min read
Oh god: “Mathematician warns US spies may be weakening next-gen encryption.” 🙄 newscientist.com/article/239651… For the record, whatever issues have come up in the PQC competition, this is absolutely not the right way to address them.
Sep 29, 2023 8 tweets 2 min read
If anyone thought that the EU legislation on content scanning would be limited, you can forget about that. Europol has demanded unfiltered access to all data produced by these systems. balkaninsight.com/2023/09/29/eur…
To be clear what this means: these scanning systems may produce huge numbers of false positives. That means your private, encrypted messages get decrypted and handed over to the police *even if you haven’t sent anything illegal.*
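Some rough base-rate arithmetic shows why this matters: even a very accurate classifier, applied to the message volume of a large platform, flags an enormous number of innocent conversations. The volume and error-rate numbers below are illustrative assumptions, not measured figures:

```python
# Base-rate arithmetic: even a tiny false-positive rate, applied to the
# message volume of a large E2E platform, produces a flood of innocent
# conversations flagged for decryption and review.
daily_messages = 10_000_000_000      # assumed messages scanned per day
false_positive_rate = 1e-4           # assumed 0.01% FPR per message

false_flags_per_day = daily_messages * false_positive_rate
print(f"{false_flags_per_day:,.0f} innocent messages flagged per day")
# -> 1,000,000 innocent messages flagged per day under these assumptions
```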
Sep 23, 2023 9 tweets 3 min read
I wonder who exactly is paying for the ads and what their specific business interests are. Like if I was in the adtech or data brokerage industry, I’d sure love these ads. Encryption is bad! Apple is too private. Let’s pass some laws to “protect the children.”
Sep 19, 2023 5 tweets 2 min read
New leak from the Snowden documents. To give some context, here are the contents of an initial Snowden leak from September 2013. Cavium was a leading manufacturer of cryptographic co-processors for VPN devices at that time. archive.nytimes.com/www.nytimes.co…
Jul 27, 2023 4 tweets 1 min read
I’m just catching up on Web Integrity but it looks really concerning. Basically adds DRM to your browser so only approved browsers can access certain sites. What worries me about this is that the web is currently one of the only open alternatives to the app stores. Closing it down (even if there are benefits) seems like it will make government control a lot easier.
Jul 12, 2023 6 tweets 1 min read
Computer security would be about 80% solved if we just deprecated every technology shown in this graphic. Diagram from:
Jul 3, 2023 9 tweets 3 min read
Too much timing data is available even from encrypted messaging apps, when a passive adversary surveills the network links for a whole country. It might be smart to add some kind of delayed delivery feature. nytimes.com/2023/07/03/tec… This paper looked at Signal’s Sealed Sender back in ’21 and showed that you could recover sender/recipient information after seeing a few (encrypted) messages, because of things like delivery receipts. No idea if there’s a fix for this. ndss-symposium.org/wp-content/upl…
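A delayed-delivery defense would mean holding each message for a random interval before forwarding it, so send and receive events seen on the wire stop lining up. A toy sketch of the idea (my illustration; not a feature of Signal or any deployed app):

```python
# Toy delay defense: hold each message for a random delay before delivery so
# a passive network observer cannot correlate a send event on one link with
# a receive event on another purely by timing.
import asyncio
import random

async def deliver_with_jitter(message: str, max_delay: float = 30.0) -> None:
    delay = random.uniform(0.0, max_delay)
    await asyncio.sleep(delay)
    print(f"delivered after {delay:.1f}s: {message}")

async def main() -> None:
    await asyncio.gather(*(deliver_with_jitter(f"msg {i}", max_delay=5.0)
                           for i in range(3)))

asyncio.run(main())
```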


May 17, 2023 5 tweets 2 min read
The EU Council is continuing to debate a law that would require communication providers to scan all communications, potentially including end-to-end encrypted conversations. And they are now debating including audio conversations as well. It’s not clear to me precisely what content scanning for audio conversations would entail, but it seems to involve some kind of AI system routinely listening to your phone conversations.
Apr 21, 2023 9 tweets 2 min read
My wife was looking for pictures of our kids on my phone, and found a photo from a topless beach. Which immediately led to a lot of marital awkwardness and worry (on my part) that somehow I took this creepy photo and also maybe that I have Alzheimer’s. A little investigation revealed the photo was from Spain, circa 2017. I wasn’t in Spain in 2017.

Felt like I had just gotten a death row pardon from the governor.
Apr 13, 2023 16 tweets 3 min read
Woohoo! WhatsApp has released key transparency! engineering.fb.com/2023/04/13/sec… So here’s a thread on key transparency, and why this is a big deal. 1/
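The core mechanism: the provider publishes an append-only, auditable log of (user, public key) entries, and clients verify that the key the server hands them is really in that log using a Merkle inclusion proof. A minimal sketch of that verification step (hash format and proof encoding are simplified assumptions, not WhatsApp's actual wire format):

```python
# Minimal Merkle inclusion proof check, the primitive underneath key
# transparency logs: given a leaf, a list of sibling hashes, and the
# published root, recompute the path and compare.
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def verify_inclusion(leaf: bytes, proof: list[tuple[bytes, str]], root: bytes) -> bool:
    node = h(b"\x00" + leaf)                      # leaf hash
    for sibling, side in proof:                   # side: "L" if sibling is on the left
        pair = sibling + node if side == "L" else node + sibling
        node = h(b"\x01" + pair)                  # interior node hash
    return node == root

# Tiny two-leaf tree: root = H(1 || H(0||leaf_a) || H(0||leaf_b))
leaf_a, leaf_b = b"alice:pubkey", b"bob:pubkey"
root = h(b"\x01" + h(b"\x00" + leaf_a) + h(b"\x00" + leaf_b))
print(verify_inclusion(leaf_a, [(h(b"\x00" + leaf_b), "R")], root))  # True
```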
Mar 28, 2023 6 tweets 2 min read
So a giant box just showed up in the CS department mailroom addressed to me. It’s from Vietnam and the packing list says “shirt.” Uh oh.
Mar 26, 2023 5 tweets 1 min read
The future of censorship-resistant communications is going to be distributing LLMs trained on dissident content, rather than the content itself. Imagine “the anarchist cookbook” but it’s a device-local chatbot that will answer all your (technical and ideological) questions interactively and persuasively.
Mar 13, 2023 4 tweets 1 min read
Where we’re going we’re gonna need a lot more RAM. Apparently the beefiest MacBook only has 96GB and you have to buy the super-high-tech GPU just to get that.
Mar 10, 2023 9 tweets 4 min read
The EU’s “chat control” legislation is the most alarming proposal I’ve ever read. Taken in context, it is essentially a design for the most powerful text and image-based mass surveillance system the free world has ever seen. This legislation, which is initially targeted at child abuse applications, creates the infrastructure to build in mandatory automated scanning tools that will search for *known* media, *unknown* media matching certain descriptions, and textual conversations.
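For the *known* media case, scanning generally means fingerprinting each item and checking it against a database of flagged fingerprints. A bare-bones sketch of that matching step (real proposals use perceptual hashes, which survive resizing and recompression and are also where false positives creep in; an exact SHA-256 stands in for that here):

```python
# Bare-bones "known media" matching: fingerprint each item and check it
# against a database of flagged fingerprints.  An exact SHA-256 stands in
# for the perceptual hashes real systems use.
import hashlib

flagged_hashes = {
    hashlib.sha256(b"example flagged image bytes").hexdigest(),  # hypothetical DB entry
}

def scan(image_bytes: bytes) -> bool:
    return hashlib.sha256(image_bytes).hexdigest() in flagged_hashes

print(scan(b"an ordinary holiday photo"))        # False
print(scan(b"example flagged image bytes"))      # True
```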