Matthew Green
I teach cryptography at Johns Hopkins. Mastodon at matthew_d_green@ioc.exchange and BlueSky at https://t.co/GI4QlxYTdk. Bluecheck not my decision ;)
Jun 21
I want to agree with the idea that mass scanning “breaks encryption,” but I think the entire question is a category error. Any law that installs surveillance software directly on your phone isn’t “breaking” or “not breaking” encryption; it’s doing exactly what it promises to do.

For decades we (in the West) had no mass surveillance of any communications. Starting in the 2010s, some folks came up with the idea of scanning for illicit content like CSAM uploaded in plaintext on servers. (With apparently relatively little effect on the overall problem.)
Jun 10
So Apple has introduced a new system called “Private Cloud Compute” that allows your phone to offload complex (typically AI) tasks to specialized secure devices in the cloud. I’m still trying to work out what I think about this. So here’s a thread. 1/

Apple, unlike most other mobile providers, has traditionally done a lot of processing on-device. For example, all of the machine learning and OCR text recognition in Photos is done right on your device. 2/
May 28
Some folks are discussing what it means to be a “secure encrypted messaging app.” I think a lot of this discussion is shallow and in bad faith, but let’s talk about it a bit. Here’s a thread. 1/

First: the most critical element that (good) secure messengers protect is the content of your conversations in flight. This is usually done with end-to-end encryption. Messengers like Signal, WhatsApp, Matrix etc. encrypt this data using keys that only the end-devices know. 2/
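The core idea above can be sketched in a few lines: the two endpoints agree on a key between themselves, so the relaying server never learns it. This is a deliberately toy finite-field Diffie-Hellman, not Signal’s actual protocol (which uses X25519 and a double ratchet for forward secrecy); the group parameters here are illustrative only.

```python
import hashlib
import secrets

# Toy Diffie-Hellman key agreement (illustration only -- NOT secure as
# written; real messengers use X25519 plus the double ratchet).
P = 2**255 - 19   # a well-known prime, used here just as a toy modulus
G = 2

def keypair():
    """Generate a private exponent and the matching public value."""
    priv = secrets.randbelow(P - 2) + 1
    pub = pow(G, priv, P)
    return priv, pub

def shared_key(my_priv: int, their_pub: bytes) -> bytes:
    """Both sides compute the same 32-byte key from the exchange."""
    secret = pow(their_pub, my_priv, P)
    return hashlib.sha256(secret.to_bytes(32, "big")).digest()

# Alice and Bob each generate a keypair; only the *public* values ever
# cross the server, so the server cannot derive the message key.
a_priv, a_pub = keypair()
b_priv, b_pub = keypair()
assert shared_key(a_priv, b_pub) == shared_key(b_priv, a_pub)
```

The point of the sketch is the trust boundary: the private exponents never leave the end devices, which is exactly the property the thread is describing.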
May 23
Several people have suggested that the EU’s mandatory chat-scanning proposal was dead. In fact, it seems that Belgium has resurrected it in a “compromise,” and many EU member states are supportive. There’s a real chance this becomes law. dropbox.com/scl/fi/9w611f2…


The basic idea of this proposal is to scan private (and encrypted) messages for child sexual abuse material. The current draft covers only images and videos; previous versions also included text and audio, but the new proposal has for the moment set those aside, because it was too creepy.
May 12
Telegram has launched a pretty intense campaign to malign Signal as insecure, with assistance from Elon Musk. The goal seems to be to get activists to switch away from encrypted Signal to mostly-unencrypted Telegram. I want to talk about this a bit. 1/

First things first, Signal Protocol, the cryptography behind Signal (also used in WhatsApp and several other messengers) is open source and has been intensively reviewed by cryptographers. When it comes to cryptography, this is pretty much the gold standard. 2/
May 9
Seems like we’re getting a major push for activists to switch from Signal to Telegram, which has no encryption by default and a pretty shady history of refusing to add it. Seems like a great idea, hope folks jump all over that.

Someone asked “why the sarcasm?” Please don’t take my last sentence above seriously. Signal is an excellent and confidential platform. Telegram is not. Sometimes it’s worth using a non-confidential platform to reach lots of people (see Twitter) but it’s not a replacement.
May 7
We’re pretty rapidly and consciously heading towards a future where everything you do on the Internet requires government ID, with basically no attention paid to the consequences (indeed, the consequences may be the whole point).

I’ve become a little bit despairing that we can fight this. The pressure on all sides seems much too intense. But we also have very little tech in place to make this safe: and realistically the only people who can develop it work in Cupertino and Mountain View.
May 2
Europe is maybe two months from passing laws that end private communication as we know it, and folks are looking the other way (understandably). You’re not going to get a do-over once these laws are passed.

The plan, to repeat, is to mandate that every phone contains software that receives a list of illicit material (photos, keywords, AI models that can determine the sentiment of conversations) and scans your data for matches *before* it is encrypted, and alerts the police directly.
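To make the “scan before encryption” mechanism concrete, here is a hypothetical sketch (all names invented): the device checks each outgoing attachment against a list of hashes of known illicit images, and reporting happens before any encryption. Real proposals use perceptual hashes, which also match near-duplicates and can misfire on unrelated images; exact SHA-256 matching here is a simplification.

```python
import hashlib

# A hash list the device would receive from the authorities (contents
# invented for illustration).
BLOCKLIST = {hashlib.sha256(b"known-bad-image-bytes").hexdigest()}

def report_match(digest: str) -> None:
    # Stand-in for the reporting channel such a law would mandate.
    print(f"match reported: {digest}")

def scan_then_encrypt(attachment: bytes) -> bool:
    """Return True if the attachment may proceed to encryption."""
    digest = hashlib.sha256(attachment).hexdigest()
    if digest in BLOCKLIST:
        report_match(digest)   # note: this happens *before* encryption
        return False
    # ...only now would the data be end-to-end encrypted and sent...
    return True
```

The key structural point is the ordering: the scanner sees plaintext, so whatever guarantees the encryption provides apply only to data the scanner has already cleared.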
Mar 31
This thing Facebook did — running an MITM on Snapchat and other competitors’ TLS connections via their Onavo VPN — is so deeply messed up and evil that it completely changes my perspective on what that company is willing to do to its users.

I don’t come from a place of deep trust in big tech corporations. But this stuff seems like it crosses a pretty clear red line, maybe even a criminal one.
Mar 12
Google has a blog up discussing their threat modeling when deploying “post-quantum” (PQC) cryptographic algorithms. It’s an interesting read. bughunters.google.com/blog/510874798…

To elaborate a bit on what’s in the blog post, we know that quantum algorithms exist, in principle, that can break many of the cryptographic algorithms we routinely use. All we’re waiting for now is a capable enough quantum computer to run them. (And this seems hard.) 1/
Mar 5
A thing I worry about in the (academic) privacy field is that our work isn’t really improving privacy globally. If anything it would be more accurate to say we’re finding ways to encourage the collection and synthesis of more data, by applying a thin veneer of local “privacy.”

I’m referring to the rise of “private” federated machine learning and model-building work, where the end result is to give corporations new ways to build models from confidential user data. This data was previously inaccessible (by law or customer revulsion) but now is fair game.
Feb 21
So Apple has gone and updated the iMessage protocol to incorporate both forward security (very good!) and post-quantum cryptography. security.apple.com/blog/imessage-…

This is a big deal because iMessage (which gets no real attention from anyone) is one of the most widely-adopted secure communications protocols in the world. At least 1 billion people use it, all over the world. It’s the only widely-available encrypted messaging app in China.
Dec 27, 2023
Article on some new research that finds ways to balance privacy and stalker detection for AirTags and other location trackers. This is a collaboration with my students @gabrie_beck, Harry Eldridge and colleagues Abhishek Jain and Nadia Heninger. wired.com/story/apple-ai… TL;DR thread.

When Apple launched their “Find My” system for lost devices in 2019, they designed a clever solution to keep bad actors (including Apple) from tracking users. This works by making devices change their broadcast identifier every 15 minutes. blog.cryptographyengineering.com/2019/06/05/how…
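The rotating-identifier idea can be sketched as follows (this is not Apple’s actual derivation, just the general pattern): the device derives each broadcast identifier from a device secret and the current 15-minute window, so a passive observer can’t link sightings across windows, while the owner, who holds the secret, can.

```python
import hmac
import hashlib

ROTATION_SECONDS = 15 * 60  # identifiers change every 15 minutes

def broadcast_id(device_secret: bytes, unix_time: int) -> bytes:
    """Derive the identifier broadcast during this time window."""
    window = unix_time // ROTATION_SECONDS
    # HMAC over the window index: stable within a window, unlinkable
    # across windows for anyone without the secret.
    return hmac.new(device_secret, window.to_bytes(8, "big"),
                    hashlib.sha256).digest()[:16]
```

The anti-stalking tension the research addresses is visible right in this sketch: the very unlinkability that protects the owner also makes an attacker’s planted tracker harder for a victim to recognize over time.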
Nov 12, 2023
I’m a sucker for crypto papers that do insane things like build ciphertexts out of garbled circuits, and then use the garbled circuit to do stuff that only shows up in the security reduction. E.g.: eprint.iacr.org/2023/1058

So what’s fun about this paper is that it’s trying to do something weirdly hard: build cryptosystems that allow you to encrypt (functions of) secret keys. This can be encrypting your own secret key, or e.g. I can encrypt your secret key and you can encrypt mine to form a “cycle”.
Oct 29, 2023
So Apple deployed an entire key transparency thing for iMessage and it literally seems to be documented in a blog post. What the heck is the point of key transparency if you don’t document things, and (critically) provide open-source ID verification tools?

Key transparency is about deterring attacks. But it doesn’t deter them if you keep it all secret, Apple!
Oct 13, 2023
Oh god: “Mathematician warns US spies may be weakening next-gen encryption.” 🙄 newscientist.com/article/239651…

For the record, whatever issues have come up in the PQC competition, this is absolutely not the right way to address them.
Sep 29, 2023
If anyone thought that the EU legislation on content scanning would be limited, you can forget about that. Europol has demanded unfiltered access to all data produced by these systems. balkaninsight.com/2023/09/29/eur…
To be clear what this means: these scanning systems may produce huge numbers of false positives. That means your private, encrypted messages get decrypted and handed over to the police *even if you haven’t sent anything illegal.*
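The false-positive problem is a base-rate effect, and some back-of-the-envelope arithmetic makes it vivid. Both numbers below are invented for illustration; the point is the scale, not the specific figures.

```python
# Illustrative base-rate arithmetic (both figures assumed, not sourced):
# even a scanner with a tiny per-message error rate, applied to an
# entire population's traffic, produces an enormous number of flags on
# innocent messages.
false_positive_rate = 0.001           # assumed: 0.1% of scans misfire
messages_per_day = 10_000_000_000     # assumed: EU-wide daily volume

false_flags = int(false_positive_rate * messages_per_day)
print(f"{false_flags:,} innocent messages flagged per day")
# -> 10,000,000 innocent messages flagged per day
```

Because illicit content is vastly rarer than innocent traffic, almost everything flagged at these scales is a false positive, which is exactly why unfiltered police access to the raw output is alarming.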
Sep 23, 2023
I wonder who exactly is paying for the ads and what their specific business interests are. Like if I was in the adtech or data brokerage industry, I’d sure love these ads. Encryption is bad! Apple is too private. Let’s pass some laws to “protect the children.”
Sep 19, 2023
New leak from the Snowden documents.

To give some context, here are the contents of an initial Snowden leak from September 2013. Cavium was a leading manufacturer of cryptographic co-processors for VPN devices at that time. archive.nytimes.com/www.nytimes.co…
Jul 27, 2023
I’m just catching up on Web Integrity but it looks really concerning. Basically it adds DRM to your browser so only approved browsers can access certain sites.

What worries me about this is that the web is currently one of the only open alternatives to the app stores. Closing it down (even if there are benefits) seems like it will make government control a lot easier.
Jul 12, 2023
Computer security would be about 80% solved if we just deprecated every technology shown in this graphic. Diagram from: