Matthew Green
I teach cryptography at Johns Hopkins. Mastodon at matthew_d_green@ioc.exchange and BlueSky at https://t.co/GI4QlxYTdk. Bluecheck not my decision ;)
Sep 19 14 tweets 3 min read
Most of cryptography research is developing a really nice mental model for what’s possible and impossible in the field, so you can avoid wasting time on dead ends. But every now and then someone kicks down a door and blows up that intuition, which is the best kind of result. One of the most surprising privacy results of the last 5 years is the LMW “doubly efficient PIR” paper. The basic idea is that I can load an item from a public database without the operator seeing which item I’m loading & without it having to touch every item in the DB each time.
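To see why "doubly efficient" PIR broke intuition, it helps to look at what classic PIR looks like. Below is a toy sketch (my illustration, not from the LMW paper) of the textbook two-server XOR scheme: the client sends each of two non-colluding servers a random-looking subset of indices, and each server must still touch roughly half the database per query. LMW's surprise is achieving sublinear per-query work with a single server after preprocessing.

```python
import secrets

def xor_bytes(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def server_answer(db: list[bytes], subset: set[int]) -> bytes:
    # Each (non-colluding) server XORs together the requested records.
    acc = bytes(len(db[0]))
    for i in subset:
        acc = xor_bytes(acc, db[i])
    return acc

def pir_query(db: list[bytes], index: int) -> bytes:
    # Client picks a random subset S for server 1 and S xor {index} for
    # server 2. Each subset on its own is uniformly random, so neither
    # server learns which record the client wants.
    n = len(db)
    s1 = {i for i in range(n) if secrets.randbits(1)}
    s2 = s1 ^ {index}          # symmetric difference flips membership of `index`
    a1 = server_answer(db, s1)
    a2 = server_answer(db, s2)
    return xor_bytes(a1, a2)   # every record except `index` cancels out

db = [b"rec0", b"rec1", b"rec2", b"rec3"]
assert pir_query(db, 2) == b"rec2"
```

Note how each query forces the servers to read a constant fraction of the database; avoiding that linear scan, with one server, is exactly the door LMW kicked down.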
Sep 12 4 tweets 2 min read
The new and revived Chat Control regulation is back. It still appears to demand client side scanning in encrypted messengers. But removes “detection of new CSAM” and simply demands detection of known CSAM. However: it retains the option to change this requirement back. For those who haven’t been paying attention, the EU Council and Commission have been relentlessly pushing a regulation that would break encryption. It died last year, but it’s back again — this time with Hungary in the driver’s seat. And the timelines are short.
Sep 10 6 tweets 2 min read
One of the things we need to discuss is that LLMs listening to your conversations and phone calls, reading your texts and emails — this is all going to be normalized and inevitable within seven years. In a very short timespan it’s going to be expected that your phone can answer questions about what you did or talked about recently, what restaurants you went to. More capability is going to drive more data access, and people will grant it.
Aug 26 6 tweets 1 min read
I hope that the arrest of Pavel Durov does not lead to him or Telegram being held up as some hero of privacy. Telegram has consistently acted to collect huge amounts of unnecessary private data on their servers, and their only measure to protect it was “trust us.” For years people begged them to roll out even rudimentary default encryption, and they pretty aggressively did not do that. Their response was to move their data centers to various Middle Eastern countries, and to argue that this made your data safe. Somehow.
Aug 25 5 tweets 1 min read
Apropos Pavel Durov’s arrest, I wrote a short post about whether Telegram is an “encrypted messaging app”. blog.cryptographyengineering.com/2024/08/25/tel… The TL;DR here is that Telegram has an optional end-to-end encryption mode that you have to turn on manually. It only works for individual conversations, not for group chats. It is sufficiently annoying to turn on (and invisible to most users) that I doubt many people do.
Jul 13 5 tweets 1 min read
If you want to avoid disasters like the AT&T breach, there are basically only three solutions:

1. Don’t store data
2. Don’t store unencrypted data
3. Have security practices like Google

Very few companies can handle (3), certainly not AT&T. One of the things policymakers refuse to understand is that securing large amounts of customer data, particularly data that needs to be “hot” and continually queried (e.g., by law enforcement), is just beyond the means of most US companies.
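Option (2) is more attainable than it sounds. One common version is pseudonymization: store a keyed hash of each identifier instead of the raw value, with the key held outside the database (a minimal sketch, with a hypothetical key name; in practice the key would live in a KMS or HSM, never beside the data).

```python
import hmac
import hashlib

# Hypothetical key for illustration: in a real deployment this secret
# lives in a KMS/HSM, not in source code or in the same database.
PSEUDONYM_KEY = b"example-key-stored-elsewhere"

def pseudonymize(identifier: str) -> str:
    # Store this token instead of the raw identifier. A breach of the
    # database alone then leaks opaque tokens, not phone numbers.
    return hmac.new(PSEUDONYM_KEY, identifier.encode(), hashlib.sha256).hexdigest()

stored = pseudonymize("+1-410-555-0123")
# Exact-match lookups and joins still work: re-derive the token at query time.
assert pseudonymize("+1-410-555-0123") == stored
```

This doesn't help if the data must stay queryable in the clear, which is exactly the "hot data" problem above.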
Jul 9 4 tweets 1 min read
I remember a few years back when I suggested people stop using Chrome because it had clearly decided to privilege Google properties with additional access. Now this has become obvious and accelerated. The fact that Google is doing this right in the face of US and EU anti-monopoly efforts either means they’ve made a very sophisticated calculation, or they’re headed towards a major reckoning.
Jul 5 5 tweets 1 min read
I’ve been watching some early 90s movies recently, and being reminded of the privacy expectations we all used to take for granted in a world that basically worked fine. People paying in cash; using telephones; having important conversations in person. The idea that a corporation might track you routinely (even if just to push you ads) was barely on the radar. The idea that we needed to add that feature to keep us safe, that was laughable. The past is like a foreign country.
Jul 3 12 tweets 3 min read
I really do think context is important here. Some of these age verification laws are based on good-faith concerns. But a lot of them are really designed to censor big chunks of the Internet, making them less accessible to both kids and adults. If you’re thinking that some kind of privacy-preserving age verification system is the answer, that’s great! But you need to make sure your goals (easy access for adults, real privacy, no risk of credentials being stolen) actually overlap with the legislators’ goals.
Jun 21 7 tweets 2 min read
I want to agree with the idea that mass scanning “breaks encryption” but I think the entire question is a category error. Any law that installs surveillance software directly on your phone isn’t “breaking” or “not breaking” encryption, it’s doing exactly what it promises to do. For decades we (in the west) had no mass surveillance of any communications. Starting in the 2010s some folks came up with the idea of scanning for illicit content like CSAM uploaded in plaintext on servers. (With apparently relatively little effect on the overall problem.)
Jun 10 22 tweets 6 min read
So Apple has introduced a new system called “Private Cloud Compute” that allows your phone to offload complex (typically AI) tasks to specialized secure devices in the cloud. I’m still trying to work out what I think about this. So here’s a thread. 1/ Apple, unlike most other mobile providers, has traditionally done a lot of processing on-device. For example, all of the machine learning and OCR text recognition on Photos is done right on your device. 2/
May 28 15 tweets 4 min read
Some folks are discussing what it means to be a “secure encrypted messaging app.” I think a lot of this discussion is shallow and in bad faith, but let’s talk about it a bit. Here’s a thread. 1/ First: the most critical element that (good) secure messengers protect is the content of your conversations in flight. This is usually done with end-to-end encryption. Messengers like Signal, WhatsApp, Matrix etc. encrypt this data using keys that only the end-devices know. 2/
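The phrase "keys that only the end-devices know" has a concrete meaning: the two devices run a key agreement so that only public values ever cross the wire. Here's a toy Diffie-Hellman sketch (textbook-sized parameters, chosen by me for illustration; real messengers use X25519 plus the Double Ratchet, not anything like this).

```python
import hashlib
import secrets

# Toy group (p=23, g=5), far too small to be secure; purely illustrative.
P, G = 23, 5

def keypair() -> tuple[int, int]:
    priv = secrets.randbelow(P - 2) + 1
    return priv, pow(G, priv, P)

alice_priv, alice_pub = keypair()
bob_priv, bob_pub = keypair()

# Only alice_pub and bob_pub transit the server. Each side combines its
# own private key with the other's public key to reach the same secret.
alice_shared = pow(bob_pub, alice_priv, P)
bob_shared = pow(alice_pub, bob_priv, P)
assert alice_shared == bob_shared

# Hash the shared secret into a symmetric key the server never learns.
message_key = hashlib.sha256(str(alice_shared).encode()).digest()
```

The server relaying the messages sees only public values and ciphertext, which is the whole point of "end-to-end."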
May 23 11 tweets 4 min read
Several people have suggested that the EU’s mandatory chat scanning proposal was dead. In fact it seems that Belgium has resurrected it in a “compromise” and many EU member states are positive. There’s a real chance this becomes law. dropbox.com/scl/fi/9w611f2…


The basic idea of this proposal is to scan private (and encrypted) messages for child sexual abuse material. The current draft covers only images and videos. Previous versions also included text and audio, but the new proposal has for the moment set that aside, because it was too creepy.
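Mechanically, "detection of known CSAM" means comparing each attachment against a list of hashes of previously identified material. A minimal sketch of that matching flow (my illustration: real systems use perceptual hashes such as PhotoDNA so matches survive resizing and re-encoding, not the exact cryptographic hash used here):

```python
import hashlib

# Hypothetical blocklist of known-content hashes, for illustration only.
# Deployed scanners use perceptual hashes, which match near-duplicates;
# an exact SHA-256 is used here just to show the flow.
KNOWN_HASHES = {hashlib.sha256(b"known-bad-example").hexdigest()}

def scan(attachment: bytes) -> bool:
    # Return True if the attachment matches the known-content list.
    return hashlib.sha256(attachment).hexdigest() in KNOWN_HASHES

assert scan(b"known-bad-example") is True
assert scan(b"holiday-photo") is False
```

Under the proposal this check would run on your device, on the plaintext, before encryption, which is why "only known content" does little to change the surveillance architecture involved.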
May 12 13 tweets 4 min read
Telegram has launched a pretty intense campaign to malign Signal as insecure, with assistance from Elon Musk. The goal seems to be to get activists to switch away from encrypted Signal to mostly-unencrypted Telegram. I want to talk about this a bit. 1/ First things first, Signal Protocol, the cryptography behind Signal (also used in WhatsApp and several other messengers) is open source and has been intensively reviewed by cryptographers. When it comes to cryptography, this is pretty much the gold standard. 2/
May 9 4 tweets 1 min read
Seems like we’re getting a major push for activists to switch from Signal to Telegram, which has no encryption by default and a pretty shady history of refusing to add it. Seems like a great idea, hope folks jump all over that. Someone said “why the sarcasm”. Please don’t take my last sentence above seriously. Signal is an excellent and confidential platform. Telegram is not. Sometimes it’s worth using a non-confidential platform to reach lots of people (see Twitter) but it’s not a replacement.
May 7 7 tweets 2 min read
We’re pretty rapidly and consciously heading towards a future where everything you do on the Internet requires government ID, with basically no attention paid to the consequences of that (indeed, the consequences of that may be the whole point.) I’ve become a little bit despairing that we can fight this. The pressure on all sides seems much too intense. But we also have very little tech in place to make this safe: and realistically the only people who can develop it work in Cupertino and Mountain View.
May 2 8 tweets 2 min read
Europe is maybe two months from passing laws that end private communication as we know it, and folks are looking the other way (understandably.) You’re not going to get a do-over once these laws are passed. The plan, to repeat, is to mandate that every phone contains software that receives a list of illicit material (photos, keywords, AI models that can determine the sentiment of conversations) and scans your data for matches *before* it is encrypted, and alerts the police directly.
Mar 31 6 tweets 1 min read
This thing Facebook did — running an MITM on Snapchat and other competitors’ TLS connections via their Onavo VPN — is so deeply messed up and evil that it completely changes my perspective on what that company is willing to do to its users. I don’t come from a place of deep trust in big tech corporations. But this stuff seems like it crosses a pretty clear red line, maybe even a criminal one.
Mar 12 12 tweets 3 min read
Google has a blog up discussing their threat modeling when deploying “post-quantum” (PQC) cryptographic algorithms. It’s an interesting read. bughunters.google.com/blog/510874798… To elaborate a bit on what’s in the blog post, we know that quantum algorithms exist, in principle, that can break many of the cryptographic algorithms we routinely use. All we’re waiting for now is a capable enough quantum computer to run them. (And this seems hard.) 1/
Mar 5 11 tweets 2 min read
A thing I worry about in the (academic) privacy field is that our work isn’t really improving privacy globally. If anything it would be more accurate to say we’re finding ways to encourage the collection and synthesis of more data, by applying a thin veneer of local “privacy.” I’m referring to the rise of “private” federated machine learning and model-building work, where the end result is to give corporations new ways to build models from confidential user data. This data was previously inaccessible (by law or customer revulsion) but now is fair game.
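The "thin veneer of local privacy" usually means mechanisms like local differential privacy. The classic building block is randomized response, sketched below (my example): each individual report is plausibly deniable, yet the collector can still de-bias the aggregate, which is precisely what makes previously off-limits data collectable.

```python
import secrets

def randomized_response(truth: bool) -> bool:
    # Flip a coin: heads, answer honestly; tails, answer at random.
    # Any single report is deniable, but aggregates remain estimable.
    if secrets.randbits(1):
        return truth
    return bool(secrets.randbits(1))

# The collector de-biases: observed_rate = 0.5 * true_rate + 0.25
reports = [randomized_response(True) for _ in range(10_000)]
observed = sum(reports) / len(reports)
estimated_true_rate = (observed - 0.25) / 0.5
```

Each user gets deniability; the corporation still gets its population-level model, built from data it previously couldn't touch.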
Feb 21 18 tweets 4 min read
So Apple has gone and updated the iMessage protocol to incorporate both forward security (very good!) and post-quantum cryptography. security.apple.com/blog/imessage-… This is a big deal because iMessage (which gets no real attention from anyone) is one of the most widely-adopted secure communications protocols in the world. At least 1 billion people use it, all over the world. It’s the only widely-available encrypted messaging app in China.
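"Forward security" has a simple core idea: keys ratchet forward through a one-way function and old keys are deleted, so compromising today's state can't decrypt yesterday's messages. A toy hash-ratchet sketch (my illustration; iMessage's PQ3 uses a far more elaborate ratchet combining elliptic-curve and post-quantum key agreement):

```python
import hashlib

def ratchet(chain_key: bytes) -> tuple[bytes, bytes]:
    # Derive a one-time message key, then advance the chain key and
    # delete the old one. SHA-256 can't be run backwards, so a stolen
    # current state reveals nothing about earlier message keys.
    message_key = hashlib.sha256(chain_key + b"msg").digest()
    next_chain = hashlib.sha256(chain_key + b"chain").digest()
    return message_key, next_chain

ck = b"initial-shared-secret"
k1, ck = ratchet(ck)  # key for message 1; old chain key is discarded
k2, ck = ratchet(ck)  # key for message 2
assert k1 != k2
```

The post-quantum part addresses the separate "harvest now, decrypt later" threat: ciphertext recorded today shouldn't become readable once quantum computers arrive.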