Matthew Green
May 28 · 15 tweets
Some folks are discussing what it means to be a “secure encrypted messaging app.” I think a lot of this discussion is shallow and in bad faith, but let’s talk about it a bit. Here’s a thread. 1/
First: the most critical element that (good) secure messengers protect is the content of your conversations in flight. This is usually done with end-to-end encryption. Messengers like Signal, WhatsApp, Matrix etc. encrypt this data using keys that only the end-devices know. 2/
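To make "keys that only the end-devices know" concrete, here is a minimal sketch of the idea using PyNaCl (Python bindings for libsodium). This is not Signal's or WhatsApp's actual protocol, which layers X3DH key agreement and the Double Ratchet on top; the names and the toy message are purely illustrative. The point is only that the relay server handles nothing but ciphertext.

```python
# Minimal E2EE sketch with PyNaCl (pip install pynacl). Illustrative only:
# real messengers add key agreement, ratcheting, and key authentication.
from nacl.public import PrivateKey, Box

# Each device generates its own keypair; the private key never leaves it.
alice_sk = PrivateKey.generate()
bob_sk = PrivateKey.generate()

# Alice encrypts to Bob's public key. Anything the server relays or stores
# is this ciphertext, which it cannot read.
ciphertext = Box(alice_sk, bob_sk.public_key).encrypt(b"meet at noon")

# Bob decrypts with his private key plus Alice's public key.
plaintext = Box(bob_sk, alice_sk.public_key).decrypt(ciphertext)
assert plaintext == b"meet at noon"
```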
Encrypting the content of your conversations, preferably by default, is “table stakes.” It isn’t perfect, but it’s required for a messenger even to flirt with the word “secure.” But security and privacy are hard, deep problems. Solving encrypted messaging is just the start. 3/
There are lots of threats that still exist even if you add end-to-end encryption to messaging. One is: does your phone back up message content to the cloud? But another much harder one is: what about metadata? I.e., what about the details of *who* you communicate with and when? 4/
E2EE cloud backup is incredibly, back-breakingly hard. It involves storing keys somewhere that the cloud provider can’t access, even in the event where you lose your phone and forget your passwords. But services have come up with solutions. E.g.: blog.cryptographyengineering.com/2022/12/07/app… 5/
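As a rough illustration of the backup problem (not the Apple design described in the linked post, which adds hardware security modules, recovery contacts, and rate-limited guessing): derive the backup key from something only the user knows, encrypt on-device, and upload only the ciphertext plus a salt. The passphrase and data below are made up.

```python
# Toy E2EE-backup sketch with PyNaCl: the provider stores only salt + ciphertext.
from nacl import pwhash, secret, utils

passphrase = b"correct horse battery staple"      # known only to the user
salt = utils.random(pwhash.argon2id.SALTBYTES)    # random, stored with the backup

# Argon2id stretches the passphrase into a 32-byte symmetric key, on-device.
key = pwhash.argon2id.kdf(secret.SecretBox.KEY_SIZE, passphrase, salt,
                          opslimit=pwhash.argon2id.OPSLIMIT_MODERATE,
                          memlimit=pwhash.argon2id.MEMLIMIT_MODERATE)

backup_blob = secret.SecretBox(key).encrypt(b"<message history goes here>")

# Upload (salt, backup_blob). Without the passphrase the provider cannot
# derive `key`, so it cannot read the backup. A restoring device re-derives
# the same key from passphrase + salt. The genuinely hard part, recovery when
# the user forgets the passphrase, is exactly what this sketch ignores.
assert secret.SecretBox(key).decrypt(backup_blob) == b"<message history goes here>"
```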
But if cloud backup is hard, it’s literally *nothing* compared to metadata. Metadata is the hardest thing in the world. That’s because encryption does very little to help you: your messages (encrypted or not) need to be delivered. The servers that do this have to know to whom. 6/
Metadata is so hard that it really matters how much you trust the intentions and promises of your service provider. For example: WhatsApp is a Meta company, and they’re open about the fact that they use social graphs to perform advertising. That’s how they make money. 7/
I appreciate that WA is open about this and I trust them generally not to sell my data to criminals, but I also don’t like it. That’s why I don’t use WhatsApp as my primary messenger even if I strongly believe that their (content) encryption is very good. 8/
But you should be very wary of anyone who tells you they don’t do anything with metadata unless you either (1) trust their technical protections or (2) trust them a lot organizationally. And the technical side is very challenging. Just incredibly difficult. 9/
There are a bunch of separate issues, and a full discussion is messy enough to require a different medium. They include:

* Contact discovery: how to find your contacts without giving away your social graph (see the sketch just after this list)
* Registration: can you sign up with a pseudonymous account, or do you need an identifier? (With enormous tradeoffs for spam.)
* Sender anonymity: can you send without revealing who you are?
* IP address anonymity: Ugh. Mostly this requires a VPN or Tor.
* Timing attacks and sophisticated adversaries: see the attached diagram. [Image: timing-attack diagram from James Mickens’ article, linked at the end of the thread.]
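Here is the sketch promised above: the *naive* approach to contact discovery, shown to illustrate why it is not enough. Hashing phone numbers before sending them to the server sounds private, but phone numbers have so little entropy that the server can brute-force the hashes and reconstruct your address book. That weakness is what pushes real designs toward hardware enclaves or private-set-intersection protocols. The numbers below are made up.

```python
# Naive hashed contact discovery, shown to illustrate the leak, not as a fix.
import hashlib

def hash_contact(phone_number: str) -> str:
    return hashlib.sha256(phone_number.encode()).hexdigest()

# Client: hash the address book and send the hashes to the server.
my_contacts = ["+15551230001", "+15551230002"]
query = [hash_contact(n) for n in my_contacts]

# Server: compare against hashes of registered users...
registered = {hash_contact("+15551230002")}
matches = [h for h in query if h in registered]

# ...but nothing stops the server from hashing every plausible phone number
# (roughly 10^10 for US numbers) and inverting `query`, i.e. recovering the
# client's social graph. Hence enclaves / PSI instead of bare hashing.
```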
These are all incredibly difficult problems and folks are working on solving them. Signal uses trusted enclaves to perform contact discovery, and has a “sealed sender” to hide sender IDs. Other services allow you to sign up with pseudonyms. You should pick what works for you.
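A rough sketch of the sealed-sender *idea* (not Signal's actual construction, which uses sender certificates and more machinery): the outer envelope is encrypted to the recipient with no sender identifier on it, so the delivery server learns only who the message is for, while the sender's identity rides inside where only the recipient can read it.

```python
# Sealed-sender sketch with PyNaCl's anonymous SealedBox. Illustrative only;
# in a real system the inner payload would itself be an E2EE ciphertext.
import json
from nacl.public import PrivateKey, SealedBox

bob_sk = PrivateKey.generate()

# Sender side: identity lives inside the sealed envelope, not on the outside.
inner = json.dumps({"from": "alice", "body": "hi"}).encode()
envelope = SealedBox(bob_sk.public_key).encrypt(inner)

# The server sees only (recipient=bob, envelope) and routes it blindly.
# Recipient side: open the envelope and learn the sender.
opened = json.loads(SealedBox(bob_sk).decrypt(envelope))
assert opened["from"] == "alice"
```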
What you should not do, and what I see a lot of people doing, is panic about the fact that messaging services have access to metadata and/or even *use* that metadata, and switch to something that is unencrypted and arguably worse.
If you’re a technical expert, you should try to explain to others what the tradeoffs look like. Some people are better off using a service like WhatsApp because their contacts are there. Others are better off using tiny bespoke encrypted messengers with anonymity features.
What you should not do is indulge the “aha, gotcha” crowd that is running around trying to convince people that all popular messengers are “backdoored,” because that message leads to panic and to people using insecure systems. //
PS the diagram in the middle of this thread is from a great article by James Mickens. usenix.org/system/files/1…

More from @matthew_d_green

Sep 19
Most of cryptography research is developing a really nice mental model for what’s possible and impossible in the field, so you can avoid wasting time on dead ends. But every now and then someone kicks down a door and blows up that intuition, which is the best kind of result.
One of the most surprising privacy results of the last 5 years is the LMW “doubly efficient PIR” paper. The basic idea is that I can load an item from a public database without the operator seeing which item I’m loading & without it having to touch every item in the DB each time.
Short background: Private Information Retrieval isn’t a new idea. It lets me load items from a (remote) public database without the operator learning what item I’m asking for. But traditionally there’s a *huge* performance hit for doing this.
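For intuition about where that performance hit comes from, here is a toy version of the classic two-server, information-theoretic PIR scheme (not the LMW construction): each server XORs together a large random subset of the database, so it learns nothing about the queried index (assuming the two servers don't collude), but it must touch a linear number of records on every query. Removing that per-query linear scan, with a single untrusted server, is what makes the "doubly efficient" result so surprising.

```python
# Toy 2-server PIR (the classic scheme, for intuition only; not LMW).
import secrets
from functools import reduce

DB = [bytes([i]) * 4 for i in range(16)]   # a tiny database of 4-byte records
want = 9                                   # the index the client wants to read

def xor_bytes(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

# Client: a uniformly random index subset goes to server 1, and the same
# subset with `want` toggled goes to server 2. Each subset alone reveals nothing.
subset1 = {i for i in range(len(DB)) if secrets.randbits(1)}
subset2 = subset1 ^ {want}

def server_answer(subset):
    # Each server must scan the records it was asked about; this linear work
    # per query is the classical cost that doubly efficient PIR removes.
    return reduce(xor_bytes, (DB[i] for i in subset), b"\x00" * 4)

# Client XORs the two answers: everything cancels except record `want`.
recovered = xor_bytes(server_answer(subset1), server_answer(subset2))
assert recovered == DB[want]
```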
Sep 12
The new and revived Chat Control regulation is back. It still appears to demand client-side scanning in encrypted messengers, but it removes “detection of new CSAM” and simply demands detection of known CSAM. However: it retains the option to change this requirement back.
For those who haven’t been paying attention, the EU Council and Commission have been relentlessly pushing a regulation that would break encryption. It died last year, but it’s back again — this time with Hungary in the driver’s seat. And the timelines are short.
The goal is to require all apps to scan messages for child sexual abuse content (at first: other types of content have been proposed, and will probably be added later.) This is not possible for encrypted messengers without new technology that may break encryption.
Sep 10
One of the things we need to discuss is that LLMs listening to your conversations and phone calls, reading your texts and emails — this is all going to be normalized and inevitable within seven years.
In a very short timespan it’s going to be expected that your phone can answer questions about what you did or talked about recently, what restaurants you went to. More capability is going to drive more data access, and people will grant it.
I absolutely do believe that (at least initially), vendors will try to do this privately. The models will live on your device or, like Apple Intelligence, they’ll use some kind of secure outsourcing. It’ll be required for adoption.
Aug 26
I hope that the arrest of Pavel Durov does not lead to him or Telegram being held up as some hero of privacy. Telegram has consistently acted to collect huge amounts of unnecessary private data on their servers, and their only measure to protect it was “trust us.”
For years people begged them to roll out even rudimentary default encryption, and they pretty aggressively did not do that. Their response was to move their data centers to various Middle Eastern countries, and to argue that this made your data safe. Somehow.
Over the years I’ve heard dozens of theories about which nation-states were gaining access to that giant mousetrap full of data they’d built. I have no idea if any of those theories were true. Maybe none were, maybe they all were.
Aug 25
Apropos Pavel Durov’s arrest, I wrote a short post about whether Telegram is an “encrypted messaging app”. blog.cryptographyengineering.com/2024/08/25/tel…
The TL;DR here is that Telegram has an optional end-to-end encryption mode that you have to turn on manually, and it only works for individual conversations, not for group chats. It is annoying enough to enable (and invisible enough to most users) that I doubt many people do.
This on paper isn’t that big a deal, but Telegram’s decision to market itself as a secure messenger means that loads of people (and policymakers) probably assume that lots of its content is end-to-end encrypted. Why wouldn’t you?
Jul 13
If you want to avoid disasters like the AT&T breach, there are basically only three solutions:

1. Don’t store data
2. Don’t store unencrypted data (sketched below)
3. Have security practices like Google

Very few companies can handle (3), certainly not AT&T.
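A minimal sketch of option 2 above, with invented field names: encrypt each record under a key that lives outside the database (in practice a KMS or HSM), so that a raw dump of the database is useless on its own. The catch, which is the point of the next tweet, is that data that must stay "hot" needs that key online somewhere.

```python
# Encryption-at-rest sketch with PyNaCl; the record format here is made up.
from nacl.secret import SecretBox
from nacl.utils import random as random_bytes

record_key = random_bytes(SecretBox.KEY_SIZE)   # held in a KMS/HSM, not in the DB

def store(row: bytes) -> bytes:
    """What actually gets written to disk / handed to the database."""
    return SecretBox(record_key).encrypt(row)

def load(blob: bytes) -> bytes:
    """Decryption requires the externally held key."""
    return SecretBox(record_key).decrypt(blob)

encrypted_row = store(b"customer=jane;call_log=...")
assert load(encrypted_row) == b"customer=jane;call_log=..."
# A leaked copy of `encrypted_row` (e.g. a stolen database dump) reveals
# nothing without `record_key`.
```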
One of the things policymakers refuse to understand is that securing large amounts of customer data, particularly data that needs to be “hot” and continually queried (e.g., by law enforcement), is just beyond the means of most US companies.
If you’re a policymaker and your policy requires company X ∉ {Apple, Google, Microsoft, Meta}* to store “hot” databases of customer data: congrats, it’s 1941 and you just anchored all the aircraft carriers at Pearl Harbor.

* Frankly I’m being generous with this list.