Telegram has launched a pretty intense campaign to malign Signal as insecure, with assistance from Elon Musk. The goal seems to be to get activists to switch away from encrypted Signal to mostly-unencrypted Telegram. I want to talk about this a bit. 1/
First things first, Signal Protocol, the cryptography behind Signal (also used in WhatsApp and several other messengers) is open source and has been intensively reviewed by cryptographers. When it comes to cryptography, this is pretty much the gold standard. 2/
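If you want a feel for what’s under the hood: the heart of Signal Protocol is a “ratchet” that derives a fresh key for every single message and then throws away the state that produced it, so stealing a device today doesn’t unlock yesterday’s traffic. Here’s a toy Python sketch of just that symmetric ratchet idea; the real protocol layers X3DH key agreement and Diffie-Hellman ratcheting on top, so treat this as illustration, not an implementation:

```python
# Toy symmetric-key ratchet, in the spirit of Signal's Double Ratchet.
# Each step derives a one-time message key and then advances the chain
# key, so past message keys can't be recomputed from current state.
import hashlib
import hmac

def kdf_step(chain_key: bytes) -> tuple[bytes, bytes]:
    # Separate HMAC labels for "next chain key" vs "message key".
    next_ck = hmac.new(chain_key, b"\x01", hashlib.sha256).digest()
    msg_key = hmac.new(chain_key, b"\x02", hashlib.sha256).digest()
    return next_ck, msg_key

ck = b"\x00" * 32  # stand-in for a shared secret (X3DH output in real life)
message_keys = []
for _ in range(3):
    ck, mk = kdf_step(ck)
    message_keys.append(mk)

# Three distinct keys; the current ck reveals none of the earlier ones.
assert len(set(message_keys)) == 3
```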
Telegram by contrast does not end-to-end encrypt conversations by default. Unless you manually start an encrypted “Secret Chat”, all of your data is visible on the Telegram server. Given who uses Telegram, this server is probably a magnet for intelligence services. 3/
Signal’s client code is also open source. You can download it right now and examine the code and crypto libraries. Even if you don’t want to do that, many experts have. This doesn’t mean there’s never going to be a bug: but it means lots of eyes. github.com/signalapp/Sign…
Pavel Durov, the CEO of Telegram, has recently been making a big conspiracy-tinged push to promote Telegram as more secure than Signal. This is like promoting ketchup as better for your car than synthetic motor oil. Telegram isn’t a secure messenger, full stop. That’s a choice Durov made.
When Telegram launched, they had terrible and insecure cryptography. Worse: it was only available if you manually turned it on for each chat. I assumed (naively) this was a growing pain and eventually they’d follow everyone else and add default end-to-end encryption. They didn’t.
I want to set that aside for a moment and briefly address a specific point Durov makes in his post. He claims that Signal doesn’t have reproducible builds and Telegram does. As I said, this is extremely silly because Telegram is unencrypted anyway, but it’s worth addressing.
One concern with open source code is that even if you review the open code, you don’t know that this code was used to build the app you download from the App Store. “Reproducible builds” let you build the code on your own computer and compare it to the downloaded code.
Signal has these for Android, and it’s a relatively simple process. Because Android is friendly to this. For various Apple-specific reasons this is shockingly hard to do on iOS. Mostly because apps are encrypted. (Apple should fix this.)
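The check itself is conceptually trivial, which is why it’s so valuable. Something along these lines (the file names are hypothetical; Signal’s actual Android instructions do a container build and an APK-aware comparison, since signing data makes a raw byte-compare fail):

```python
# Sketch of a reproducible-build check: hash what you built from source,
# hash what the store shipped you, and compare. Paths are hypothetical.
import hashlib
from pathlib import Path

def sha256_of(path: str) -> str:
    return hashlib.sha256(Path(path).read_bytes()).hexdigest()

local = sha256_of("build/outputs/apk/Signal-release.apk")    # built by you
shipped = sha256_of("downloads/Signal-from-play-store.apk")  # from device
print("reproducible" if local == shipped else "MISMATCH: investigate")
```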
I want to give Telegram credit, because they’ve tried to “hack” a solution for reproducible builds on iOS. But reading their instructions shows how bad the situation is: you need a jailbroken (old) iPhone, and at the end you still can’t verify the whole app, because some files stay encrypted. core.telegram.org/reproducible-b…
It’s not weird for a CEO to say “my product is better than your product.” But when the claim is about security, and critically *you’ve made a deliberate decision not to add security for most users*, then it exits the domain of competition and starts to feel like malice.
I don’t really care which messenger you use. I just want you to understand the stakes. If you use Telegram, we experts cannot even begin to guarantee that your communications are confidential. In fact at this point I assume they are not, even in Secret Chats mode.
You should do what you want with this information. Think about whether confidentiality matters to you. Think about where Telegram operates its servers and which government jurisdictions they work in. Decide if you care about this. Just don’t shoot your foot off because you’re uninformed.
Most of cryptography research is developing a really nice mental model for what’s possible and impossible in the field, so you can avoid wasting time on dead ends. But every now and then someone kicks down a door and blows up that intuition, which is the best kind of result.
One of the most surprising privacy results of the last 5 years is the LMW “doubly efficient PIR” paper. The basic idea is that I can load an item from a public database without the operator seeing which item I’m loading & without it having to touch every item in the DB each time.
Short background: Private Information Retrieval isn’t a new idea. It lets me load items from a (remote) public database without the operator learning what item I’m asking for. But traditionally there’s a *huge* performance hit for doing this.
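To see where that performance hit comes from, here’s a toy version of a classic two-server PIR scheme (not the LMW construction, and it assumes the two servers don’t collude). Notice that each server has to XOR together roughly half the database to answer a single query; before doubly efficient PIR, some linear scan like this was essentially unavoidable:

```python
# Toy 2-server information-theoretic PIR over a database of bits.
# Illustration only; real schemes (and doubly efficient PIR) differ a lot.
import secrets

def server_answer(db: list[int], query: set[int]) -> int:
    # The server XORs the bits at every queried index: linear-ish work.
    ans = 0
    for idx in query:
        ans ^= db[idx]
    return ans

def retrieve_bit(db1: list[int], db2: list[int], i: int) -> int:
    n = len(db1)
    s1 = {j for j in range(n) if secrets.randbits(1)}  # uniform random subset
    s2 = s1 ^ {i}  # differs from s1 only at position i
    # Each server sees a uniformly random subset, so neither learns i.
    return server_answer(db1, s1) ^ server_answer(db2, s2)

db = [1, 0, 1, 1, 0, 0, 1, 0]
assert all(retrieve_bit(db, db, i) == db[i] for i in range(len(db)))
```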
The revived Chat Control regulation is back. It still appears to demand client-side scanning in encrypted messengers, but it drops “detection of new CSAM” and demands detection of known CSAM only. However: it retains the option to change the requirement back later.
For those who haven’t been paying attention, the EU Council and Commission have been relentlessly pushing a regulation that would break encryption. It died last year, but it’s back again — this time with Hungary in the driver’s seat. And the timelines are short.
The goal is to require all apps to scan messages for child sexual abuse content (at first: other types of content have been proposed, and will probably be added later.) This is not possible for encrypted messengers without new technology that may break encryption.
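To make concrete what “scanning inside an encrypted messenger” means: the match has to happen on your device, before encryption. Here’s a minimal sketch of the known-content flavor, with a hypothetical blocklist; real proposals use perceptual hashes that survive resizing and re-encoding (exact hashes are defeated by flipping one byte), which is exactly where the false-positive worries come from:

```python
# Sketch of client-side scanning against a list of known-bad hashes.
# KNOWN_BAD_HASHES and the exact-hash choice are illustrative only.
import hashlib

KNOWN_BAD_HASHES: set[str] = set()  # distributed by some scanning authority

def matches_blocklist(media: bytes) -> bool:
    return hashlib.sha256(media).hexdigest() in KNOWN_BAD_HASHES

def send(media: bytes, encrypt, transmit) -> None:
    # The scan runs *before* end-to-end encryption is applied.
    if matches_blocklist(media):
        raise RuntimeError("blocklist match: message would be reported")
    transmit(encrypt(media))
```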
One of the things we need to discuss is that LLMs listening to your conversations and phone calls, and reading your texts and emails, are all going to be normalized and inevitable within seven years.
In a very short timespan it’s going to be expected that your phone can answer questions about what you did or talked about recently, what restaurants you went to. More capability is going to drive more data access, and people will grant it.
I absolutely do believe that (at least initially), vendors will try to do this privately. The models will live on your device or, like Apple Intelligence, they’ll use some kind of secure outsourcing. It’ll be required for adoption.
I hope that the arrest of Pavel Durov does not lead to him or Telegram being held up as some hero of privacy. Telegram has consistently acted to collect huge amounts of unnecessary private data on their servers, and their only measure to protect it was “trust us.”
For years people begged them to roll out even rudimentary default encryption, and they pretty aggressively did not do that. Their response was to move their data centers to various Middle Eastern countries, and to argue that this made your data safe. Somehow.
Over the years I’ve heard dozens of theories about which nation-states were gaining access to that giant mousetrap full of data they’d built. I have no idea if any of those theories were true. Maybe none were, maybe they all were.
The TL;DR here is that Telegram has an optional end-to-end encryption mode that you have to turn on manually, and it only works for individual conversations, not for group chats. It’s annoying enough to enable (and invisible enough to most users) that I doubt many people do.
This on paper isn’t that big a deal, but Telegram’s decision to market itself as a secure messenger means that loads of people (and policymakers) probably assume that lots of its content is end-to-end encrypted. Why wouldn’t you?
If you want to avoid disasters like the AT&T breach, there are basically only three solutions:
1. Don’t store data
2. Don’t store unencrypted data (sketched below)
3. Have security practices like Google
Very few companies can handle (3), certainly not AT&T.
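For what option (2) looks like in practice, here’s a minimal sketch using the Fernet recipe from the Python cryptography library. The hard part, key management, is waved away in one comment; done right, the keys live in an HSM or KMS, so a dump of the datastore alone is useless:

```python
# Sketch of "don't store unencrypted data": only ciphertext hits storage.
from cryptography.fernet import Fernet  # pip install cryptography

key = Fernet.generate_key()  # in practice: held in an HSM/KMS, not app memory
box = Fernet(key)

def store(db: dict, customer_id: str, record: bytes) -> None:
    db[customer_id] = box.encrypt(record)  # datastore sees ciphertext only

def load(db: dict, customer_id: str) -> bytes:
    return box.decrypt(db[customer_id])

db: dict = {}
store(db, "cust-42", b"call records, billing address, ...")
assert load(db, "cust-42") == b"call records, billing address, ..."
```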
One of the things policymakers refuse to understand is that securing large amounts of customer data, particularly data that needs to be “hot” and continually queried (e.g., by law enforcement), is just beyond the means of most US companies.
If you’re a policymaker and your policy requires a company X ∉ {Apple, Google, Microsoft, Meta} to store “hot” databases of customer data: congrats, it’s 1941 and you just anchored all the aircraft carriers at Pearl Harbor.