Matthew Green
May 12 • 13 tweets • 4 min read
Telegram has launched a pretty intense campaign to malign Signal as insecure, with assistance from Elon Musk. The goal seems to be to get activists to switch away from encrypted Signal to mostly-unencrypted Telegram. I want to talk about this a bit. 1/
First things first: Signal Protocol, the cryptography behind Signal (also used in WhatsApp and several other messengers), is open source and has been intensively reviewed by cryptographers. When it comes to cryptography, this is pretty much the gold standard. 2/
Telegram by contrast does not end-to-end encrypt conversations by default. Unless you manually start an encrypted “Secret Chat”, all of your data is visible on the Telegram server. Given who uses Telegram, this server is probably a magnet for intelligence services. 3/
Signal’s client code is also open source. You can download it right now and examine the code and crypto libraries. Even if you don’t want to do that, many experts have. This doesn’t mean there’s never going to be a bug: but it means lots of eyes.
github.com/signalapp/Sign…
Pavel Durov, the CEO of Telegram, has recently been making a big, conspiracy-laden push to promote Telegram as more secure than Signal. This is like promoting ketchup as better for your car than synthetic motor oil. Telegram isn’t a secure messenger, full stop. That’s a choice Durov made.
When Telegram launched, they had terrible and insecure cryptography. Worse: it was only available if you manually turned it on for each chat. I assumed (naively) this was a growing pain and eventually they’d follow everyone else and add default end-to-end encryption. They didn’t.
I want to set that aside and briefly address a specific point Durov makes in his post. He claims that Signal doesn’t have reproducible builds and Telegram does. As I said, this is extremely silly because Telegram is unencrypted anyway, but it’s worth addressing.
One concern with open source code is that even if you review the open code, you don’t know that this code was used to build the app you download from the App Store. “Reproducible builds” let you build the code on your own computer and compare it to the downloaded code.
Signal has these for Android, and it’s a relatively simple process, because Android is friendly to this. For various Apple-specific reasons it’s shockingly hard to do on iOS, mostly because apps are encrypted. (Apple should fix this.)
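To make the idea concrete, here is a minimal sketch (my own illustration in Python, not Signal’s actual tooling) of what a reproducible-build check boils down to on Android: build the app yourself, then compare your locally built APK against the one pulled from your device, file by file, ignoring the signing data that legitimately differs.

import hashlib
import sys
import zipfile

# APK signing data lives under META-INF/ and will always differ between a
# store-signed build and a local build, so it is excluded from the comparison.
IGNORED_PREFIXES = ("META-INF/",)

def digest_entries(apk_path):
    """Map each file inside the APK (a ZIP archive) to its SHA-256 digest."""
    digests = {}
    with zipfile.ZipFile(apk_path) as apk:
        for name in apk.namelist():
            if name.startswith(IGNORED_PREFIXES) or name.endswith("/"):
                continue
            digests[name] = hashlib.sha256(apk.read(name)).hexdigest()
    return digests

def apks_match(local_apk, store_apk):
    """Return True if both APKs contain byte-identical files (signatures aside)."""
    local, store = digest_entries(local_apk), digest_entries(store_apk)
    same = True
    for name in sorted(set(local) | set(store)):
        if local.get(name) != store.get(name):
            print("differs:", name)
            same = False
    return same

if __name__ == "__main__":
    # Usage (paths are hypothetical): python apk_compare.py my-local-build.apk pulled-from-device.apk
    print("match" if apks_match(sys.argv[1], sys.argv[2]) else "MISMATCH")

Real tooling also has to pin the toolchain and normalize things like timestamps and build paths, which is why reproducibility takes deliberate engineering rather than coming for free.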
I want to give Telegram credit because they’ve tried to “hack” a solution for repro builds on iOS. But reading it shows how bad it is: you need a jailbroken (old) iPhone. And at the end you still can’t verify the whole app. Some files stay encrypted. core.telegram.org/reproducible-b…

It’s not weird for a CEO to say “my product is better than your product.” But when the claim is about security and, critically, *you’ve made a deliberate decision not to add security for most users*, then it exits the domain of competition and starts to feel like malice.
I don’t really care which messenger you use. I just want you to understand the stakes. If you use Telegram, we experts cannot even begin to guarantee that your communications are confidential. In fact at this point I assume they are not, even in Secret Chats mode.
You should do what you want with this information. Think about whether confidentiality matters to you. Think about where Telegram operates its servers and which government jurisdictions they work under. Decide if you care about this. Just don’t shoot your foot off because you’re uninformed.

• • •

More from @matthew_d_green

May 9
Seems like we’re getting a major push for activists to switch from Signal to Telegram, which has no encryption by default and a pretty shady history of refusing to add it. Seems like a great idea, hope folks jump all over that.
Someone said “why the sarcasm”. Please don’t take my last sentence above seriously. Signal is an excellent and confidential platform. Telegram is not. Sometimes it’s worth using a non-confidential platform to reach lots of people (see Twitter) but it’s not a replacement.
As for Telegram, during the early days of their run I thought they were just being stubborn and eventually they’d deploy good default encryption. It’s been years and they’ve made it very clear that they never will. For a messenger that advertises privacy, that’s strange.
May 7
We’re pretty rapidly and consciously heading towards a future where everything you do on the Internet requires government ID, with basically no attention paid to the consequences of that (indeed, the consequences of that may be the whole point.)
I’ve become a little bit despairing that we can fight this. The pressure on all sides seems much too intense. But we also have very little tech in place to make this safe, and realistically the only people who can develop it work in Cupertino and Mountain View.
So what does a future involving age verification look like? As a first step it’s going to involve installing government ID on your phone. The ID will be tied to your biometrics (face). Apple is already deploying something like this, but it can’t be used for web browsing yet.
May 2
Europe is maybe two months from passing laws that end private communication as we know it, and folks are looking the other way (understandably.) You’re not going to get a do-over once these laws are passed.
The plan, to repeat, is to mandate that every phone contains software that receives a list of illicit material (photos, keywords, AI models that can determine the sentiment of conversations) and scans your data for matches *before* it is encrypted, and alerts the police directly.
This will initially be used to target CSAM (child sexual abuse material) but it will also target conversations that contain “grooming behavior”, which clearly involves some kind of AI recognition of content. Once these systems are in your phone, of course, this can be expanded.
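To show the shape of what is being proposed, here is a deliberately simplified sketch, my own illustration rather than any real implementation. Actual proposals involve perceptual hashes and classifiers rather than exact hashes, and every name below is hypothetical; the point is only where the scan sits relative to encryption.

import hashlib

# Hypothetical blocklist of content hashes pushed down to the device.
# Real proposals use perceptual hashes or ML classifiers, not exact SHA-256.
BLOCKLIST = {hashlib.sha256(b"example flagged file").hexdigest()}

def matches_blocklist(content):
    """Hash the outgoing content and check it against the pushed list."""
    return hashlib.sha256(content).hexdigest() in BLOCKLIST

def send_message(content):
    # The scan runs on-device, *before* end-to-end encryption,
    # and a match triggers a report to some authority.
    if matches_blocklist(content):
        print("match: this message would be reported")  # stand-in for a reporting hook
    ciphertext = content[::-1]  # stand-in for the real E2E encryption step
    print("delivering", len(ciphertext), "encrypted bytes")

if __name__ == "__main__":
    send_message(b"hello")                 # no match, delivered normally
    send_message(b"example flagged file")  # match, reported before encryption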
Mar 31
This thing Facebook did — running an MITM on Snapchat and other competitors’ TLS connections via their Onavo VPN — is so deeply messed up and evil that it completely changes my perspective on what that company is willing to do to its users.
I don’t come from a place of deep trust in big tech corporations. But this stuff seems like it crosses a pretty clear red line, maybe even a criminal one.
I would say: I’d like to see some very performative firings before I trust Meta again, but let’s be honest. This almost certainly went right to the top. Nobody is going to do something this unethical unless they know management has their back 100%.
Mar 12
Google has a blog up discussing their threat modeling when deploying “post-quantum” (PQC) cryptographic algorithms. It’s an interesting read. bughunters.google.com/blog/510874798…
To elaborate a bit on what’s in the blog post, we know that quantum algorithms exist, in principle, that can break many of the cryptographic algorithms we routinely use. All we’re waiting for now is a capable enough quantum computer to run them. (And this seems hard.) 1/
But technology development isn’t linear. Sometimes problems seem impossible until a big breakthrough changes everything. Think about the development of classical computers before and after semiconductors. The same could happen with QC. 2/
Mar 5
A thing I worry about in the (academic) privacy field is that our work isn’t really improving privacy globally. If anything it would be more accurate to say we’re finding ways to encourage the collection and synthesis of more data, by applying a thin veneer of local “privacy.”
I’m referring to the rise of “private” federated machine learning and model-building work, where the end result is to give corporations new ways to build models from confidential user data. This data was previously inaccessible (by law or customer revulsion) but now is fair game.
A typical pitch here is that, by applying techniques like Differential Privacy, we can keep any individual user’s data “out of the model.” The claim: the use of your private data is harmless, since the model “based on your data” will be statistically close to one without it.
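To ground that claim, here is a minimal sketch of the basic differential privacy mechanism (my own illustration, not any specific system from this literature): answer an aggregate query with Laplace noise calibrated to the query’s sensitivity and a privacy budget epsilon, so the answer is statistically close with or without any one user’s value.

import math
import random

def laplace_noise(scale):
    """Sample Laplace(0, scale) noise via the inverse-CDF method."""
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def private_mean(values, lower, upper, epsilon):
    """Differentially private mean of bounded per-user values.

    Each user contributes one value clipped to [lower, upper], so changing
    one user's value shifts the sum by at most (upper - lower); that is the
    sensitivity the noise is calibrated to. Smaller epsilon means more noise
    and a stronger privacy guarantee.
    """
    clipped = [min(max(v, lower), upper) for v in values]
    noisy_sum = sum(clipped) + laplace_noise((upper - lower) / epsilon)
    return noisy_sum / len(values)

if __name__ == "__main__":
    ages = [23, 35, 41, 29, 52, 38, 27]
    print(private_mean(ages, lower=18, upper=90, epsilon=0.5))

The worry in this thread isn’t that this math is wrong; it’s that wrapping ever more data collection in this kind of guarantee may not add up to more privacy overall.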
