This is an amazing paper. It implies (with strong statistical evidence) that the design of a major mobile-data encryption algorithm — used in GPRS data — was deliberately backdoored by its designer. eprint.iacr.org/2021/819
The GPRS standards were extensions to the GSM (2G/3G) mobile standard that allowed phones to use data over cellular networks. This was before LTE. For security, the standards included encryption to provide over-the-air security for your data. 2/
As is “normal” for telephony standards, the encryption was provided by two custom ciphers: GEA-1 and GEA-2. While there were strong export control regulations in place for crypto, there’s little overt indication that either of these ciphers was deliberately weakened. 3/
Keep in mind that these algorithms were most widely used in the time before most websites used TLS. So if you were using mobile data in the 2000s, you were basically relying on these encryption schemes to prevent eavesdropping. 4/
Anyway, here’s the upshot of what the paper finds: for one of the algorithms (GEA-1) there are supposed to be 2^64 possible internal states. However, due to some unlikely circumstances in the design of the system, only ~2^40 state values actually turn up in practice.
(This difference of 24 bits might not seem like a big deal, but it’s a potential reduction of 2^24, or 16,777,216x, in terms of security.) 6/
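To get a sense of scale, here’s a back-of-the-envelope sketch. The numbers are mine, not the paper’s: the billion-guesses-per-second rate is a made-up assumption, and the real attack is a smarter meet-in-the-middle rather than a straight brute-force search, but the gap between 2^64 and 2^40 is the point:

```python
# Back-of-the-envelope: how much easier is a 2^40 search than a 2^64 one?
full_states = 2**64      # nominal state space
actual_states = 2**40    # states that can actually occur (per the paper)
speedup = full_states // actual_states
print(f"speedup factor: {speedup:,}")          # 16,777,216

# Hypothetical attacker trying 1 billion states per second (assumed rate):
rate = 10**9
print(f"2^64 search: ~{2**64 / rate / 86400 / 365:,.0f} years")   # centuries
print(f"2^40 search: ~{2**40 / rate / 60:,.1f} minutes")          # a coffee break
```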
This leads to a practical passive eavesdropping attack on GEA-1 protected data connections. What’s amazing about this attack is that you only need to capture 65 bits of “known keystream” (meaning 65 bits of encrypted data where you already know the plaintext). 7/
Coincidentally, the design of GPRS includes lots of packet headers and things that are predictable, making it relatively easy to grab 65 bits of known keystream just by listening passively. 8/
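To see why “known keystream” is cheap to get, here’s a minimal, generic sketch (not GPRS-specific, and the toy keystream below is obviously not real GEA-1 output): in a stream cipher, ciphertext = plaintext XOR keystream, so any captured bytes whose plaintext you can predict — fixed headers, padding, and so on — hand you keystream for free.

```python
def recover_keystream(ciphertext: bytes, known_plaintext: bytes) -> bytes:
    """Recover keystream bytes wherever the plaintext is predictable.

    Stream ciphers like GEA-1 compute ciphertext = plaintext XOR keystream,
    so XORing captured ciphertext with known/guessed plaintext bytes
    (fixed protocol headers, etc.) reveals the keystream at those positions.
    """
    return bytes(c ^ p for c, p in zip(ciphertext, known_plaintext))

# Toy demonstration with a made-up keystream (NOT real cipher output):
plaintext = b"GET /index.html "           # predictable protocol bytes
keystream = bytes(range(len(plaintext)))  # stand-in for cipher output
ciphertext = bytes(p ^ k for p, k in zip(plaintext, keystream))

assert recover_keystream(ciphertext, plaintext) == keystream
# 65 bits is just over 8 such bytes -- easy to collect passively.
```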
Ok, ok, ok this seems bad. But we haven’t gotten to the astonishing part yet.
The weakness in this cipher is due to an unusual relationship between its linear feedback shift registers (LFSRs), which reduces the state space from ~2^64 to ~2^40… 9/
The obvious question is whether this happened accidentally. The authors therefore tried to generate random parameters for the cipher to see if they could make this condition happen by accident. 10/
After about 1 million attempts, they were unable to replicate this condition, which means that either the cipher designers got *very unlucky* or GEA-1 was deliberately built to contain a known weakness.
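Here’s a heavily simplified sketch of the *shape* of that experiment, with assumptions of my own: the real key-to-register initialization maps come from GEA-1’s actual LFSR feedback polynomials and key loading, whereas this toy just draws random binary matrices of the relevant sizes (31x64 for register A, 33x64 for register C, per the paper) and checks how often the joint GF(2) rank collapses to 40 the way GEA-1’s does:

```python
import numpy as np

def gf2_rank(m: np.ndarray) -> int:
    """Rank of a binary matrix over GF(2) via Gaussian elimination."""
    m = m.copy() % 2
    rank = 0
    rows, cols = m.shape
    for col in range(cols):
        # find a pivot row at or below 'rank' with a 1 in this column
        pivot = next((r for r in range(rank, rows) if m[r, col]), None)
        if pivot is None:
            continue
        m[[rank, pivot]] = m[[pivot, rank]]   # swap pivot row into place
        for r in range(rows):                  # clear the column elsewhere
            if r != rank and m[r, col]:
                m[r] ^= m[rank]
        rank += 1
    return rank

# Cartoon version of the experiment: random maps instead of real LFSR ones.
rng = np.random.default_rng(0)
trials, hits = 10_000, 0
for _ in range(trials):
    m_a = rng.integers(0, 2, size=(31, 64), dtype=np.uint8)  # key -> reg A init
    m_c = rng.integers(0, 2, size=(33, 64), dtype=np.uint8)  # key -> reg C init
    joint = np.vstack([m_a, m_c])   # key -> joint (A, C) state, 64x64 over GF(2)
    if gf2_rank(joint) <= 40:       # GEA-1's joint rank is only 40
        hits += 1
print(f"rank <= 40 in {hits}/{trials} random trials")  # expect 0
```

With random matrices the rank almost never drops more than a couple of bits below 64; a collapse all the way to 40 essentially never happens by chance, which is exactly the paper’s point.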
Fortunately for everyone involved, these ciphers have been mostly deprecated; they now survive only in some older phone basebands. But that’s not cause for celebration.
Because at the end of the day, all of the same incentives exist for governments to sabotage encryption standards. We like to pretend that they’re too enlightened to do this anymore, or that we’re smart enough to catch them. Maybe. I doubt it. 13/
In the late 2030s you should expect a team of researchers to be writing a paper just like this one, except it will be about the encryption you’re using today. //fin
This is where we’re at. The responsibility for fighting surveillance abuse falls to tech companies, because nobody even pretends that the Federal government and courts are functional moral actors.
I have to assume that right now Apple and other tech companies are developing procedures to identify subpoenas that are aimed at Congress, on the assumption that the DoJ can’t be trusted to tell them.
“Well, we only handed over metadata, not content.”
You handed over a list that could contain every phone number House Intelligence Committee members ever spoke to or texted with, and you think that makes it ok?
Also I think it’s amazing that in five years we’ve gone from “if you haven’t committed a crime you don’t need encryption” to “US opposition lawmakers have their texts searched.”
Quick reminder: Apple could fix this in a heartbeat by adding an “end to end encryption for iCloud backup” setting (the tech is already in place), but they don’t. Even for those who want it.
I’m going to forget about TLS here for a moment, and point out that the best way to mitigate a lot of these attacks is just to replace cookies entirely.