So the remote client picks the DH parameters (why!) and sends them to you, where you have to carefully check that they’re constructed correctly (pretty sure these checks were added later).
Then there’s this weird thing where the server picks randomness, which used to be an outright vulnerability since it let the server choose the DH secret (now fixed, I think), plus still more complicated group-membership checks.
Then what should happen is some kind of key verification step, but it’s not clear that any of this happens. There are just so many opportunities for bugs in this thing.
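To make the "so many opportunities for bugs" point concrete, here’s a minimal sketch of the kind of sanity checks a client ends up having to run on peer-supplied DH parameters. This is illustrative only: the bit-length threshold and check set are my assumptions, not Telegram’s actual MTProto rules.

```python
import random

def is_probable_prime(n, rounds=40):
    """Miller-Rabin probabilistic primality test."""
    if n < 2:
        return False
    for small in (2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37):
        if n % small == 0:
            return n == small
    d, r = n - 1, 0
    while d % 2 == 0:
        d //= 2
        r += 1
    for _ in range(rounds):
        a = random.randrange(2, n - 1)
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(r - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False
    return True

def check_dh_params(p, g, g_b):
    """Reject obviously bad peer-supplied DH parameters and public key."""
    # p must be a large safe prime: p = 2q + 1 with q also prime
    if p.bit_length() < 2048 or not is_probable_prime(p):
        return False
    if not is_probable_prime((p - 1) // 2):
        return False
    # generator must be a nontrivial element
    if not (1 < g < p - 1):
        return False
    # peer's public value must not be a degenerate group element
    if not (1 < g_b < p - 1):
        return False
    return True
```

Every one of these checks is a place where skipping a line, or running it against the wrong value, silently breaks the whole handshake, which is the problem with pushing parameter choice to the untrusted side.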
It’s not so unusual for new apps to do some weird stuff in v1 of their crypto. What’s unique about Telegram is how stubborn they are about keeping this stuff in - many versions later.
I’m uncomfortable using “1” as an x-coordinate for Shamir, since my stupid intuition tells me “it’s so close to 0” :) But Binance is just letting you blast away at q, 2q, etc.
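A quick sketch of why x-coordinates like q and 2q are so much worse than 1: over GF(q) they reduce to x = 0, and a Shamir "share" evaluated at x = 0 is just the secret itself. The field size and values below are toy choices for illustration, not anything Binance actually uses.

```python
import random

# Toy prime field for illustration only
q = 2**61 - 1

def eval_poly(coeffs, x, q):
    """Evaluate a Shamir polynomial at x over GF(q), via Horner's rule.
    coeffs[0] is the secret (the constant term)."""
    acc = 0
    for c in reversed(coeffs):
        acc = (acc * x + c) % q
    return acc

secret = 123456789
# Degree-2 polynomial: a threshold-3 sharing of the secret
coeffs = [secret] + [random.randrange(q) for _ in range(2)]

# A "share" at x = q (or 2q, 3q, ...) is the secret, outright:
assert eval_poly(coeffs, q, q) == secret
assert eval_poly(coeffs, 2 * q, q) == secret
```

That’s the whole attack: no interpolation, no threshold, one evaluation.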
One of my favorite things about cryptocurrency is you never have to say “well the impact can’t be that bad” because you know someone secured like $100m with it.
Does anyone know how Apple’s privacy protocol for AirTags combines with their anti-tracking features? Seems like the two should be incompatible. (I have some guesses but I’m wondering if it’s officially documented.)
My guess is that the AirTag cycles its broadcast pseudonym at a slower rate than your phone can scan, so if your phone detects many broadcasts of the same pseudonym within some time window it says “ah, an AirTag is near me.” And it links this over many windows.
In other words, Apple is taking advantage of a weakness in their anti-tracking protocol to do tracking, in the service of preventing a different kind of tracking.
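The guessed heuristic above can be sketched in a few lines. To be clear, this is my speculation rendered as code, not Apple’s documented algorithm; the window length and sighting threshold are made-up parameters.

```python
from collections import defaultdict

WINDOW_SECONDS = 15 * 60   # illustrative threshold, not Apple's
MIN_SIGHTINGS = 10         # illustrative threshold, not Apple's

def flag_possible_trackers(sightings):
    """sightings: list of (timestamp_seconds, pseudonym) tuples.
    Flag any pseudonym seen MIN_SIGHTINGS+ times within one window:
    a slowly-rotating broadcast identifier that keeps reappearing."""
    flagged = set()
    by_id = defaultdict(list)
    for ts, pid in sightings:
        by_id[pid].append(ts)
    for pid, times in by_id.items():
        times.sort()
        lo = 0
        for hi in range(len(times)):
            # shrink window until it spans at most WINDOW_SECONDS
            while times[hi] - times[lo] > WINDOW_SECONDS:
                lo += 1
            if hi - lo + 1 >= MIN_SIGHTINGS:
                flagged.add(pid)
                break
    return flagged
```

A phone rotates its own pseudonym faster than the scan interval, so it never trips this; an AirTag rotating slowly does, which is exactly the asymmetry the tweet is pointing at.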
Pretty confident Apple is going to ditch client-side CSAM scanning in favor of server-side CSAM scanning. This will be an improvement, but will leave them in a tight corner with E2EE.
Still a massive improvement: the deployment of client-side scanning for cloud backups would have been an asterisk on all device privacy forever, particularly as cloud backups become increasingly non-optional.
Watching this log4j bug metastasize, I’m seeing people ask why industry doesn’t fund open source. I don’t have a great answer, but I have some thoughts following the experience with Heartbleed in ‘14. 1/
When Heartbleed dropped, it was very similar to log4j: an underfunded OSS project (OpenSSL) that nobody thought about, but was *everywhere*. It took everyone by surprise, and even woke industry up. The result was a surge of funding. 2/
Industry (not the government, who still thought “infrastructure” meant dams and bridges) suddenly realized they were using this stuff everywhere. So the Linux Foundation created the Core Infrastructure Initiative (now the OpenSSF). coreinfrastructure.org 3/
When is “turn off the cloud” no longer a viable option?
I think it’s optimistic that 40% of people think our devices will continue to be useful in the future without a connection to a cloud service.
Ok. I did not phrase this question well, so let me try again. At what point do you think our mobile devices will become sufficiently tied to cloud services that “turn off cloud” is no longer an option — either explicitly, or *effectively*?