I was going to laugh off this Kaspersky password manager bug, but it is *amazing*. In the sense that I’ve never seen so many broken things in one simple piece of code. donjon.ledger.com/kaspersky-pass…
Like seriously, WTF is even happening here. Why are they sampling *floats*? Why are they multiplying them together? Is this witchcraft?
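(I don’t know exactly what their multiplication was for, so here’s a toy illustration rather than the actual KPM code. The point: the product of uniform floats piles up near zero, so any character index you derive from it is badly skewed.)

```python
import random
from collections import Counter

ALPHABET_SIZE = 26

# Toy model: derive a character index from the product of two uniform floats.
counts = Counter()
for _ in range(100_000):
    x = random.random() * random.random()
    counts[int(x * ALPHABET_SIZE)] += 1

# Index 0 shows up roughly 4x more often than it would under a uniform
# distribution, and the high indices starve. So much for "random".
for idx in range(ALPHABET_SIZE):
    print(idx, counts[idx])
```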
And here, Kaspersky decided that instead of picking a random password, they should bias the password to be non-random and thus “less likely to be on a cracker list”. 🤦🏻‍♂️
Then they used a non-cryptographic PRNG (Mersenne Twister). Amusingly, this is probably the *least* bad thing Kaspersky did, even though it’s terribly bad.
And in case you thought that after doing everything else wrong, they were going to do the next part right: nope. They then proceed to seed the whole damn thing with time(0).
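(To make that concrete, here’s a toy sketch, not the actual KPM code, of what a time(0)-seeded Mersenne Twister buys an attacker. Python’s random module is conveniently also MT19937: if you know roughly when a password was generated, you just replay the candidate seeds.)

```python
import random
import string

ALPHABET = string.ascii_letters + string.digits

def generate_password(seed, length=12):
    # Stand-in for the broken scheme: Mersenne Twister seeded with a timestamp.
    rng = random.Random(seed)  # CPython's Random is MT19937
    return "".join(rng.choice(ALPHABET) for _ in range(length))

def crack(target, approx_time, window=3600):
    # Attacker's view: try every second in a window around the suspected
    # creation time. Even a full year of seeds is only ~31.5M guesses.
    for seed in range(approx_time - window, approx_time + window):
        if generate_password(seed, len(target)) == target:
            return seed
    return None

created_at = 1_625_000_000                 # victim generates a password
victim_pw = generate_password(created_at)
print(crack(victim_pw, created_at + 500))  # recovers 1625000000 in seconds
```

One leaked password plus a rough creation timestamp, and the seed (hence every other password it produced) falls right out.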
I have to admire the combination of needless complexity combined with absolutely breathtaking incompetence.
Anyway, before anyone kills me for being mean to developers doing the best they can… The real takeaway here is that (obviously) nobody with even modest cryptographic knowledge ever audited, thought about, or came near this product.
And in case you’re of the opinion that bad implementations are unique to Kaspersky: it’s entirely possible to make some other mainstream password managers “hang forever” by setting the password charset constraints too tightly, which indicates that they haven’t figured this out either.
Some actual constructive lessons:
* Always use a real RNG to generate unpredictable seeds, never time(0)
* Always use a cryptographic RNG
* Never ever use floats in cryptography (I suspect some JavaScript nonsense here)
* To convert from bits to an alphabet of symbols… 1/
(Rewriting this because now I’m afraid people will take advice from tweets)
You should use rejection sampling, which you can find articles about online (there’s a sketch below). Be careful that your rejection loop doesn’t run forever.
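Here’s a minimal sketch of what I mean (my own toy code, assuming a 62-character alphabet, not any particular product’s implementation): draw bytes from a CSPRNG and reject the ones that would wrap around the alphabet unevenly, with a bounded loop so it can’t spin forever.

```python
import secrets
import string

def random_char(alphabet, max_tries=1000):
    # Rejection sampling: accept a byte only if it falls below the largest
    # multiple of len(alphabet), so every symbol is exactly equally likely.
    n = len(alphabet)
    limit = 256 - (256 % n)   # e.g. for n=62, accept bytes 0..247
    for _ in range(max_tries):
        b = secrets.token_bytes(1)[0]
        if b < limit:
            return alphabet[b % n]
    # With a sane alphabet this is effectively unreachable; the bound is
    # there so a pathological configuration fails loudly instead of hanging.
    raise RuntimeError("rejection sampling failed; check your RNG/alphabet")

alphabet = string.ascii_letters + string.digits   # 62 symbols
password = "".join(random_char(alphabet) for _ in range(16))
print(password)
```

In Python you’d just call secrets.choice(), which does exactly this under the hood; the point of the sketch is to show what “rejection sampling” actually means.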
And please, get someone to look at your code. Especially if it’s going to ship in a mainstream product. You cannot ever ship anything bespoke like this without having an expert look it over. Even an hour of review would have flagged all of this.
Oh gosh.
Anyway I recently had a discussion with a group of expert cryptographers/cryptographic engineers about whether “don’t roll your own crypto” is a helpful rule, or if it’s non-inclusive.
I don’t know the answer, but stuff like this is why the phrase was invented.
Trying to plan a seminar on the topic of “how do we maintain privacy in the coming dystopia” and it’s kind of a thing.
Over the past thirty years we’ve done amazing things technologically when it comes to anonymity and privacy, and to some extent it was “all theoretical” that we’d need it. That’s all behind us.
So here we are in the bad timeline. Social networks want to jam AI into your encrypted messages; governments want to access your private messages; everyone you maybe once hoped to rely on is either planning to sell you out or else trying to find the fastest way to monetize you.
Specifically, Google, when asked by a US senator, could easily have denied that the UK was pressuring them, but instead said this.
If you call someone in their home and ask them if someone has a gun to their head, and they say “I can’t talk about that” then you call 911 because that’s what common sense tells you to do.
It is insane how scary the threat models of encrypted messaging app providers are.
You have these apps with billions of users. Some of those users are doing huge financial transactions. Some are politicians. Some are coordinating literal national security operations. And all these messages go through a few vulnerable servers.
I think older people (that includes me I guess) think that messaging apps are like AOL Instant Messenger, not used for anything important. It’s completely insane how much of our society now runs on them, and what a total disaster it would be if a couple of major apps were broken.
Ok, look people: Signal as a *protocol* is excellent. As a service it’s excellent. But as an application running on your phone, it’s… an application running on your consumer-grade phone. The targeted attacks people use on those devices are well known.
There is malware that targets and compromises phones. There has been malware that targets the Signal application. It’s an app that processes many different media types, and that means there’s almost certainly a vulnerability to be exploited at any given moment in time.
If you don’t know what this means, it means that you shouldn’t expect Signal to defend against nation-state malware. (But you also shouldn’t really expect any of the other stuff here, like Chrome, to defend you in that circumstance either.)
You should use Signal. Seriously. There are other encrypted messaging apps out there, but I don’t have as much faith in their longevity. In particular I have major concerns about the sustainability of for-profit apps in our new “AI” world.
I have too many reasons to worry about this but that’s not really the point. The thing I’m worried about is that, as the only encrypted messenger people seem to *really* trust, Signal is going to end up being a target for too many people.
Signal was designed to be a consumer-grade messaging app. It’s really, really good for that purpose. And obviously “excellent consumer grade” has a lot of intersection with military-grade cryptography just because that’s how the world works. But it is being asked to do a lot!
New public statement from Apple (sent to me privately):
“As of Friday, February 21, Apple can no longer offer Advanced Data Protection as a feature to new users in the UK.”
Additionally:
"Apple can no longer offer Advanced Data Protection (ADP) in the United Kingdom to new users and current UK users will eventually need to disable this security feature. ADP protects iCloud data with end-to-end encryption, which means the data can only be decrypted by the user who owns it, and only on their trusted devices. We are gravely disappointed that the protections provided by ADP will not be available to our customers in the UK given the continuing rise of data breaches and other threats to customer privacy. Enhancing the security of cloud storage with end-to-end encryption is more urgent than ever before. Apple remains committed to offering our users the highest level of security for their personal data and are hopeful that we will be able to do so in the future in the United Kingdom. As we have said many times before, we have never built a backdoor or master key to any of our products or services and we never will.”
This will not affect:
iMessage encryption
iCloud Keychain
FaceTime
Health data
These will remain end-to-end encrypted. Other services like iCloud Backup and Photos will not be end-to-end encrypted.