This new EU legislation granting providers the right to “voluntarily” scan private messages doesn’t break encryption, or take us to a regime of mandatory mass surveillance. But it definitely sets the stage.
What’s remarkable about this stuff is that it’s phrased as “protecting children from child abuse”. And as a parent I appreciate that. But has anyone explored, empirically, if any of this surveillance actually works to stop the problem?
Here in the US we’ve built an enormous surveillance system to detect instances of child sexual abuse material. It’s been running for years, and the number of reports is going up exponentially.
How many pedophiles are there? Isn’t it a smallish number?
It’s remarkable that we’re all so eager to build a surveillance system that has basically unlimited machine access to all of our private messages, and nobody seems at all interested in whether any of this stuff works to deter the underlying crimes.
Building systems to detect CSAM is hard. Detecting “grooming” behavior is even more invasive, since it requires machine learning techniques that understand a wide variety of interactive human communication.
Building the infrastructure to surveil communications at this level of detail (and committing to keep it operating as more and more services inevitably add end-to-end encryption) is not a business that democracies should be undertaking.
There is functionally no difference between a machine system that can detect “grooming behavior” and one that can detect literally any other human communication pattern.
Once you hold open the door for providers to solve this problem, you’ve committed to supporting automated surveillance with capabilities that could make the Chinese government envious. It’s only a question of how you train them.
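The point about training generalizes mechanically: a system that flags “grooming” is just a text classifier, and nothing in the code ties it to that target. A minimal stdlib sketch (the toy data and function names are illustrative assumptions, not anything from a real deployment):

```python
from collections import Counter

def train(messages, labels):
    """Count token frequencies per class. Note the 'target behavior'
    is determined entirely by the labels, not by anything in this code."""
    counts = {0: Counter(), 1: Counter()}
    for msg, label in zip(messages, labels):
        counts[label].update(msg.lower().split())
    return counts

def classify(counts, message):
    """Score a message by which class its tokens appear in more often."""
    tokens = message.lower().split()
    score = lambda c: sum(counts[c][t] for t in tokens)
    return 1 if score(1) > score(0) else 0

# Illustrative toy data (an assumption for the sketch, not real training data):
messages = ["keep this a secret", "great game last night",
            "don't tell your parents", "see you at practice tomorrow"]
labels = [1, 0, 1, 0]  # 1 = "flagged" pattern, 0 = benign

model = train(messages, labels)
print(classify(model, "this is our secret"))  # → 1, flagged purely because of the labels
```

Swap the training labels and the identical machinery detects political dissent, union organizing, or any other communication pattern.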
• • •
Every article I read on (ZK) rollups almost gets to the real problem, and then misses it. The real problem is the need for storage. ZK proofs won’t solve this.
I keep reading these articles that talk about the problems with rollups. And they’re good articles! E.g.: medium.com/dragonfly-rese…
But they always reach a point where they realize that the problem is state storage, and then they handwave that the solution is going to be something like Mina or zkSync, which don’t fully solve the state storage problem.
I was going to laugh off this Kaspersky password manager bug, but it is *amazing*. In the sense that I’ve never seen so many broken things in one simple piece of code. donjon.ledger.com/kaspersky-pass…
Like seriously, WTF is even happening here. Why are they sampling *floats*? Why are they multiplying them together? Is this witchcraft?
And here, Kaspersky decided that instead of picking a random password, they should bias the password to be non-random and thus “less likely to be on a cracker list”. 🤦🏻‍♂️
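For contrast, password generation is one of the rare places where the right answer is short: sample each character uniformly from a CSPRNG. A hedged sketch of the generic textbook approach in Python (this is not Kaspersky’s code, just the standard alternative):

```python
import secrets
import string

def generate_password(length: int = 16) -> str:
    """Sample each character uniformly from a cryptographically secure RNG.
    Every password of the given length is equally likely -- no frequency
    'anti-bias', no time-based seed, no float arithmetic."""
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))

pw = generate_password(20)
print(len(pw))  # 20
```

Uniform sampling maximizes entropy per character; any deliberate bias away from uniformity, including “avoiding common passwords”, strictly reduces the work an attacker needs.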
I’m struggling to understand how a 1-bit hash error can get irreversibly incorporated into CT, while all the blockchains of the world hum along happily. groups.google.com/a/chromium.org…
The problem here is not that a hash can be corrupted, because that happens. The problem is that a single corrupted hash somehow “breaks” the entire CT log. Seems like an avoidable design error. But it’s early and I’m still drinking my coffee.
Anyway, it seems to me that every cryptographic system should be built with the assumption that something (memory, network, 56K phone modem) will introduce errors, and the system will detect those errors — but not by exploding.
This is an amazing paper. It implies (with strong statistical evidence) that the design of a major mobile-data encryption algorithm — used in GPRS data — was deliberately backdoored by its designer. eprint.iacr.org/2021/819
The GPRS standards were extensions to the GSM (2G/3G) mobile standard that allowed phones to use data over cellular networks. This was before LTE. For security, the standards included encryption to provide over-the-air security for your data. 2/
As is “normal” for telephony standards, the encryption was provided by two custom ciphers: GEA-1 and GEA-2. While there were strong export control regulations in place for crypto, there’s little overt indication that either of these ciphers was deliberately weakened. 3/
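For scale: the paper reports that GEA-1’s initialization reduces the effective key strength from the nominal 64 bits to roughly 40. Back-of-envelope arithmetic shows why that matters (the trial rate below is an illustrative assumption):

```python
# Nominal vs. effective brute-force cost for GEA-1, per the paper's
# 64-bit nominal / ~40-bit effective figures.
nominal = 2 ** 64
effective = 2 ** 40
print(nominal // effective)  # 2**24 = 16777216x fewer keys to search

# At an assumed 10**9 key trials per second (illustrative rate only):
rate = 10 ** 9
print(effective / rate)      # ~1100 seconds: minutes, not millennia
```

A 2^24 reduction is the difference between an attack that is computationally absurd and one that finishes on commodity hardware.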