Also, I think it’s amazing that in five years we’ve gone from “if you haven’t committed a crime you don’t need encryption” to “US opposition lawmakers have their texts searched.”
Quick reminder: Apple could fix this in a heartbeat by adding an “end-to-end encryption for iCloud backup” setting (the tech is already in place), but they don’t. Not even for those who want it.
Also, I understand the legal situation. But imagine being a lawyer who knows the White House is spying on the House Intelligence Committee, yet is duty-bound to keep that secret for three years.
I guess what I’m saying is that if you come out of that with any reverence at all for the way US law works in this area, you’re a stronger person than I am. And that’s not a compliment.
This is how our country works now, and (very probably) how it will work again in the future.
This is where we’re at. The responsibility for fighting surveillance abuse falls to tech companies, because nobody even pretends that the Federal government and courts are functional moral actors.
I have to assume that right now Apple and other tech companies are developing procedures to identify subpoenas that are aimed at Congress, on the assumption that the DoJ can’t be trusted to tell them.
“Well, we only handed over metadata, not content.”
You handed over a list that could contain every phone number House Intelligence Committee members ever spoke to or texted with, and you think that makes it ok?
I’m going to set TLS aside for a moment and point out that the best way to mitigate a lot of these attacks is simply to replace cookies entirely.
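The post doesn’t spell out what should replace cookies, but as a purely hypothetical sketch of “session state without cookies,” here is a minimal HMAC-signed bearer token that a client would carry in an `Authorization` header instead of a `Cookie` header. The scheme, names, and key handling below are my own illustration, not the author’s proposal:

```python
import base64
import hashlib
import hmac
import json
import time

# Hypothetical server-side key; a real deployment would load a random key
# from a secrets manager, never hard-code it.
SECRET = b"server-side-secret"

def issue_token(user_id: str, ttl: int = 3600) -> str:
    """Issue a signed, expiring session token (sent via an Authorization header)."""
    claims = {"sub": user_id, "exp": int(time.time()) + ttl}
    payload = base64.urlsafe_b64encode(json.dumps(claims).encode()).decode()
    sig = hmac.new(SECRET, payload.encode(), hashlib.sha256).hexdigest()
    return f"{payload}.{sig}"

def verify_token(token: str):
    """Return the user id if the token is authentic and unexpired, else None."""
    try:
        payload, sig = token.rsplit(".", 1)
    except ValueError:
        return None
    expected = hmac.new(SECRET, payload.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):  # constant-time comparison
        return None
    claims = json.loads(base64.urlsafe_b64decode(payload))
    if claims["exp"] < time.time():
        return None
    return claims["sub"]
```

Because the token is never set as a cookie, the browser never attaches it automatically to cross-site requests, which sidesteps the whole class of attacks that ride on ambient cookie credentials.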
I know this is a bit of a stereotype, but why is Russian crypto always so weird?
“We don’t use a normal random number generator, we use a gerbil connected to a hot cup of tea. Also, use our ciphers, whose S-Boxes are ‘random’ — meaning they actually aren’t.”
Dear researchers: the hard part of problems like “traceability” is not the part where you build a mass surveillance system. Building mass surveillance systems is *easy*.
The hard part is building systems that don’t utterly shatter the security guarantees that the private system offered, and don’t have caveats like “obviously this can be abused, stopping that is future work.”
When I go out to see what our research community has been doing in this area, I expect to find that they understand what makes this research problem hard, not to find slides like this one.