So this indictment is puzzling. It concerns Michael Sussmann, a lawyer who organized the collection of DNS data from hosting providers, allegedly for political purposes. Many of the companies are anonymized; can we tell who they are? (Thread) context-cdn.washingtonpost.com/notes/prod/def…
So we begin with “Internet Company-1”, which is a (major?) DNS resolver.
The executive in question (Tech Executive-1) claims to have been offered a position as Hillary Clinton’s cyber czar if she won, so maybe that’s a clue?
There are two other Internet companies in here. Internet Company-2 collects DNS data (maybe passively) and Internet Company-3 is maybe a threat intel company owned by company #2. The executive has an ownership interest in all three.
In case it isn’t obvious from context, this whole thread is about the Trump-Alfa Bank DNS allegations. Some of these quotes sent between researchers are pretty damning.
Overall this is an awful-looking story. The Clinton campaign and sympathetic executives at tech companies ran wild through private DNS data (which apparently has no protections at all) to concoct a narrative, and then dragged university researchers in to help confirm it.
It is insane how scary the threat models of encrypted messaging app providers are.
You have these apps with billions of users. Some of those users are doing huge financial transactions. Some are politicians. Some are coordinating literal national security operations. And all these messages go through a few vulnerable servers.
I think older people (that includes me I guess) think that messaging apps are like AOL Instant Messenger, not used for anything important. It’s completely insane how much of our society now runs on them, and what a total disaster it would be if a couple of major apps were broken.
Ok, look people: Signal as a *protocol* is excellent. As a service it’s excellent. But as an application running on your phone, it’s… an application running on your consumer-grade phone. The targeted attacks people use on those devices are well known.
There is malware that targets and compromises phones. There has been malware that targets the Signal application. It’s an app that processes many different media types, and that means there’s almost certainly a vulnerability to be exploited at any given moment in time.
If you don’t know what this means, it means that you shouldn’t expect Signal to defend against nation-state malware. (But you also shouldn’t really expect any of the other stuff here, like Chrome, to defend you in that circumstance either.)
You should use Signal. Seriously. There are other encrypted messaging apps out there, but I don’t have as much faith in their longevity. In particular I have major concerns about the sustainability of for-profit apps in our new “AI” world.
I have too many reasons to worry about this but that’s not really the point. The thing I’m worried about is that, as the only encrypted messenger people seem to *really* trust, Signal is going to end up being a target for too many people.
Signal was designed to be a consumer-grade messaging app. It’s really, really good for that purpose. And obviously “excellent consumer grade” has a lot of intersection with military-grade cryptography just because that’s how the world works. But it is being asked to do a lot!
New public statement from Apple (sent to me privately):
“As of Friday, February 21, Apple can no longer offer Advanced Data Protection as a feature to new users in the UK.”
Additionally:
"Apple can no longer offer Advanced Data Protection (ADP) in the United Kingdom to new users and current UK users will eventually need to disable this security feature. ADP protects iCloud data with end-to-end encryption, which means the data can only be decrypted by the user who owns it, and only on their trusted devices. We are gravely disappointed that the protections provided by ADP will not be available to our customers in the UK given the continuing rise of data breaches and other threats to customer privacy. Enhancing the security of cloud storage with end-to-end encryption is more urgent than ever before. Apple remains committed to offering our users the highest level of security for their personal data and are hopeful that we will be able to do so in the future in the United Kingdom. As we have said many times before, we have never built a backdoor or master key to any of our products or services and we never will.”
This will not affect:
iMessage encryption
iCloud Keychain
FaceTime
Health data
These will remain end-to-end encrypted. Other services like iCloud Backup and Photos will not be end-to-end encrypted.
What is this new setting that sends photo data to Apple servers and why is it default “on” at the bottom of my settings screen?
I understand that it uses differential privacy and some fancy cryptography, but I would have loved to know what this is before it was deployed and turned on by default without my consent.
This seems to involve two separate components: one that builds an index using differential privacy (set at some budget), and another that does a homomorphic search?
Does this work well enough that I want it on? I don’t know. I wasn’t given the time to think about it.
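For reference on the differential-privacy piece, here is a minimal, purely illustrative Python sketch of the textbook Laplace mechanism (not Apple’s actual implementation; every name and number in it is made up) just to show what “set at some budget” means in practice:

```python
import numpy as np

def dp_noisy_counts(counts, epsilon, sensitivity=1.0):
    """Release counts under epsilon-differential privacy via the
    Laplace mechanism: each count gets noise with scale
    sensitivity/epsilon, so one user's data shifts the output
    distribution by at most a factor of e^epsilon."""
    scale = sensitivity / epsilon
    noise = np.random.laplace(loc=0.0, scale=scale, size=len(counts))
    return [c + n for c, n in zip(counts, noise)]

# Hypothetical example: per-category photo counts released with a
# privacy budget of epsilon = 1.0 (smaller epsilon = more noise).
true_counts = [120, 45, 7, 0]
print(dp_noisy_counts(true_counts, epsilon=1.0))
```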
Most of cryptography research is developing a really nice mental model for what’s possible and impossible in the field, so you can avoid wasting time on dead ends. But every now and then someone kicks down a door and blows up that intuition, which is the best kind of result.
One of the most surprising privacy results of the last 5 years is the LMW “doubly efficient PIR” paper. The basic idea is that I can load an item from a public database without the operator seeing which item I’m loading & without it having to touch every item in the DB each time.
Short background: Private Information Retrieval isn’t a new idea. It lets me load items from a (remote) public database without the operator learning what item I’m asking for. But traditionally there’s a *huge* performance hit for doing this.
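To make that performance hit concrete, here is a minimal sketch of the classic two-server XOR PIR (a toy construction, not the LMW scheme, and the function names are mine). Each query individually looks like a uniformly random subset, so neither server learns which bit you want, but each server still has to touch roughly half the database on every query. The surprise in doubly efficient PIR is that, after preprocessing, that per-query linear scan goes away.

```python
import secrets

def xor_bits(db, indices):
    """Server side: XOR together the requested bits. The server
    touches every index it's asked about, roughly half the database
    per query, which is the classic PIR cost."""
    acc = 0
    for j in indices:
        acc ^= db[j]
    return acc

def pir_query(db_size, i):
    """Client side: two queries that individually reveal nothing
    about i, since each is a uniformly random subset of indices."""
    s1 = {j for j in range(db_size) if secrets.randbits(1)}
    s2 = s1 ^ {i}   # flip membership of the target index in one copy
    return s1, s2

def pir_reconstruct(a1, a2):
    """Client side: the two answers differ by exactly bit i."""
    return a1 ^ a2

# Toy example: a 16-bit public database held by two non-colluding servers.
db = [secrets.randbits(1) for _ in range(16)]
i = 11
q1, q2 = pir_query(len(db), i)
bit = pir_reconstruct(xor_bits(db, q1), xor_bits(db, q2))
assert bit == db[i]
```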