It really is disappointing how many high-profile cryptographers actually seem to believe that "privacy preserving" surveillance is not only possible (it's not) - but also somehow "not surveillance" (it is).
Meanwhile Apple are making statements to the press to the effect of "We are not scanning people's photos for illegal material, we are hashing people's photos and *using cryptography* to compare them to illegal material"
As if those aren't the *EXACT SAME THING*.
It's very important to focus on the principles involved here and not the mechanism. Just because you use cryptography to obscure the thing you are surveilling doesn't make it not-surveillance.
When you boil it down, Apple has proposed that your phone become a black box that may occasionally file reports on you, reports that may aggregate to the point where Apple contacts the relevant authorities.
It doesn't matter how or why they built that black box or even what the false positive rate may be
I just need you to understand that giving that black box any kind of legitimacy is a dangerous step to take, by itself, absent any other slips down the slope.
I called it a rubicon moment because that is what it is. There is no going back from that.
I work for a tiny non-profit (@OpenPriv) that builds open source privacy tools, mostly for marginalized communities - because of my role I get exposed to what happens to people when their lives become subject to surveillance.
I feel those stories in my bones, everyday.
I learned long ago that I can't make people care about other people, but please at least see this for what it is, a sign of worse things to come for everyone's privacy if the push back from this isn't enough to rock Apple to the core.
These are fair questions regarding systems like the one Apple has proposed, and there is enough general ignorance regarding some of the building blocks that I think it is worth attempting to answer.
But it's going to take way more than a few tweets, so settle in...
First, I'll be incredibly fair to Apple and assume that the system has no bugs - that is, there is no way for a malicious actor inside or outside of Apple to exploit the system in ways that it wasn't meant to be exploited.
Idealized constructions only.
At the highest level there is your phone and Apple's servers. Apple has a collection of hashes, and your phone has...well tbh if you are like a large number of people in the world it probably has links to your entire digital life.
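To make the structure of that check concrete, here is a minimal Python sketch of the matching step. This is an illustration only: the hash values and function names are hypothetical, and the real proposal uses perceptual hashing and cryptographic private set intersection rather than plain SHA-256 over a visible set - but the shape of the surveillance is the same: every photo is tested against a server-supplied list.

```python
import hashlib

# Hypothetical server-supplied set of flagged hashes (illustrative values only).
flagged_hashes = {hashlib.sha256(b"known-flagged-image-bytes").hexdigest()}

def scan_photo(photo_bytes: bytes) -> bool:
    """Return True if this photo's hash is in the flagged set.

    Plain SHA-256 is used here purely to show the structure of the check;
    the actual system uses perceptual hashes (so near-duplicates match)
    hidden behind cryptography, which changes the mechanism, not the act.
    """
    return hashlib.sha256(photo_bytes).hexdigest() in flagged_hashes

# Every photo on the device gets run through the check.
library = [b"vacation-photo-bytes", b"known-flagged-image-bytes"]
matches = [p for p in library if scan_photo(p)]
print(len(matches))  # -> 1
```

The cryptography in the real design hides the flagged set from the phone and the non-matches from the server, but the output - a report when your photos match a list you cannot inspect - is unchanged.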
As I have said before, I am willing to be the person who draws a line here, against the calls for "nuance".
There is no room for nuance, because nuance thinks surveillance systems can be built such that they are used only for good, or only target bad people.
It is our duty to oppose all such systems *before* they become entrenched!
Not to work out how to entrench them with the least possible public outrage at their very existence by shielding their true nature with a sprinkling of mathematics.
Clearly a rubicon moment for privacy and end-to-end encryption.
I worry if Apple faces anything other than existential annihilation for proposing continual surveillance of private messages then it won't be long before other providers feel the pressure to do the same.
You can wrap that surveillance in any number of layers of cryptography to try and make it palatable, the end result is the same.
Everyone on that platform is treated as a potential criminal, subject to continual algorithmic surveillance without warrant or cause.
If Apple are successful in introducing this, how long do you think it will be before the same is expected of other providers? Before walled gardens prohibit apps that don't do it? Before it is enshrined in law?
"Stop using encryption so we can check your messages for criminal activity" becomes "Allow us to scan all the files on your computer for criminal activity"
I'm so tired. Maybe let's not do the dystopia of corporations building cop bots into general purpose computers.
At some point protecting your privacy is going to boil down to not using devices that actively spy on you. No amount of overlay software can protect you from making a bad choice there.
"Don't support corps that scan your computer for crimes" seems pretty fucking basic.
Today I spent 5 hours debugging, and finally moved a single line of code up 10 lines.
Reduced CPU usage 20x.
Just want that on the record.
Basically: something that was supposed to get called once per app run was getting called on every top-level UI rebuild, and because it updated one of the main global settings providers, that update was also triggering a top-level UI rebuild - so there was a rebuild cascade.
Moved the call out to where it actually belongs... no more rebuild cascade, all the consumers are now caching properly, all the render boundaries are kicking in, and everything is now ridiculously fast.
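The bug pattern above can be sketched in a few lines of Python. All names here are hypothetical stand-ins (the real code is in a UI framework with settings providers), but the feedback loop is the same: an init call placed inside the build path updates a provider, and the provider update re-enters the build path.

```python
class SettingsProvider:
    """Hypothetical global settings provider: any update notifies
    listeners, which in a UI framework triggers a top-level rebuild."""
    def __init__(self):
        self.listeners = []
        self.value = None

    def update(self, value):
        self.value = value
        for listener in self.listeners:
            listener()  # each update re-triggers the UI build

settings = SettingsProvider()
rebuild_count = 0

def init_once():
    # Meant to run once per app launch -- but if it sits inside the
    # build function, it runs on *every* rebuild, and its update()
    # call kicks off yet another rebuild: the cascade.
    settings.update("loaded")

def build_ui():
    global rebuild_count
    rebuild_count += 1
    if rebuild_count > 5:  # guard so this sketch terminates
        return
    init_once()            # BUG: belongs in app startup, not in build

settings.listeners.append(build_ui)
build_ui()
print(rebuild_count)  # -> 6: one intended build plus five cascaded ones
```

The one-line fix described in the thread amounts to calling `init_once()` once at startup, before the UI is wired up, so a settings update never originates from inside a build.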
Need a break from research, ask me any cryptocurrency/blockchain related question and I will give you my honest, unfiltered answer.
Only if we consider all transactions as equally valuable to store - which they're not. Ultimately blockchain space is a limited resource and is subject to the same economic constraints as other limited resources.
Any legitimacy that smart contracts might have had died when the DAO was reversed. Either code is law, damn the consequences, or smart contracts are just as fragile as any other mechanism when it comes to mob justice.