I'm so old I remember when experts were saying we need more encryption to address the current cybersecurity threats. lawfareblog.com/most-email-isn…
You know, last week.
Lest we forget, the UK already has draconian anti-encryption provisions in its law, as well as authorization for "bulk hacking," two words which never cease to send chills down my spine.
The case for encouraging greater development and use of encryption has been well-documented, and has only grown stronger over the years. You can see the history at encryptioncompendium.org
Let's dig back into this article, specifically the claim that end-to-end encryption will "prevent any access to messaging content," because that's simply not true. There are a ton of other ways for law enforcement to get access to content for investigations and prosecutions...
We know this partly because there have been successful prosecutions of individuals even where e2e tools are used. A few options - other conversation participants, the device itself (if in custody), hacking the endpoint (allowed in UK law even if I have qualms)...
And the number one way: human intelligence and information - infiltrating the network and uncovering where the content is coming from. E2e doesn't "prevent" any access. What it does is ensure that access isn't cheap and easy, something important to protecting the data...
protecting it both from bad actors and from company misuse: you can be assured a company won't use your data in ways you don't want, or sell it, if they never have it to begin with. That's why encryption is so vital to human rights.
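That "never have it to begin with" property can be illustrated with a toy sketch: two parties agree on a key via Diffie-Hellman, and the provider relaying their messages only ever sees public values and ciphertext. This is illustrative only, not real cryptography (the parameters and the XOR cipher below are deliberately simplistic stand-ins; deployed e2e systems use vetted protocols like the Signal protocol):

```python
import hashlib

# Toy Diffie-Hellman parameters (insecure; real systems use vetted
# groups/curves such as X25519). P is a small prime, G a generator.
P, G = 4294967291, 5

# Each user keeps a private secret on their own device.
alice_secret = 123456789
bob_secret = 987654321

# Only these public values ever pass through the provider's servers.
alice_pub = pow(G, alice_secret, P)
bob_pub = pow(G, bob_secret, P)

# Both sides derive the same shared key; the relay never learns it.
alice_key = hashlib.sha256(str(pow(bob_pub, alice_secret, P)).encode()).digest()
bob_key = hashlib.sha256(str(pow(alice_pub, bob_secret, P)).encode()).digest()
assert alice_key == bob_key

def xor_cipher(key: bytes, msg: bytes) -> bytes:
    """Toy stream cipher: XOR against a SHA-256-derived keystream."""
    stream = b""
    counter = 0
    while len(stream) < len(msg):
        stream += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return bytes(m ^ s for m, s in zip(msg, stream))

# The provider stores and relays only this ciphertext.
ciphertext = xor_cipher(alice_key, b"meet at noon")

# Decryption happens only on the recipient's device.
plaintext = xor_cipher(bob_key, ciphertext)
assert plaintext == b"meet at noon"
```

The point of the sketch: at no step does the relaying server hold anything it could read, use, or sell, which is exactly the structural guarantee the thread is describing.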
Which isn't to say that the problems in the article - child exploitation and abuse - are not real, significant issues that we need to be paying attention to and devoting resources to. But that doesn't have to, and shouldn't, come at the expense of global cybersecurity.
That's why the Lawfare article I started with is calling for *more* encryption. Encryption is, and must be, inevitable, so we really should have more conversations about what that means and fewer about how to stop it from happening.
Fin. (for now).
The Colorado Privacy Act, SB 21-190. You can find the info here, including all the previous iterations. I'll hit the high points but if you want the details you should always go straight to the text: leg.colorado.gov/bills/sb21-190.
Lots of definitions. A big one is consent. Specific, unambiguous, informed. Earlier version referenced a "narrowly defined purpose" which was removed before the final. NOT consent: broad policies, exiting a window, agreement through dark patterns (defined elsewhere)
Dark patterns - UI "designed or manipulated with the substantial effect of subverting or impairing user autonomy, decision making, or choice"
Also "Decisions that Produce Legal or Similarly Significant Effects Concerning a Consumer" may be longest term of art ever
This Clegg piece is getting passed around a lot and I have thoughts about some of the things it says, which I'll provide here in a thread, featuring and responding to 10 pieces of the write-up. The following represents my personal thoughts and opinions. Sorry in advance. 1/
It starts with this recognition of the benefits of targeted advertising for the world. We know this argument - I've even made this argument before, and I referenced it recently in discussing how tech has traditionally been built up around a claim of being good for humanity 2/
But, as with many things, I've seen more and changed my mind. First, this isn't just "targeting," it's micro-targeting. The marginal benefits that people receive from micro-targeted ads are not worth the potential harm of those ads, how they can distort perception of the world 3/
The Crypto Colloquium was a multi-stakeholder dialogue that measured consensus on the topic of encryption and flagged important questions that need to be answered by any proposal accessnow.org/cms/assets/upl…