First up! A panel about encrypted messaging with @Riana_Crypto, @mattblaze, @djweitzner moderated by @joncallas
Important limits: the gov't can't dictate *how* to comply; carriers just have to meet the goal.
So it doesn't require building backdoors.
Law enforcement has been chipping away at those carve-outs ever since CALEA was passed, e.g. VoIP is now treated as a covered "telecommunications service".
(Brooklyn Apple v FBI 2016 case)
Chat apps are information services. There's a Facebook Messenger case we only know about through leaks, where the gov't (allegedly) tried to force Facebook to change Messenger's encryption.
Veteran of two crypto wars. We didn't know we'd need to number them in the first one!
In #1 in the 1990s, the advocates of crypto & security were visionaries: they had to have faith that the internet and its security were going to be important any day now.
Everyone here probably believes that computer security is important and *not* too good. It's a mess, and crypto is one of the few tools that works. Taking it away, or making it more complicated & expensive, would be a disaster.
* $20 was a reasonable price for a chip to use crypto -- so no software crypto!
* the killer app was voice communication on landline phones. Mobile phones were clunky and $$$
* FAX machines were really important
All of these underlying assumptions proved to be laughably false. What are our wrong engineering assumptions now?
@Riana_Crypto: this is the most Gen X thing ever: in the 90s you fought against the man and now you want to maintain the status quo.
2016: UK Investigatory Powers Act, the "Snooper's Charter"
2018: Australia - Assistance and Access Act
2020: India proposed filtering and decryption reqs on internet platforms
In the UK, the gov't can issue "Technical Capability Notices" (demands to redesign systems so they can decrypt). But notices must be technically feasible, as judged by an advisory board. Evaluations and reqs are secret.
No transparent process.
This shift in the debate is going to be permanent. It's a good solution (for the legislators).
* Susan Landau and Denis McDonough suggested looking harder at device encryption
* Carnegie Encryption Working Group suggests testing
Tech:
* what is the right measure of "technical feasibility"?
* how do we know when a vulnerability is "systemic"? How can we assess the relative security costs?
* can security vulns be detected and evaluated in secret?
* how do we assess the relative risks of exceptional-access systems, which could open up new vulns, vs. limiting law enforcement access?
* do all of these "assistance" requests have to be secret?
* what is the effect of secrecy on user trust and technical security properties?
When I joined the ACLU they asked what I wanted to work on and I said "please not encryption backdoors"... and the world has found me.
* it is a basic human right for two people to talk confidentially no matter where they are
* public posts are public. Integrity is important, availability is the whole point
* there's a huge grey area between private and public
* knowing the difference can be hard
* today's crypto wars are driven by a need to solve real problems
* I'm going to lump them together as "abuse": child abuse, intimate partner abuse, elder abuse, misinformation, disinformation, attacks on accepted norms, validity of governance
@joncallas points out that a) he's against CSAI (child sexual abuse imagery) and other forms of abuse, and b) it's dumb that he needs to point this out
New considerations and work:
* new design principles: privacy and security by design; tools for the platforms, people themselves, caregivers; considerations for meta-abuse (abuse of the anti-abuse system)
* how to handle unsolicited contact?
* easier reporting/blocking
* voluntary ML advice on content; fact-checkers
* data provenance, limitations on forwards, group size (see the sketch after this list)
* social graph analytics, better profile handling
* context-dependent behaviour based on personal status
* build on Screen Time etc. for analytics for caregivers
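A minimal sketch of one idea from the list above -- forward limits riding on provenance metadata, in the spirit of WhatsApp's forwarding caps. Everything here (names, thresholds, the `Message` type) is an illustrative assumption, not any real platform's API:

```python
# Hypothetical client-side forward counter (illustrative only).
from dataclasses import dataclass, replace

MAX_FORWARDS = 5          # assumed cap: client refuses to fan out past this
HIGHLY_FORWARDED_AT = 2   # assumed threshold for labelling the message

@dataclass(frozen=True)
class Message:
    body: str
    forward_depth: int = 0  # provenance metadata carried with the message

def forward(msg: Message) -> Message:
    # The sending client enforces the limit locally, so the server never
    # needs to read the (end-to-end encrypted) body to apply it.
    if msg.forward_depth >= MAX_FORWARDS:
        raise PermissionError("forwarding limit reached")
    return replace(msg, forward_depth=msg.forward_depth + 1)

def label(msg: Message) -> str:
    # UI hint for recipients, akin to "forwarded many times" banners.
    return "forwarded many times" if msg.forward_depth >= HIGHLY_FORWARDED_AT else ""

m = Message("hello")
for _ in range(3):
    m = forward(m)
print(m.forward_depth, label(m))  # -> 3 forwarded many times
```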
Q: can you explain what you did with the Clipper Chip, @mattblaze?
A: found a simple vuln that allowed circumventing the key-escrow part (toy sketch after this answer). They could prob. have fixed that design flaw, but it demonstrated that crypto protocol design is really hard -- it might have had other vulns!
1. A DB of keys. You really, really have to protect that, and it's very, very hard.
2. Escrow was expensive as a design constraint, which made it much cheaper not to bother with encryption at all.
[ this is known as the "nerd harder" argument ]
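On that Clipper vuln: the published attack hinged on the LEAF (Law Enforcement Access Field) carrying only a 16-bit integrity checksum, so a rogue sender could spray random LEAFs until the receiving chip accepted one -- a working call whose LEAF escrows nothing. A toy reconstruction (the real checksum and Skipjack internals are classified; the hash below is just a stand-in):

```python
# Toy model of the 16-bit LEAF checksum brute-force (not real Skipjack/EES).
import os
import hashlib

IV = os.urandom(8)  # the checksum was computed over LEAF material + IV

def _checksum16(payload: bytes, iv: bytes) -> int:
    # Stand-in for the chip's internal checksum function.
    return int.from_bytes(hashlib.sha256(payload + iv).digest()[:2], "big")

def chip_accepts(leaf: bytes, iv: bytes) -> bool:
    # The receiver validates only this 16-bit check before interoperating.
    payload, cks = leaf[:-2], int.from_bytes(leaf[-2:], "big")
    return _checksum16(payload, iv) == cks

tries = 0
while True:
    tries += 1
    bogus = os.urandom(18)  # random payload + checksum, no escrowed key inside
    if chip_accepts(bogus, IV):
        break
print(f"bogus LEAF accepted after {tries} tries (~2^16 = 65536 expected)")
```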
@Riana_Crypto: interesting tension in the policy debate: "safety valves" in the conversation, where there isn't full protection, e.g. Apple's iCloud backups
shout-out to @ohemorange for the insight that encryption is semantic, not syntactic
@benadida makes everyone stop