Next up at #PEPR21: Cryptographic Privacy-Enhancing Technologies in the Post-Schrems II Era
from Sunny Seon Kang, Data Privacy Attorney
Going to provide context on CJEU case C-311/18, aka "Schrems II"
This launched companies into a whole tizzy because the ruling said that folks needed "supplementary measures". What the heck is that?
Without Privacy Shield, you can't transfer data from the EU to the US on that basis (thanks, Schrems II -- just as Schrems I killed Safe Harbor before it), because the US isn't considered to have "adequacy" [essentially strong enough protections under the law. People were pissed about Snowden]
So instead we (often) use Standard Contractual Clauses. They're valid, but not always sufficient:
* SCCs don't bind governments -- they're not a party to the contract
* So post-Schrems II you may need "supplementary measures", which can be technical, contractual, or organizational
So you can't just use SCCs or BCRs (Binding Corporate Rules). You may need technical measures as well. On top of all the other ones you should have.
But at first no one knew what those were. There's guidance from the EDPB from November 2020
So what's an example of a technical supplementary measure? Secure multi-party computation (MPC). [cf en.wikipedia.org/wiki/Secure_mu…]
This allows parties to do computation on encrypted data.
For example, we could use secret sharing: you choose several random numbers that add up to your salary and hand one to each party. The other people do the same. Each party adds up the shares it holds, and combining those sums gives the average. You get the average without anyone seeing the individual inputs.
[There are other forms, for example garbled circuits]
[Note that this is only appropriate for certain kinds of computation. It's *very* expensive -- speaking as someone who has spent a bunch of time trying to use this in various applications at scale. Plus it works *very* poorly if you have malicious parties who are trying to mess you up]
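[If you want to see how simple the core idea can be, here's a minimal Python sketch of that additive secret-sharing salary example -- my own illustration, not anything from the talk; the prime modulus, party names, and salaries are made up.]

```python
import secrets

P = 2**61 - 1  # a large prime; all arithmetic is done mod P

def share(value, n_parties):
    """Split `value` into n_parties random shares that sum to `value` mod P."""
    shares = [secrets.randbelow(P) for _ in range(n_parties - 1)]
    shares.append((value - sum(shares)) % P)
    return shares

def reconstruct(shares):
    return sum(shares) % P

# Three (hypothetical) people, each with a salary they don't want to reveal.
salaries = {"alice": 95_000, "bob": 72_000, "carol": 88_000}
n = len(salaries)

# Each person splits their salary and sends exactly one share to each party,
# so party i ends up holding the i-th share from everyone.
held_by_party = [[] for _ in range(n)]
for salary in salaries.values():
    for i, s in enumerate(share(salary, n)):
        held_by_party[i].append(s)

# Each party publishes only the sum of the shares it holds...
partial_sums = [sum(shares) % P for shares in held_by_party]

# ...and combining those sums reveals the total (and hence the average),
# without any single party ever seeing an individual salary.
total = reconstruct(partial_sums)
print(total / n)  # 85000.0
```

[The expensive part of real MPC isn't a sum like this -- it's multiplications, comparisons, and doing all of it robustly over a network, which is where the overhead mentioned later comes from.]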
But hey, the guidance points out that this is awesome and suggests you use it.
[And it is awesome! Encrypted computation should totally be used wherever you can. We're getting better at it over time with cryptographic advances, plus increased computing power helps a ton.]
So you might think, what's the holdup? Why doesn't everyone use it?
[Nope! But most people don't literally have a PhD in encrypted computation. 😆 We should talk about this a ton.]
* Fear of uncertainty. Regulatory guidelines are tech-neutral and don't say "this particular tech meets the bar"
* Overhead and scalability. Cryptographic privacy tech may require "significant configuration" with the legacy system
[I'd say this understates it. I tried to use encrypted computation for something once and Google literally didn't have enough computers available to do it. We would have had to shut down something like Gmail. This is *not* a small number of computers and this was one feature.]
Q: Who will be involved in making those technical guidelines?
A: Regulators and stakeholders and people who actually make these technologies. Also consumers are impacted and they may have feedback. Many multi-stakeholder meetings.
Q: What are the benefits/drawbacks of using 2 or more independent parties?
A: Wouldn't say there are more benefits -- use as many as you have participants.
[ This is not 100% true. You can in some cases use more parties to increase robustness against collusion. Or warrants. But be aware that this usually comes with greater network and computation cost, and more risk that some of your parties will have network or other failure]
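[To make that tradeoff concrete, here's a rough Python sketch of Shamir t-of-n secret sharing -- my own illustration, nothing from the talk. Raising the threshold t means more parties have to collude to learn the secret; keeping t below n means up to n - t parties can fail without losing the data. The price is more shares to distribute and more parties to coordinate.]

```python
import secrets

P = 2**61 - 1  # prime modulus; all arithmetic is done mod P

def split(secret, t, n):
    """Create n shares of `secret`; any t of them can reconstruct it."""
    coeffs = [secret] + [secrets.randbelow(P) for _ in range(t - 1)]
    def poly(x):
        return sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, poly(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 to recover the secret."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, P - 2, P)) % P
    return secret

shares = split(secret=85_000, t=3, n=5)  # 5 parties, any 3 can reconstruct
print(reconstruct(shares[:3]))   # 85000: three shares are enough
print(reconstruct(shares[-3:]))  # 85000: any other three work too
# Fewer than 3 shares reveal nothing -- reconstruction just yields garbage.
```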
My hands tell me it's time to take a break from livetweeting, so I'll see you at the next #PEPR21 session.
It's time to kick off an entire session about data deletion at #PEPR21 (It's hard!) with "Deletion Framework: How Facebook Upholds its Commitments Towards Data Deletion" from Benoît Reitz, Facebook
That's right, come one come all, this is @Facebook's data deletion framework.
We can't expect people to write their own data deletion logic.
* They often don't know how to do it well and write bugs
* The deletion logic and data definition may drift apart over time
So we get annotations that people put on their storage
Annotations example: there are multiple types of edges, like "deep", which says that when a post is deleted, its comments should also be deleted.
If it's a "shallow" edge, deletion removes the association but not the object it points to (e.g. deleting a post doesn't delete the user who wrote it)
It's time to talk about consent at #PEPR21 starting with "Designing Meaningful Privacy Choice Experiences for Users" by Yuanyuan Feng, Carnegie Mellon and Yaxing Yao, University of Maryland
Notice and choice is a legal framework. Privacy notices tell people about data practices; privacy choices give people some (limited) control.
But in practice the controls are usually difficult to find, overly simplified, and sometimes manipulative using "dark patterns"
Dark patterns manipulate people into making choices they might not otherwise make. For example, the terms/policy are linked in tiny type and there's only one button: sign up. Any choices are hidden behind this, which is suboptimal.
First up at #PEPR21 "Privacy for Infrastructure: Addressing Privacy at the Root" by Joshua O’Madadhain and Gary Young from @Google.
Because hey, privacy is a full-stack problem, from humans and the societies they build all the way down to the hardware. Infrastructure is key.
Both Josh and Gary have been at Google for "a while" (I think that's about 15 years each) and are both whizzes when it comes to privacy, especially in infrastructure.
Infrastructure is systems that provide other systems or products with capabilities [capabilities in the everyday sense, not the security kind]
Types:
* storage systems
* network systems
* data processing systems
* server frameworks
* libraries
* system integrations
* etc.
More and more folks want to hire privacy engineers. This is great! You almost certainly need them! But, just like security, privacy engineering is a whole field.
So for the folks who want to hire or become a privacy engineer, a rundown of the current rough types I see. (Big🧵)
First off, let's talk about the two things that people want out of a privacy engineer: (1) privacy-respecting products and systems, (2) compliance.
Compliance is making sure that all the correct paperwork is filled out showing that you followed the rules. Here's the thing...
Compliance is necessarily reactive. It's responsive to failures of the past. If you're doing new things, then you're likely to hit new failure modes. For you, compliance isn't going to be sufficient. Because when things go really wrong, no one cares about paperwork.
Most of us know about the Dunning-Kruger effect, where people who are clueless about a subject are also clueless about how clueless they are. I had not looked at the original study.
Part of it "tests" humour. According to the Cautionary Tales podcast, these are the test jokes:🧵
First off, I find it interesting that there's a "correct" answer. (It's #2, which I found, like many of you, to be too cruel to be funny.) But what I found more interesting is that they determined this "correct" answer by asking a panel of professional comedians.
The Dunning-Kruger study was published back in 1999. There's been an awful lot of change in what is considered funny. There's a lot less tolerance for punching down. Comedians from groups that many professional comedians thought were unfunny (e.g. women) are magically funny now.
@anildash @natematias @ruchowdh @cfiesler FWIW, working with folks to build products and systems which are respectful of the lovely diversity of humans which exist is what I do. I've been lucky enough to work with a bunch of deeply ethical, thoughtful, and smart folks with a range of backgrounds and skillsets.
@anildash @natematias @ruchowdh @cfiesler I can talk about a bunch of things that I've done, places where you can see my work and that of folks like me, I can talk about PEPR, a conference for talking about this sort of work, but what I can't really talk about is the many things that never launched because of quiet chats
@anildash @natematias @ruchowdh @cfiesler Fundamentally, people want to build great systems and products. I try to help them understand that to get to greatness, you need to have respect built in -- folks I've worked with often come out feeling like they've built a better product and know how to design better.