Let's talk tech policy about privacy! Unlike a lot of folks who talk tech policy, I'm tech-primary: my PhD is in cryptography, I've been a software engineer, I've run privacy engineering at big and small tech companies, and my teams and I have developed a lot of the field's techniques.
So what I want to talk about today is how policy can work with and support getting good privacy work done. Regulation is clearly needed. *Good* regulation is needed. Bad regulation can easily make a bad situation worse, and not everyone in the regulatory space understands the technology in depth.
No one understands everything! That's why we collaborate. So let me give you a bit of what I've been thinking from a privacy engineering point of view, in conversation with @yonatanzunger

(And please, ask questions. Any time. My DMs are open.)
#1 - Regulation is most effective when it works with the practitioners within an organization. When costs aren't externalized and can be clearly understood up front, that helps privacy folks within a company explain tradeoffs more easily and relies less on a forward-thinking CEO.
It's difficult to measure privacy and security risk, so it's especially hard for business leaders to weigh against easier-to-quantify financial opportunities and risks. Make obvious costs obvious up front; then privacy folks have more time for more difficult issues.
Appreciating tradeoffs is also hard for many business leaders because, as a whole, these leaders notoriously don’t reflect the communities that their products serve, so they often lack an intuitive sense for these risks.
#2 - Regulation is least effective when we don't know what a good solution looks like.

Compliance effectively relies on turning potential problems into checklists with minimally-ambiguous questions. But if you don’t know what a good answer looks like, you have a problem.
If you don’t know what a good answer looks like, you also don’t know how to write a good checklist. Instead, what you’ll see is compliance specialists making up a checklist that may or may not lead organizations to useful outcomes.
If something is well-understood, we can easily write a checklist for it.
If something is hard but we can at least clearly recognize correct solutions, we can write some kind of checklist for it.
When we enter the realms of poorly-understood tradeoffs, rapidly-changing human factors, and research problems, that’s where I start to worry that regulation may tell people to do things which end up being counterproductive.

"Poorly understood" is where much of privacy is.
I have a hard time seeing regulation in the ambiguous parts of the privacy space play out well until we have some better-systematized answers. Otherwise we are likely to see ossification around what is easy rather than continued seeking after what is right.
#3 - Regulation needs to be chosen carefully; otherwise it will be so expensive that only the big players will be able to comply effectively.

For example, Google spent a lot of time and money on GDPR. A lot. I personally spent hundreds of hours and I’m not cheap.
And Google has an awful lot more privacy engineers than anyone else; I’d already been running the largest privacy engineering training program in the world just in order to build up the team. Google already had a well-developed privacy program backed with technical infrastructure.
I'm at a startup now (@humuinc). We’ve got really serious privacy and security chops compared to even much larger companies… but we also have nearly as much paperwork as those companies and many fewer people to do it.
It’s much, much harder for the many other companies that don’t have that expertise in-house and haven’t been able to hire people who know how to do privacy engineering.

Lack of expertise means a lot of companies didn't know how to approach privacy regulation except as a paperwork exercise.
To be abundantly clear, doing the right thing is *not* optional. But what we can do is systematize the field enough, and build regulation and policy in concert with that where it’s needed, so that we can bring *all* good ideas to the table.
We’re just beginning to see more diversity in who is funded by VCs and I want to see what those companies bring to the world, rather than just seeing what already-big tech companies have in store.
#4 - Regulation has to account for human diversity.

There is often not just one right answer because there is not just one human need. Good rule of thumb: if someone offers you a one-size-fits-all answer in privacy, it’s probably wrong somewhere.
For example, Privacy by Design says that you must always default to the most protective option. But “most protective” is not always the same. When Fancy Bear is out there spearphishing politicians to influence elections, is it more protective to scan email for malware or not?
There are many questions where the answer is not obvious or not the same for everyone. If you're asking your users meaningful questions, it's because you don't already know the answers.
I want regulation to encourage understanding of and respect for humans, especially as our understanding grows and humans change.
#5 - Regulation should let practitioners and academics learn.

Right now, people in industry don’t talk about near-misses, let alone vulnerabilities, let alone incidents, because they’re terrified of being sued.
Right now, people outside of industry can speculate about failures, but lack real data. Having seen a lot of incidents in my last job, I can safely say that the speculation is often wildly incorrect.
Big companies may be able to get enough data about what causes failures and near-misses to learn. Academia, civil society groups, and many small companies are working without important pieces of the puzzle.
I want fewer, less-severe incidents. But to get there, we need to talk about what works and what doesn’t work and why it doesn’t work, which means we need to carve out some kind of space for that discussion.