Okay! This will be the thread head for the third session of #pepr20, which will re-convene after the birds-of-a-feather breakout sessions, in about ten minutes.
We have a secret motive for tweeting this, it helps us pay attention. Our brain doesn't cling to things unless we're using *all* of our brain.
Okay, the theme of this next block of talks ("session") is design. So now we're on slack channel 3. #pepr20
"How to (In)Effectively Convey Privacy Choices with Icons and Link Text", by two academics. #pepr20
Privacy choices are "very abstract", so people tend not to read them. (Very perceptive!) #pepr20
One way to deal with this might be icons. Icons direct people's attention, they're concise, and when designed properly they can mean the same thing across cultures. #pepr20
Icons are increasingly encouraged by regulations, including GDPR and CCPA. #pepr20
However, the actual history is "checkered"... These examples shouldn't be taken too harshly. First example: four Mozilla icons from 2011. What are they for? #pepr20
It turns out: Retention; third-party use; data shared with ad networks (a megaphone); one more. #pepr20
(That went by very quickly, which must be extremely tempting when pre-recording a segment like this. It would have been nice to see it a bit longer.) #pepr20
Now we have some icons from Disconnect. These ones are simpler, which is nice, but what do they all mean? #pepr20
Ooh! We know this one, it's the AdChoices icon. Yay lol, we now have a non-Google-internal citation for the fact that people don't recognize this icon. #pepr20
(It was highly relevant to our own work, otherwise there's no way.) #pepr20
Here's some CCPA text about the requirement to put a link to the "do not sell" feature. The regulation says there must be a link, and there may be a button or logo, but gives no guidance on its appearance. #pepr20
The work being presented did an iterative design process with user testing. They did a bunch of ideas on post-its and grouped them on a whiteboard... #pepr20
Concepts: Stop, no sharing, opt-out, no info, refuse, withdraw something... a couple other clusters. #pepr20
Some ideas around choice and consent used checks and Xes. Opting out concepts focused on taking things out of boxes and folders. Do not sell concepts focused on "no" symbols with slashes, and on the stop sign. #pepr20
Their first evaluation used 240 participants, from Mechanical Turk. Half the participants saw it WITH the phrase "do not sell my personal information", and half saw it with no wording. #pepr20
Users were first asked to say what that one icon meant. Then they were asked to pick the best icon out of twelve. #pepr20
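(Not the researchers' actual tooling — just a minimal sketch of how a balanced between-subjects split like "half see the tagline, half don't" is typically randomized. The condition names and `assign_conditions` helper are our own invention for illustration.)

```python
import random

def assign_conditions(participant_ids, conditions, seed=None):
    """Randomly assign participants to conditions in equal-sized groups."""
    rng = random.Random(seed)
    ids = list(participant_ids)
    rng.shuffle(ids)
    # Deal the shuffled participants out round-robin so group sizes stay balanced.
    return {pid: conditions[i % len(conditions)] for i, pid in enumerate(ids)}

# 240 participants split between the two wording conditions.
groups = assign_conditions(range(240), ["with_tagline", "no_tagline"], seed=7)
counts = {}
for cond in groups.values():
    counts[cond] = counts.get(cond, 0) + 1
print(counts)  # two balanced groups of 120
```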
Participants liked the stylized toggle button, with a check and an X, for "best conveys choices about personal info". For "do not sell my personal information", they liked the dollar sign with the slash. #pepr20
So they refined the icons, added some color... The toggle, now blue, best conveys choices. The dollar sign was still the winner for "do not sell", but without much enthusiasm. #pepr20
The toggle conveys accept/decline... great. The slash dollar, people mostly interpreted as something being free, or cash not being accepted. #pepr20
The AdChoices icon, people mistook for a "play" button. #pepr20
Similarly, they iterated on the taglines. "Do not sell", "don't sell" ... "personal information" ... "choices", "privacy" ... 540 participants. #pepr20
The word "sell" didn't make sense except when alongside "personal information". #pepr20
"Options" and "choices" worked better than "opt-outs". Winners: "Privacy choices", "privacy options", "personal info choices". #pepr20
Now a study with 1416 participants, putting the pieces together. Mocked up the website of a fictional shoe vendor. #pepr20
Each participant saw one variation. Either a tagline + icon, or just one or the other. #pepr20
"Personal info" people took to mean shoe size. Slash dollar people took to be about encrypted payments. People mistook the toggle icon for a real toggle. #pepr20
None of the icons worked well without the taglines. The icons also didn't do much to change perception of the taglines. Still useful, though, because the icons draw attention. #pepr20
They made some recommendations to the office of the California Attorney General. #pepr20
They went with the blue toggle. The AG put out proposed regulations including a red toggle... kind of similar, but red... Looks even more like an actual toggle. #pepr20
This slide shows some tweets critiquing the red toggle. #pepr20
So the researchers tested the blue and red toggle against each other. They also tested a version of the red toggle with a larger X, and the icon from the red toggle but in blue. #pepr20
They found that the size of the X didn't help. However, there was a big difference between the California icon and theirs... the California one was much more likely to be misinterpreted as a real toggle. #pepr20
Color turned out not to make much difference. #pepr20
(It does seem like a very hard problem to make an icon that looks like a UI element but isn't mistaken for one.) #pepr20
The regulatory guidance got revised, with a promise to do research. That hasn't happened yet. These researchers recommend using their blue toggle. :) #pepr20
Take-away: Involve consumers from the start. We as experts will have our pre-conceptions that confuse things. #pepr20
Before releasing icons into the wild, test noticeability, discoverability, recognizability... #pepr20
Utility - is the choice behind the icon useful? Behavior - do people click it? The researchers did A/B testing. #pepr20
This can be done online, e.g. with Mechanical Turk. You can also create your own panel and recruit users however you want; it depends on your goals. #pepr20
Wide adoption is not the same thing as good design. (OUCH at the AdChoices logo being next to that remark!) #pepr20
The researchers also believe it's important to standardize privacy indicators. It doesn't seem scalable for each law to require a separate icon. #pepr20
These authors contributed to a recent IAPP book with more thoughts. #pepr20
Question: Could this approach be generalized to other privacy icons?
Answer: Yes absolutely, that's what we're advocating for. #pepr20
Question: Did mturk testing always put the questions in the same order?
Answer: Yes, except in a few places which were randomized to avoid priming. The part with just one icon was put first in the study. #pepr20
Question: Were you able to get a sense of participants' privacy literacy?
Answer: A little. We described the California law and asked if people knew of anything like that in the US, and if so, what? Hardly anybody did. #pepr20
The thing about mturker panels is that, compared to telephone samples, they are more computer savvy but not necessarily more knowledgeable about privacy. #pepr20
Question: What about making this a link instead of a button?
Answer: Haven't given it a lot of thought. The law requires a link; the button is optional. So we tested the button. #pepr20
Had we turned the words into a button instead of a link, would that have helped? We don't know. #pepr20
Next talk! "Beyond the Individual: Exploring Data Protection by Design in Connected Communal Spaces" by Martin J. Kraemer. #pepr20
Using internet stuff raises data protection concerns. Legislation is traditionally about an individual right to privacy. GDPR excepts "households" from consideration. CCPA mentions households as deserving of protection, but doesn't define details. #pepr20
In the literature, privacy beyond the individual gets discussed but often without understanding of what privacy means in communal spaces. #pepr20
What are communal spaces, and why are they interesting? #pepr20
Shared spaces: homes, cafes, airports. Increasingly equipped with internet connectivity. Heterogeneous groups of individuals with dynamic social structures. Both attributed and un-attributed responsibilities, and varying skill levels with tech. #pepr20
Privacy is highly context-dependent and situational, so it's hard to understand what it means in these contexts. How can research help? #pepr20
One approach: co-design workshops. Inspired by the _Future Workshop_ format, to investigate issues where there's no clear theory. Part A: critiquing the present. Part B: envisioning the future. Skipped the implementation phase. #pepr20
They decided to facilitate these workshops using design techniques and artifacts. #pepr20
Part A has three stages. First, explore social and physical aspects of shared spaces, because we know it's relevant. Do this with affinity diagrams and sketching. As context scenarios they chose "home" and "cafe". #pepr20
Second, attune to popular data protection issues. They used two examples - browser bar padlocks, and mobile application permissions. A researcher led this part. #pepr20
Third, reflect on further issues. They provided a list of guiding questions in case participants were still having trouble with the topic. #pepr20
Phase B, about envisioning the future, actually does design work. Also three steps. #pepr20
Step one, design - pick one of the shared spaces from part A and work with that. #pepr20
Step two, re-design - either re-do their own sketch, or take a different group's.
Step three - reflect on the whole process and present their solution. #pepr20
They first tried this with some groups of students and researchers from their university. The task was to design a "good" internet experience. #pepr20
Now we'll see two case studies from the workshops. The first one talks about "data flush", the second talks about putting a password in the kitchen cupboard. #pepr20
The group that came up with "data flush" was working on a sketch of a cafe created by another group. Their goal was to aim for anonymity on a cafe wi-fi, during and after use. They wanted devices to be remembered for 24 hours, for usability. #pepr20
The group said that if they come back another day and are prompted to log in again, that's a good way to communicate that their data is gone. #pepr20
The group discussed how to locate the data flush. They initially considered putting a button on the table, but then realized some people might leave earlier than others... so moved it to the door. #pepr20
People who are not friends may not arrive at the same time, and may not want to interact with each other or share the same table. #pepr20
A different group came up with a solution for the home. They were asked to consider different personas. #pepr20
They came up with a "connect button", similar to what already exists. #pepr20
They suggested that a user who does have expertise could help others connect, even if that user isn't around in person. #pepr20
They considered that it doesn't help much to require an expert if the expert isn't available. They realized that they can put a password inside the cupboard, using what they called a "semi-public space". #pepr20
Using students as participants means people have limited life experiences, especially around home ownership. #pepr20
Using the design artifacts did help participants to re-adjust their solutions. For example, the personas helped to clarify whether the solutions were useful to everyone. #pepr20
One group discussed how rules and relationships were much more relatable than social norms and obligations. #pepr20
Findings: There is definitely some point to framing these discussions around particular types of social groups, such as friends. #pepr20
There should be a focus on the relationships of people in shared spaces. Spouses? Co-workers? #pepr20
The kitchen cupboard example is a good illustration of how creative participants can be. #pepr20
Conclusions: Communal spaces and relationships in them are important to consider with this stuff.
Contextual artifacts are useful to these study groups. #pepr20
Question: Why the focus on data flush, rather than on limiting retention to begin with?
Answer: From the participants' perspective, the assumption was that there was no way to avoid storing information. Also they were concerned about convenience. #pepr20
Question: This work is very focused on physical spaces. What about digital ones?
Answer: Yes, it is. There's already research on social networks. #pepr20
The physical space makes a difference, which research on social networks has been ignoring. #pepr20
Question: How might this work when people are coming into and out of the space, such as a space with sensors (didn't catch what kind)?
Answer: The pilot showed that people have lots of ideas. #pepr20
Participants used the physical environment and their knowledge of it to make sense of the problem. #pepr20
Question: What about the cafe learning that the people know each other?
Answer: Yes, suppose so. How different does technology make that, vs. it being plainly visible? Don't know, and participants didn't consider it. #pepr20
Question: How transferable is this to a corporate context?
Answer: Would be interesting. Very different social structure, since participants there have clearly defined roles and responsibilities. #pepr20
(Unclear whether the question was about transferring the *study participants*, or the *subject matter*) #pepr20
Next talk! "Throwing out the checklist", Dan Crowley, from Quizlet. #pepr20
The talk will be about how they grew their organization, replacing a checklist with a "privacy by ethos" culture. Scott McNealy, 1999: you have no privacy, "get over it"... but this was about governments! Public interest has only increased since then. #pepr20
Regulators have established new rules. The public understanding has increased a lot, too. #pepr20
Regulators gave us a framework: Privacy by design. Seven principles make it up. 1: Proactive, 2: Strong defaults, 3: Embedded in the org, 4: positive-sum... missed the rest #pepr20
There's been a lot of people working on this stuff, but there are clearly still unanswered questions about how to make it work for companies and for users. #pepr20
We know what good programs look like, right? Triage questionnaires, privacy impact assessments, other paperwork. Is that right for every company at all times? #pepr20
All these processes do have pitfalls. When visibility is limited, we can't fix what we don't see. Sometimes teams are under-staffed. Sometimes approval processes become political. There can be culture clash; privacy process is perceived as a delay... #pepr20
Small and rapidly-growing orgs are extra-prone to these problems. What do you do? "Throw that checklist in the trash" #pepr20
"Turn everyone else into a privacy person too". It's not the documents you write, it's the culture you leave behind. #pepr20
(We're not sure we would go all the way with this... The paperwork is important.) #pepr20
"Keep a checklist for yourself" but "Focus on culture; ask questions". (That does seem like a good balance.) #pepr20
Ask what could go wrong, but also ask what more could go right? Ask product managers and designers, what can you teach me? Privacy is multi-faceted. "Everyone has something to contribute." #pepr20
"Make it fun"! If your program is a checklist, it's as fragile as the person least likely to follow it. Have prizes, have games to make people understand you're not there to be the boring compliance person. #pepr20
They made literal trust and safety hats and left them in every conference room, so everyone could put on the hat. #pepr20
Start early. Teams need to engage early in the development cycle. That also means build your own culture. The pandemic makes that extra hard! Make sure to make time for Zoom coffee chats, lunch-and-learn sessions, etc. #pepr20
Show people that they are valued stakeholders in privacy, and that you can help them if they help you. #pepr20
Build an "education culture" where people teach each other things. #pepr20
It's important to be humble, candid, and open. (Full agreement from us!) #pepr20
Building this culture early will ensure that you'll be set up to succeed when your program grows to the point that it needs a checklist. #pepr20
Question: How do you create this culture at scale, for rapidly-growing orgs?
Answer: It's hard. A comment in Slack frames this in terms of executive buy-in. If the answer to "what do we do" is "it depends", that's extra-true when the question is about culture. #pepr20
Identify the key stakeholders and try to progressively get more buy-in from those people. #pepr20
Question: How do you decide whether this culture is working?
Answer: Every time I draft OKRs this comes up... Especially if the success is for things to NOT happen (no data breaches, no regulatory inquiries)? #pepr20
There is a lot of communication that needs to happen with executives to convey that privacy needs different types of metrics from other areas. #pepr20
Some things you can track: How quickly is it flagged? How do people respond when hearing about it? (This sounds legally very risky, but cool if your lawyers are cool with it.) #pepr20
Two behaviors they look for: How often do teams reach out proactively? You can have regular check-ins, but some teams also come to ask about future work they're speculating about. That's a great signal when they do. #pepr20
Second, what are teams putting about privacy in their own design documents? If their own processes already think about privacy, you don't need to give them a mandate to check in. #pepr20
Question: About the conversations and games to foster the culture, how well does that sort of activity scale? How do you make time for this when also doing dev work?
Answer: It's about finding the right people who can magnify your efforts. #pepr20
No physical water cooler these days, but Quizlet has a system to randomly match employees with each other for 20-minute conversations. About 250 people at this org, and it's effective at that size. Won't scale to 10,000. #pepr20
Question: How do you get devs and designers invested in this?
Answer: Set it up on a principled basis. It's not about "we are required to do this", it's about identifying what the tangible risks *to real people* are, not the legal risks. #pepr20
There are many things in life where you can make a mistake but then make up for it later. Those are situations where the person being harmed has recourse. Privacy is not one of those situations. You need to avoid the mistake to begin with. #pepr20
Trust takes forever to build, and an instant to lose, and it's very valuable to a brand. #pepr20
Question: Do you feel that GDPR makes it easier to get cultural change?
Answer: Yes and no. The big fines make an impact on executives. #pepr20
There's some skepticism in the industry though because it doesn't seem to have had consequences.
Making cultural changes early on will set us up to not have to worry about that. #pepr20
Question: Would tracking revenue be good for this?
Answer: Yes, but hard to quantify revenue loss due to trust and safety concerns. Works well in business-to-business context where somebody tells you why you lost the sale, but not in business-to-consumer. #pepr20
If you have a way to quantify that, "please talk to me"! And to the Slack channel! #pepr20
Question: (missed it)
Answer: It's a discussion. Look at the risk to the user, try to figure out the actual concern and the likelihood it happens. What paths do we have to fix that issue? Small roll-outs have different mitigations than large ones. #pepr20
Need to think about shared devices, communal spaces... these are higher risk factors. #pepr20
Question: What's on your privacy roadmap?
Answer: As an educational tech platform, we mostly have student notes. We recently launched an AI thing that broadens how people use our site, how do we make that safe? #pepr20
When you build stuff that measures how much students understand, that requires a lot of care. How do you build a safe model that doesn't rely on individual data? #pepr20
Finally: Does the presenter have a question for the audience?
"Less of an ask than an exhortation": Be an ally. Have patience with your attorneys and your compliance folks, we're here to help you engineers and to understand and do the right thing. #pepr20
As much as you can, help "us" (attorneys) understand the roadblocks. If there are engineering blockers, we want to understand it. #pepr20
Now there's a 20-minute break. Yay! Time to glance at the Slack. #pepr20
To keep things organized, we'll move to a new thread after the break.
We're live-tweeting PEPR20! After the break, this will be the thread head for the fourth block of talks ("session"), which will be the last one for the first day. #pepr20
"Product Privacy Journey: Towards a Product Centric Privacy Engineering Framework", by Igor Trindale Oliveira, is now starting.
Why a product-centric approach? Other possible focuses would be compliance, design, engineering, users... #pepr20
Okay! We're back from break. The talk title went by very quickly, ... now there's a pause, hopefully the speaker will introduce themselves again. #pepr20
According to the schedule, this one should be "Building and Deploying a Privacy Preserving Data Analysis Platform", by Frederick Jansen. #pepr20
Okay! We will be live-tweeting #PEPR20, the USENIX conference on Privacy Engineering Practice and Respect. Feel free to mute that hashtag if you don't want to drown in tweets.