Serge Egelman
Does his own research. Dir. of Usable Security & Privacy @ICSIatBerkeley. CTO, @AppCensusInc. All opinions are those of his employer(s), and not his own.
Aug 16, 2022 35 tweets 12 min read
In another case of please-take-our-court-filings-seriously-but-ignore-our-marketing-materials-that-make-the-opposite-claims, Kochava is suing the FTC. 1/

Who is Kochava? Kochava is an advertising company: they offer tracking/profiling services, run their own advertising network, and also sell consumer data (they're a registered CA data broker).

You've probably never heard of them, but they collect data from a *lot* of apps.
2/
Jul 14, 2022 31 tweets 7 min read
New research just dropped! @Noura_7N presented our paper at @PET_Symposium yesterday:

"Developers Say the Darnedest Things: Privacy Compliance Processes Followed by Developers of Child-Directed Apps"

Citation and paper here: blues.cs.berkeley.edu/blog/2022/07/1…

1/
We (and others!) have previously shown that online privacy compliance is woefully inadequate (e.g., blues.cs.berkeley.edu/blog/2018/04/2…).

We know that violations of privacy laws are pervasive, even amongst very popular apps and services.

Our question has become, how did this come to be?
2/
Apr 27, 2021 23 tweets 7 min read
We did a thing!

Here’s coverage from @themarkup:
themarkup.org/privacy/2021/0…

Here are the details of the vulnerability and why it’s an issue:
blog.appcensus.io/2021/04/27/why…

1/n Before I go into details of why this is a big deal: I had hoped this statement would make it into the article, because I think it's incredibly important that this be put in context (though it's understandable that it didn't, since I provided it at the last minute).

2/n [image of the statement]
Apr 22, 2021 5 tweets 2 min read
Whenever I hear promises about only sharing “anonymized” or “aggregate” data, it generally turns out to be bullshit. This does a good job explaining why.

One common reason is simply that anonymization often (incorrectly) assumes that other data doesn't exist or won't be released. For example, if a hospital "anonymizes" data by only sharing zip codes, gender, and date of birth, that data, by itself, is seemingly anonymous.

However, combined with datasets that map these fields to additional fields, this data can be easily de-anonymized.
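The linkage attack described above can be sketched in a few lines of Python. This is purely illustrative (the records, field names, and `reidentify` helper are invented for the example, not from any real dataset): join "anonymized" records with a public dataset on the shared quasi-identifiers, and names fall out.

```python
# Hypothetical sketch of a linkage (re-identification) attack.
# All data below is invented for illustration.

# "Anonymized" hospital records: no names, only quasi-identifiers + diagnosis.
hospital = [
    {"zip": "94704", "gender": "F", "dob": "1961-07-01", "diagnosis": "X"},
    {"zip": "94110", "gender": "M", "dob": "1985-03-12", "diagnosis": "Y"},
]

# A separate public dataset (e.g., a voter roll) that maps the same
# quasi-identifiers back to names.
voters = [
    {"name": "Alice", "zip": "94704", "gender": "F", "dob": "1961-07-01"},
    {"name": "Bob",   "zip": "94110", "gender": "M", "dob": "1985-03-12"},
]

def reidentify(hospital_rows, voter_rows):
    """Join the two datasets on (zip, gender, dob) to recover names."""
    index = {(v["zip"], v["gender"], v["dob"]): v["name"] for v in voter_rows}
    matches = []
    for h in hospital_rows:
        key = (h["zip"], h["gender"], h["dob"])
        if key in index:
            # The "anonymous" record now carries a name.
            matches.append({"name": index[key], "diagnosis": h["diagnosis"]})
    return matches

print(reidentify(hospital, voters))
```

In practice this is exactly what happens at scale: zip code, gender, and date of birth uniquely identify a large fraction of the population, so the "anonymized" release plus one auxiliary dataset is enough.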
May 18, 2020 17 tweets 6 min read
I just posted the camera-ready version of our upcoming PETS paper here: blues.cs.berkeley.edu/blog/2020/03/2…

The Price is (Not) Right: Comparing Privacy in Free and Paid Apps. Proceedings on Privacy Enhancing Technologies (PoPETS), 2020(3). This effort was largely led by undergraduate @catherinekshan who will be starting her PhD at Stanford in the fall!

(As well as @irwinreyescom, @AlvaroFeal, Joel Reardon, @primalw, @narseo, @AmitElazari, and @kenbamberger)
Nov 27, 2019 36 tweets 11 min read
A thread on 1st-party vs. 3rd-party cookies, and how tracking companies are actively trying to undermine users' privacy preferences.

First, some basics (this is simplified):

A cookie is a key/value pair (a variable) that allows websites to save data in users' web browsers. 1/

Cookies aren't always bad: they allow settings to be saved, as well as enable things like shopping carts. But they can be (and are) used for tracking people: a website can assign a user a unique identifier, so that they know what pages that user visits on a site over time. 2/
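The tracking mechanism above can be sketched with Python's standard library (this is a minimal illustration of the general technique, not code from the thread; the `handle_request` helper and `uid` cookie name are invented): on a first visit the server mints a random unique ID and sends it back via a `Set-Cookie` header, and the browser then echoes that ID on every later request, letting the site link those visits together.

```python
# Minimal sketch of a server assigning a tracking cookie (illustrative).
import uuid
from http.cookies import SimpleCookie

def handle_request(cookie_header):
    """Process one request's Cookie header.

    Returns (user_id, set_cookie_or_None): a Set-Cookie value is only
    issued on the first visit, when no "uid" cookie was sent.
    """
    cookies = SimpleCookie(cookie_header)
    if "uid" in cookies:
        # Returning visitor: the browser sent the ID back; no new cookie.
        return cookies["uid"].value, None
    uid = uuid.uuid4().hex  # first visit: mint a random unique identifier
    c = SimpleCookie()
    c["uid"] = uid
    return uid, c["uid"].OutputString()  # e.g. "uid=<random hex>"

# First visit: no Cookie header, so the server issues Set-Cookie.
uid, set_cookie = handle_request("")
# Later visit: the browser echoes the cookie; same ID links the two requests.
uid2, _ = handle_request(f"uid={uid}")
```

Every request carrying the same `uid` value can be tied to the same browser, which is all a tracker needs to build a browsing history for that user.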
Dec 26, 2018 11 tweets 3 min read
More research showing why the "privacy paradox" is a misnomer.

For those unfamiliar, the "privacy paradox" refers to the notion that people *say* they care about privacy, but then *act* in ways that are at odds with those stated preferences (e.g., sharing information online). 1/

The idea that people's intentions often don't match their actions is nothing new. This has been well-studied with regard to behavior change in psychology. For example, in Ajzen's "Theory of Planned Behavior" (en.wikipedia.org/wiki/Theory_of…), perceptions of control moderate behavior. 2/
Dec 23, 2018 11 tweets 3 min read
Today's personal story of machine learning run amok (thread):

My research group is looking at the privacy behaviors of paid apps. Doing so requires us to purchase lots of paid apps. The Play Store processes each purchase separately, as opposed to batching them. 1/

So, purchasing, say, 1,000 apps will result in 1,000 credit card authorizations. Banks enforce a maximum number of card authorizations per day. In conducting this study, it appears as though this number is in the double digits. Thus, my card was repeatedly frozen. Repeatedly. 2/
Dec 10, 2018 7 tweets 2 min read
A primer on privacy as "contextual integrity" and why privacy notices on mobile platforms (both Android and iOS) are insufficient for attaining informed consent. Thread:

If your doctor asked for permission to collect your medical history, you would probably say yes.

1/
However, if that doctor asked to collect your medical history to give to marketers for advertising purposes, you would probably decline.

The difference is, in the first case, you're making assumptions about how the data will be used based on who is making the request.

2/