"Wanting it badly is not enough" could be the title of a postmortem on the century's tech-policy battles. Think of the crypto wars: yeah, it would be super cool if we had ciphers that worked perfectly except when "bad guys" used them, but that's not ever going to happen.
1/
Another area is anonymization of large data-sets. There are undeniably cool implications for a system that allows us to gather and analyze lots of data on how people interact with each other and their environments without compromising their privacy.
2/
But "cool" isn't the same as "possible" because wanting it badly is not enough. In the mid-2010s, privacy legislation started to gain real momentum, and privacy regulators found themselves called upon to craft compromises to pass important new privacy laws.
3/
Those compromises took the form of "anonymized data" carve-outs, leading to the passage of laws like the #GDPR, which strictly regulated processing "personally identifying information" but was a virtual free-for-all for "de-identified" data that had been "anonymized."
4/
There was just one teensy problem with this compromise: de-identifying data is REALLY hard, and it only gets harder over time. Say the NHS releases prescribing data: date, doctor, prescription, and a random identifier. That's a super-useful data-set for medical research.
5/
And say the next year, Addison-Lee or another large minicab company suffers a breach (no human language contains the phrase "as secure as minicab IT") that contains many of the patients' journeys that resulted in that prescription-writing.
6/
Merge those two data-sets and you re-identify many of the patients in the data. Subsequent releases and breaches compound the problem, and there's nothing the NHS can do to either predict or prevent a breach by a minicab company.
7/
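The linkage attack described above can be sketched in a few lines. This is a toy illustration with invented records and hypothetical field names, not the NHS's actual schema: it joins an "anonymized" prescribing release against a breached travel log on the quasi-identifiers they share (date and location), which is enough to put names back on random IDs.

```python
# Toy linkage attack: join an "anonymized" prescribing release against
# a breached minicab log on shared quasi-identifiers (date + clinic).
# All records and field names are invented for illustration.

# NHS-style release: no names, just a random ID plus quasi-identifiers.
prescriptions = [
    {"rand_id": "a9f3", "date": "2021-03-02", "clinic": "Leeds-4"},
    {"rand_id": "7c1e", "date": "2021-03-02", "clinic": "Leeds-9"},
    {"rand_id": "d044", "date": "2021-03-05", "clinic": "York-1"},
]

# Breached minicab data: names attached to journeys ending at clinics.
journeys = [
    {"name": "Pat Smith", "date": "2021-03-02", "dropoff": "Leeds-4"},
    {"name": "Sam Jones", "date": "2021-03-05", "dropoff": "York-1"},
]

def reidentify(prescriptions, journeys):
    """Match any prescription and journey sharing a date and location."""
    matches = []
    for rx in prescriptions:
        for trip in journeys:
            if rx["date"] == trip["date"] and rx["clinic"] == trip["dropoff"]:
                matches.append((trip["name"], rx["rand_id"]))
    return matches

print(reidentify(prescriptions, journeys))
# Two of the three "anonymous" patients now have names attached.
```

Note that nothing in the NHS release was "identifying" on its own; the identification comes from the join, and the NHS has no control over what future datasets will be joinable.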
Even if the NHS is confident in its anonymization, it can never be confident in the sturdiness of that anonymity over time.
Worse: the NHS really CAN'T be confident in its anonymization. Time and again, academics have shown that "anonymized" data-sets were re-identifiable from the start.
8/
Re-identification attacks are subtle, varied, and very, very hard to defend against:
When this was pointed out to the (admittedly hard-working and torn) privacy regulators, they largely shrugged their shoulders and expressed a groundless faith that somehow this would be fixed in the future. Privacy should not be a faith-based initiative.
Today, we continue to see the planned releases of large datasets with assurances that they have been anonymized. It's common for terms of service to include your "consent" to have your data shared once it has been de-identified. This is a meaningless proposition.
11/
To show just how easy re-identification can be, researchers at Imperial College and the Université catholique de Louvain have released The Observatory of Anonymity, a web-app that shows you how easily you can be identified in a data-set.
Feed the app your country and region, birthdate, gender, employment and education status and it tells you how many people share those characteristics. For example, my identifiers boil down to a 1-in-3 chance of being identified.
13/
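The Observatory's core insight is that even a handful of coarse attributes rapidly narrows the field. Here's a minimal sketch of the idea using a synthetic population (this is not the Observatory's generative-model method from the paper, just a direct count showing how uniqueness grows as attributes are combined):

```python
import random
from collections import Counter

random.seed(0)

# Synthetic population with only coarse, "non-identifying" attributes.
population = [
    {
        "region": random.choice(["NW", "NE", "SW", "SE"]),
        "birth_year": random.randint(1950, 2005),
        "gender": random.choice(["F", "M"]),
        "education": random.choice(["school", "degree", "postgrad"]),
    }
    for _ in range(1000)
]

def uniqueness(pop, keys):
    """Fraction of people whose combination of attributes is unique in pop."""
    counts = Counter(tuple(p[k] for k in keys) for p in pop)
    return sum(1 for p in pop if counts[tuple(p[k] for k in keys)] == 1) / len(pop)

for keys in [
    ("region",),
    ("region", "gender"),
    ("region", "birth_year", "gender"),
    ("region", "birth_year", "gender", "education"),
]:
    print(keys, round(uniqueness(population, keys), 2))
```

Each added attribute can only split the groups further, so the unique fraction never goes down; with realistic attribute distributions (real birthdates, real post-codes) it climbs much faster than this toy suggests.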
(Don't worry: all these calculations are done in your browser and the Observatory doesn't send any of your data to a server)
If anything, The Observatory is generous to anonymization proponents. "Anonymized" data-sets often include finer-grained identifiers, like the first half of a post-code, which narrow the field even further.
14/
You can read more about The Observatory's methods in the accompanying @nature paper, "Estimating the success of re-identifications in incomplete datasets using generative models."
ETA - If you'd like an unrolled version of this thread to read or share, here's a link to it on pluralistic.net, my surveillance-free, ad-free, tracker-free blog:
"Enshittification" isn't just a way of describing the *symptoms* of platform decay: it's also a theory of the *mechanism* of decay - the means by which platforms get shittier and shittier until they are a giant pile of shit.
1/
If you'd like an essay-formatted version of this thread to read or share, here's a link to it on pluralistic.net, my surveillance-free, ad-free, tracker-free blog:
I call that mechanism "twiddling": this is the ability of digital services to alter their business-logic - the prices they charge, the payouts they offer, the particulars of the deal - from instant to instant, for each user, continuously:
But then he had an urgent discussion with the flight attendant, explaining that as a former senior Boeing engineer, he'd specifically requested that flight because the aircraft wasn't a 737 Max:
The foundational tenet of "the Cult of Mac" is that buying products from a $3t company makes you a member of an oppressed ethnic minority and therefore every criticism of that corporation is an ethnic slur:
Call it "Apple exceptionalism" - the idea that Apple, alone among the Big Tech firms, is virtuous, and therefore its conduct should be interpreted through that lens of virtue.
3/
The news that Gen Z users have abandoned Tiktok in such numbers that the median Tiktoker is a Millennial (or someone even older) prompted commentators to dunk on Tiktok as uncool by dint of having lost its youthful sheen:
A key requirement for being a science fiction writer without losing your mind is the ability to distinguish between science fiction (futuristic thought experiments) and *predictions*. SF writers who lack this trait come to fancy themselves fortune-tellers who SEE! THE! FUTURE!
1/
If you'd like an essay-formatted version of this thread to read or share, here's a link to it on pluralistic.net, my surveillance-free, ad-free, tracker-free blog:
The thing is, sf writers cheat. We palm cards in order to set up pulp adventure stories that let us indulge our thought experiments. These palmed cards - say, faster-than-light drives or time-machines - are *narrative devices*, not scientifically grounded proposals.
3/
Bruce Schneier coined the term "feudal security" to describe Big Tech's offer: "move into my fortress - lock yourself into my technology - and I will keep you safe from all the marauders roaming the land":
But here's the thing about trusting a warlord when he tells you that the fortress's walls are there to keep the bad guys *out*: those walls also keep you *in*.
3/