1/ Major victory: the UK data regulator (ICO) has just provisionally fined Clearview AI £17 million. Wanna know why?
2/ Clearview AI is in the business of collecting faces online, storing them in a searchable database that now holds 10 billion+ faces, and reportedly selling access to that database to the police (and previously to private companies). nytimes.com/2020/01/18/tec…
3/ This happens without the knowledge or consent of the people whose faces are collected - which can include anyone who has ever posted a photo of themselves on social media, or who appeared in pictures uploaded by friends or strangers.
4/ By selling this face database to the police, Clearview AI puts us all on a watchlist, and threatens our online and offline rights and freedoms.
6/ We argued that Clearview AI's collection of faces violates UK and EU privacy laws.
Today, the UK data regulator agreed - and asked the company to stop collecting and delete the faces of people in the UK.
Plus, they slapped it with a provisional fine of £17 million.
7/ We applaud the ICO's decision, which asserts our privacy rights and sends a clear message to businesses like Clearview - stop playing with our privacy and freedoms, or face the music, too.
ICYMI: The ICO has announced its provisional intent to fine Clearview AI £17 million after our complaint with @NOYBeu, @HermesCenter and @Homo_Digitalis_ in May 🎉
When you upload photos to Instagram and other social media platforms, do you expect them to be scanned for biometric information and stored in a database?
Pakistan's ID system NADRA is yet another eye-opening example of how ID systems can be exclusionary by design, especially when adherence is made compulsory & when such systems are used as gatekeepers to access a wide range of essential goods + services.
But what could go wrong when socio-cultural patriarchal expectations are hard-coded into a one-size-fits-all foundational digital ID system? And what happens to those who don't fit the standard in the eyes of the state?
NADRA relies on the assumption that people belong to families that fit a fixed, traditional, state-defined pattern, and automatically excludes everyone who doesn't. Social expectations as defined by the state have become embedded in its database.
Via @WIRED wired.com/story/pakistan…
1 - @EFF revealed today how the LAPD has been requesting Ring doorbell camera footage of Black Lives Matter protests (eff.org/deeplinks/2021…). Here is a short thread on why this is part of a worrying bigger picture.
1.1 - Over 2,000 public safety agencies have signed formal partnerships with Ring worldwide. The partnerships allow police to use a law-enforcement portal to canvass local residents for footage without warrants. privacyinternational.org/long-read/3971…
1.2 - Warrants exist to protect us. Law enforcement needs to justify access based on reasonable suspicion.
WhatsApp's notification to accept its new policy or lose your account is wrong on so many levels that we need a short thread to talk about it
🧵👇 1/9
First, this goes to show how much Facebook values their users' data over their users.
"Accept our data grab or get out" is pretty far from what consent should look like under laws like GDPR 2/9
Truth is, Facebook probably just wants your phone number 📱 (if they don't already have it) and your contacts' names and numbers. They've used dubious methods to get this data in the past, and this could be another way to obtain it 3/9
🧵[THREAD]🧵 #Covid19 We are concerned about "immunity passports" and the dangers that they can bring to individuals, communities & society.
Our worries aren't pie-in-the-sky; they are based on our & our global partners' experience working on identity and ID over the years.[1/7]
"Immunity passports" risk excluding ppl, making them unable to get employment, to travel & to risk engagements with police. This is a real danger of an D system; our research in Chile shows how people who don't have access to a system are excluded. privacyinternational.org/long-read/2544… [2/7]
Healthcare and these types of intrusive systems are an unhealthy mix. For example, our partners @KelinKenya examined what happened when there was an attempt to introduce biometrics for HIV/AIDS treatment in Kenya and how the community reacted kelinkenya.org/everyonesaidno/ [3/7]