Os — thread, 12 tweets, 4 min read
Big horror news: NIST has been using child pornography, immigration records and the photographs of dead arrestees to train facial recognition. Op-ed by @drnikki , @profwernimont and myself slate.com/technology/201…
I'm going to put up a piece later tonight, probably on my blog, on the human experience of processing and dealing with this kind of work, because it's a lot. People screaming. People crying. Consumed by a machine, without their consent, to hunt down people who look like them. Just....
These datasets are of course massively racially biased; African-American people make up 12.8% of the US population but, thanks to the racism of the carceral "justice" system, almost 50% of the dead arrestee database. The immigrant photos were originally _specifically_ Mexican.
There is no sign that the victims of the child abuse - or anyone else - consented to the use of their photographs for this purpose, and in some tests the dataset was specifically "child abuse images from unsolved cases". Where there has already been no justice.
The people behind using these datasets are the same people who Trump just put in charge of AI standards, and the same people who, in an attempt to make NON-racist AI, recently declared photos of black people just naturally all looked the same
There will be no ethics, no justice, no effective regulation found through giving NIST the dominating role in facial recognition standards - as evidenced by the fact that they've had it for years and saw no problem using some of these datasets _over decades_. NIST won't save you.
But first in a moment of practical academia I am going to go back to rewriting that draft for @ducktopian because I did not expect this piece to get published until some time next week and ~we have a timeline to keep~. No rest for the disconnected.
"Os are you saying there should be an independent group providing regulation for facial recognition" yes, pragmatically, and utopianly, also yes, but by "independent" I mean angry mob and by "regulation" I mean "salting the earth on which computer science departments sit"
Big props to Morning Os for accidentally picking out the best possible t-shirt for this piece dropping
Very much in the “disappointed but not surprised” category that none of the people in my mentions desperately trying to justify it seem to care about or ever make mention of the (disproportionately black and brown) immigration and arrestee photos.
Me: anyway yeah there’s like random-ass immigrants in this dataset and a lot of heavily brutalised people of colour and also some children who a-

Fellow whites:
This is not to downplay the seriousness of the inclusion of CSA photos but: they took dead black teenagers and used them to train surveillance systems and then released their photos online so private companies could train even BETTER surveillance systems. Maybe mention it?