I’m going to get on a soapbox here for a minute since several of my good colleagues have written extensively about the social implications of this technology. buzzfeed.com/daveyalba/thes…
First of all, we need to bear in mind that this technology, like most facial recognition technologies, is predicated upon a history of institutionalized sexism and racism in image processing. From the use of the Lena image, “the First Lady of the internet...”
To the failure of image processing and machine vision to be tested adequately on people of color. Interestingly enough, this coincides with Hollywood’s inability to properly light black and brown bodies, because they’re on the same family tree of technologies.
And we might also note a continuity of consent violations, from the lack of consent involved in the use of Lena’s image to the collection of facial recognition data above. Which is where we need to start: these technologies are premised not only on cultures of surveillance...
But on maintaining cultures of oppression, ideologies of oppression. Which is to say that these technologies and their deployment are the outcome, the consummation, of the confluence of tech culture and implicit ideologies of oppression, all in the name of keeping us safe.
To wit, I had to use these kiosks twice on my recent trip to Canada, and both times they failed to capture my image while I was wearing my glasses; one time an attendant had to come by and check that the machine was operating properly because it kept failing to capture me.
Which I was amused about, but couldn’t explain to the customs agent that I, and my policed and surveilled body, couldn’t be captured by their surveillance technology due to institutionalized racism in its design.
Additionally, we should view this as being in continuity with the failures of the millimeter wave scanners to appropriately scan Black women’s hair: not that we NEED more surveillance, but that the failure reflects institutional racism.
That said, as many folks on here have pointed out (and you should check out @Wolven, @allergyPhD, @ashleyshoo, and the work of the @AINowInstitute for more on this), the folks doing the implementation and design, and the regulation, are driven by ideologies of oppression.
These ideologies are made manifest in the decisions to collect biometric data, how that data is used, who can use that data, and who is participating in the use of that data. Even the regulatory structures (that don’t exist) will likely be aligned with the ideological biases...
Of individuals who view such technologies as a necessity to keep us safe; and safe from whom is a question we should be asking. Moreover, we should be wary of improvements that allow this technology to better capture people of color, given the hyper-policed nature of our bodies.
Now, to be clear, all of this has been said by folks in STS like @safiyanoble and I’m just repeating it, but the point that I want to make is that STS folks have been warning y’all, WARNING all of y’all, for years that this shit was coming. And now it’s here.
And it’s only going to get worse as the technology gets better, as the oppressive elements of our society grow louder and more insistent about maintaining their oppressive ideologies and their implementation in technology and society. It’s going to get worse in unthinkable ways.
Unthinkable except to the STS folks doing this work. That being said, please listen to us when we try to warn you about racist and sexist algorithms, the implication of oppressive ideologies in our technologies, and the continuities of our technology with our oppressive ideas.
Listen to us because this is the work we’re invested in and the work we do. It’s thankless work, often ignored by the people who should be listening, but it’s necessary. So listen to us when we try to warn you.