ICYMI:
Transgender Awareness doesn't end with a week.
THE TROUBLE WITH GENDER BINARY ALGORITHMS for FACIAL RECOGNITION & AI
(a thread) #TransAwarenessWeek #tdor2021
AI systems are trained to think like those who designed them — typically with a binary, cisnormative conception of gender.
So-called Automated Gender Recognition (AGR) is tech that attempts to automatically classify people as either male or female, a binary conception of gender. The input is typically a picture, video, or social media post of someone.
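To see why this design is exclusionary by construction, here is a minimal, hypothetical sketch (not any real AGR product's code): a classifier whose output space contains only two labels can never return anything else, no matter who is in the input.

```python
# Hypothetical sketch: a binary AGR-style classifier.
# The entire output space is two labels, so anyone outside
# that binary is misclassified by design, not by accident.
LABELS = ["male", "female"]

def binary_agr_predict(scores):
    """Pick the higher-scoring of exactly two labels.

    `scores` stands in for a real model's two-class output
    (e.g. softmax probabilities). The point of the sketch:
    no possible input can ever yield a label outside LABELS.
    """
    return LABELS[0] if scores[0] >= scores[1] else LABELS[1]

# Whatever the model's confidence, the answer is one of two labels:
print(binary_agr_predict([0.9, 0.1]))  # -> "male"
print(binary_agr_predict([0.2, 0.8]))  # -> "female"
```

Even a perfectly "accurate" version of this sketch cannot correctly label a nonbinary person, because the error is in the output space itself.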
When trans people are stopped and searched at airports, [sometimes] the sex on their travel documents does not match their gender presentation. @imajeanpeace law.unimelb.edu.au/news/caide/mac…
In the intro to their book Design Justice, AJL's Dir of Research & Design @schock, who is nonbinary & trans, wrote about how their own experience of being constantly flagged as a risk by airport security scanners teaches us a larger lesson about how tech design tends to reproduce inequality.
Systems that rely on so-called Automated Gender Recognition are dangerous. They are inaccurate, they are non-consensual, and they can result in harmful and potentially fatal outcomes, depending on how they’re used.
Finally, if you are an ally, please amplify trans voices and commit to concrete actions, especially donating to organisations that are led by and focus on Black trans folks.
Instead, @60Minutes featured another study — calling it “groundbreaking” — even though the study itself explicitly cited both @jovialjoy's and @rajiinio's algorithmic bias work as its motivation.
Our founder @jovialjoy spent hours with @60Minutes producers before the show aired, built a custom demo for @andersoncooper, and recommended research to feature. 3/6
BREAKING: Last night during @60Minutes’ episode on facial recognition, the Black women who co-authored pioneering research on #AlgorithmicBias were completely erased.
Read what happened below then take action by joining our fight to be heard ➡️ bit.ly/ajlnewsletter-…
While talking about how AI development often leaves out the marginalized (which leads to incredible harm), @60Minutes not only excluded @jovialjoy, @rajiinio, and @timnitGebru — but miscredited their seminal work on #AiBias.