ICYMI:
Transgender Awareness doesn't end with a week.
THE TROUBLE WITH GENDER BINARY ALGORITHMS for FACIAL RECOGNITION & AI
(a thread) #TransAwarenessWeek #tdor2021
AI systems are trained to think like those who designed them — typically with a binary, cisnormative conception of gender.
So-called Automated Gender Recognition (AGR) is tech that attempts to automatically classify people as either male or female, a binary conception of gender. The input is either a picture, video, or social media post of someone.
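To make the design flaw concrete, here is a minimal sketch in Python (hypothetical code, not any real AGR product) of what a binary classifier does: whatever it is shown, its output space contains only two labels, so anyone outside the binary is misclassified by construction.

```python
import numpy as np

# Hypothetical sketch of a binary AGR classifier, not any real system.
LABELS = ["male", "female"]  # the model's entire output space

def agr_classify(face_embedding: np.ndarray, weights: np.ndarray) -> str:
    """Toy linear classifier: positive logit -> 'male', otherwise 'female'."""
    logit = float(face_embedding @ weights)
    return LABELS[0] if logit > 0 else LABELS[1]

rng = np.random.default_rng(0)
embedding = rng.normal(size=128)  # stand-in for features from any photo
weights = rng.normal(size=128)    # stand-in for learned parameters

# For a nonbinary person there is no correct answer the model can give:
print(agr_classify(embedding, weights))  # prints 'male' or 'female', nothing else
```

No amount of accuracy tuning fixes this: the error lives in the output space, not the weights.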
Trans people are sometimes stopped and searched at airports because the sex on their travel documents does not match their gender presentation. @imajeanpeace law.unimelb.edu.au/news/caide/mac…
In the intro to their book Design Justice, AJL's Dir of Research & Design @schock, who is nonbinary & trans, wrote about how their own experience of being constantly flagged as a risk by airport security scanners teaches a larger lesson: tech design tends to reproduce inequality.
Systems that rely on so-called Automated Gender Recognition are dangerous. They are inaccurate, they are non-consensual, and they can result in harmful and potentially fatal outcomes, depending on how they’re used.
Finally, if you are an ally, please amplify trans voices and commit to concrete actions, especially donating to organisations that are led by and focus on Black trans folks.
#BlackHistoryMonth.
We commence this month with words from our founder, Dr. @jovialjoy, followed by a poem, a 💌 on the value of each life:
"In a year where we will see massive investments in #AI and the deployment of more generative AI systems like #ChatGPT, King’s prescient words on the need to “shift from a thing-oriented society to a person oriented society” ring true not just from the sanctuaries of Georgia...
but also the server farms of giant corporations.
AJL will continue to strive for a world where AI serves all of us, not just the privileged few. We will strive for a society where hue is not a cue to dismiss your humanity."
🚨: "Gold standard" pulse oximeters were "3X less likely to detect low oxygen levels in Black patients than in white patients."
🧵 below ⬇️⬇️
.@ushamcfarling reports how pulse oximeters that can't read oxygen levels as accurately on darker skin could delay potentially life-saving #COVID therapies for Black, Asian & Hispanic patients
via @statnews statnews.com/2022/05/31/fau…
🧵2
🛑 This isn't NEW news.
➡️ In Aug '20, @BostonReview wrote: "A #PulseOximeter must 'see' your blood by having the light pass through your skin. This should give us pause, since a range of tech based on color sensing is known to reproduce racial bias." bostonreview.net/articles/amy-m…
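For context on how the device works: a pulse oximeter compares red and infrared light absorption and maps the "ratio of ratios" R to an SpO2 value through an empirically fitted calibration curve. The sketch below uses the common textbook approximation SpO2 ≈ 110 - 25R (real devices use proprietary curves); the bias arises when those curves are fit on calibration cohorts that skew lighter-skinned.

```python
# Hedged sketch of the classic "ratio of ratios" pulse-oximetry estimate.
# The linear calibration below is a textbook approximation; commercial
# devices use proprietary, empirically fitted curves.

def spo2_estimate(ac_red: float, dc_red: float, ac_ir: float, dc_ir: float) -> float:
    """Estimate blood oxygen saturation (%) from red/infrared absorption."""
    r = (ac_red / dc_red) / (ac_ir / dc_ir)  # ratio of ratios
    return 110.0 - 25.0 * r  # empirical line fit to calibration-study data

# R = 0.8 maps to ~90% SpO2 under this approximation.
print(round(spo2_estimate(0.02, 1.0, 0.025, 1.0), 1))  # 90.0

# If melanin shifts the measured red signal in a way the calibration
# cohort never captured, the same true saturation maps to a different,
# often overestimated, reading on darker skin.
```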
BREAKING: Last night during @60Minutes’ episode on facial recognition, the Black women who co-authored pioneering research on #AlgorithmicBias were completely erased.
Read what happened below, then take action by joining our fight to be heard ➡️ bit.ly/ajlnewsletter-…
While talking about how AI development often leaves out the marginalized (which leads to incredible harm), @60Minutes not only excluded @jovialjoy, @rajiinio, and @timnitGebru but also miscredited their seminal work on #AiBias.
Our founder @jovialjoy spent hours with @60Minutes producers before the show aired, built a custom demo for @andersoncooper, and recommended research to feature. 3/6
Instead, @60Minutes featured another study, calling it "groundbreaking," even though that study explicitly cited both @jovialjoy's and @rajiinio's algorithmic bias work as its motivation.