#BlackHistoryMonth.
We commence this month with words from our founder, Dr. @jovialjoy, followed by a poem, a 💌 on the value of each life:
"In a year where we will see massive investments in #AI and the deployment of more generative AI systems like #ChatGPT, King’s prescient words on the need to “shift from a thing-oriented society to a person oriented society” ring true not just from the sanctuaries of Georgia...
but also the server farms of giant corporations.
AJL will continue to strive for a world where AI serves all of us, not just the privileged few. We will strive for a society where hue is not a cue to dismiss your humanity."
🚨: "Gold standard" pulse oximeters were "3X less likely to detect low oxygen levels in Black patients than in white patients."
🧵 below ⬇️⬇️
.@ushamcfarling writes that pulse oximeters, which read oxygen levels less accurately on darker skin, could delay potentially life-saving #COVID therapies for Black, Asian & Hispanic patients
via @statnews statnews.com/2022/05/31/fau…
🧵2
🛑This isn't NEW news.
➡️ In Aug '20, @BostonReview wrote: "A #PulseOximeter must 'see' your blood by having the light pass through your skin. This should give us pause, since a range of tech based on color sensing is known to reproduce racial bias." bostonreview.net/articles/amy-m…
ICYMI:
Transgender Awareness doesn't end with a week.
THE TROUBLE WITH GENDER BINARY ALGORITHMS for FACIAL RECOGNITION & AI
(a thread) #TransAwarenessWeek #TDOR2021
AI systems are trained to think like those who designed them — typically with a binary, cisnormative conception of gender.
So-called Automated Gender Recognition (AGR) is tech that attempts to automatically classify people as either male or female, a binary conception of gender. The input is a picture, video, or social media post of someone.
Instead, @60Minutes featured another study — calling it “groundbreaking” — even though the study itself explicitly cited both @jovialjoy's and @rajiinio's algorithmic bias work as its motivation.
Our founder @jovialjoy spent hours with @60Minutes producers before the show aired, built a custom demo for @andersoncooper, and recommended research to feature. 3/6
BREAKING: Last night during @60Minutes’ episode on facial recognition, the Black women who co-authored pioneering research on #AlgorithmicBias were completely erased.
Read what happened below then take action by joining our fight to be heard ➡️ bit.ly/ajlnewsletter-…
While talking about how AI development often leaves out the marginalized (which leads to incredible harm), @60Minutes not only excluded @jovialjoy, @rajiinio, and @timnitGebru but also miscredited their seminal work on #AIBias.