1 - Some machines use algorithms like training guides to learn how to complete tasks as data comes in over time. #AlgorithmicBias #CodedBias
2 - These machines then use what they learn to make big decisions in people’s lives, like:
➡️ who gets hired or fired.
➡️ who receives proper medical treatment.
➡️ who is targeted in police investigations.
3 - Sometimes, in the process of teaching a machine to make decisions, societal biases creep in, encoding racism, sexism, ableism, or other forms of harmful discrimination.
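For readers who want to see the mechanics, here is a minimal, hypothetical sketch (toy data, scikit-learn) of how that creep happens: a model trained on past hiring decisions learns whatever patterns produced those decisions, fair or not, and then applies them at scale.

```python
# Toy illustration of #AlgorithmicBias, not any real hiring system.
# The features, data, and labels below are invented for illustration.
from sklearn.linear_model import LogisticRegression

# Historical records: [years_of_experience, attended_elite_school]
X = [[8, 0], [7, 0], [2, 1], [3, 1], [9, 0], [1, 1]]
# 1 = hired, 0 = rejected. If past managers favored elite schools over
# experience, that preference is baked into the labels...
y = [0, 0, 1, 1, 0, 1]

model = LogisticRegression().fit(X, y)
# ...and the trained model reproduces it: a highly experienced candidate
# from the "wrong" school is still rejected.
print(model.predict([[10, 0]]))  # -> [0]
```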
4 - Robert Williams, a 43-year-old Black man from a suburb of Detroit, was wrongfully arrested in front of his daughters and held for 30 hours after facial recognition software misidentified him. #AlgorithmicBias #CodedBias metrotimes.com/news-hits/arch…
5 - Daniel Santos, a teacher in Houston, was FIRED after an “automated assessment tool” failed to count his caring, qualitative work with students, undervaluing his performance and labeling him a “bad teacher”. #AlgorithmicBias #CodedBias chron.com/news/houston-t…
6 - Some have said #AlgorithmicBias is just a technical problem that better data will fix.
But how systems are used is just as important as how well they work!
7 - The more people know and understand #AlgorithmicBias, the more we can upend the harm being created.
#BlackHistoryMonth
We commence this month with words from our founder, Dr. @jovialjoy, followed by a poem, a 💌, on the value of each life:
"In a year where we will see massive investments in #AI and the deployment of more generative AI systems like #ChatGPT, King’s prescient words on the need to “shift from a thing-oriented society to a person oriented society” ring true not just from the sanctuaries of Georgia...
but also the server farms of giant corporations.
AJL will continue to strive for a world where AI serves all of us, not just the privileged few. We will strive for a society where hue is not a cue to dismiss your humanity."
🚨: Pulse oximeters, the “gold standard” for measuring blood oxygen, were “3X less likely to detect low oxygen levels in Black patients than in white patients.”
🧵 below ⬇️⬇️
.@ushamcfarling reports that pulse oximeters that read oxygen levels less accurately on darker skin could delay potentially life-saving #COVID therapies for Black, Asian & Hispanic patients
via @statnews statnews.com/2022/05/31/fau…
🧵2
🛑 This isn't NEW news.
➡️ In Aug '20, @BostonReview wrote: “A #PulseOximeter must ‘see’ your blood by having light pass through your skin. This should give us pause, since a range of tech based on color sensing is known to reproduce racial bias.” bostonreview.net/articles/amy-m…
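To see where skin tone enters the math, here is a simplified, hypothetical sketch of the standard “ratio of ratios” pulse-oximetry calculation, not any real device's firmware. The constants are rough textbook approximations: the final step maps a light-absorption ratio to an oxygen percentage through an empirical curve fit to calibration-study volunteers, which is exactly where an unrepresentative cohort can bake bias into every reading.

```python
# Simplified pulse-oximetry math (hypothetical, illustrative constants).
# A pulse oximeter shines red and infrared light through the skin and
# compares the pulsing (AC) and baseline (DC) absorption of each.

def estimate_spo2(ac_red: float, dc_red: float,
                  ac_ir: float, dc_ir: float) -> float:
    """Classic 'ratio of ratios' estimate of blood-oxygen saturation."""
    r = (ac_red / dc_red) / (ac_ir / dc_ir)
    # The mapping from R to SpO2 is NOT derived from physics alone: it is
    # an empirical curve fit to data from calibration-study volunteers.
    # Skin pigment affects how much light gets through, so a cohort that
    # under-represents darker skin yields a curve that misreads it.
    return 110.0 - 25.0 * r  # rough textbook linear approximation
```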
ICYMI:
Transgender Awareness doesn't end with a week.
THE TROUBLE WITH GENDER BINARY ALGORITHMS for FACIAL RECOGNITION & AI
(a thread) #TransAwarenessWeek #tdor2021
AI systems are trained to think like those who designed them — typically with a binary, cisnormative conception of gender.
So-called Automated Gender Recognition (AGR) is tech that attempts to automatically classify people as either male or female, a binary conception of gender. The input is a picture, video, or social media post of someone.
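To make the design flaw concrete, here is a toy, hypothetical sketch of an AGR-style classifier (generic model, invented names, not any vendor's system). The bias is structural: with exactly two output labels, every person is forced into one of them.

```python
# Toy sketch of AGR's structural problem (hypothetical model and names).
import numpy as np

LABELS = ["male", "female"]  # the entire output space, fixed at design time

def classify(image_features: np.ndarray, weights: np.ndarray) -> str:
    logits = image_features @ weights              # two scores, one per label
    probs = np.exp(logits) / np.exp(logits).sum()  # softmax over two classes
    # argmax always returns 0 or 1: identities that don't match the
    # designers' two categories are unrepresentable by design.
    return LABELS[int(np.argmax(probs))]
```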
BREAKING: Last night during @60Minutes’ episode on facial recognition, the Black women who co-authored pioneering research on #AlgorithmicBias were completely erased.
Read what happened below, then take action by joining our fight to be heard ➡️ bit.ly/ajlnewsletter-…
While talking about how AI development often leaves out the marginalized (which leads to incredible harm), @60Minutes not only excluded @jovialjoy, @rajiinio, and @timnitGebru, but miscredited their seminal work on #AiBias.
Our founder @jovialjoy spent hours with @60Minutes producers before the show aired, built a custom demo for @andersoncooper, and recommended research to feature. 3/6
Instead, @60Minutes featured another study, calling it “groundbreaking”, even though the study itself explicitly cited both @jovialjoy's and @rajiinio's algorithmic bias work as its motivation.