🚨: "Gold standard" pulse oximeters were "3X less likely to detect low oxygen levels in Black patients than in white patients."
🧵below ⬇️⬇️
.@ushamcfarling writes about how pulse oximeters that read oxygen levels less accurately on darker skin could delay potentially life-saving #COVID therapies for Black, Asian & Hispanic patients
via @statnews statnews.com/2022/05/31/fau…
🧵2
🛑This isn't NEW news.
➡️ In Aug '20, @BostonReview wrote: "A #PulseOximeter must 'see' your blood by having the light pass through your skin. This should give us pause, since a range of tech based on color sensing is known to reproduce racial bias." bostonreview.net/articles/amy-m…
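For the technically curious, here is a minimal Python sketch of the "ratio of ratios" principle pulse oximeters rely on. The linear calibration SpO2 ≈ 110 − 25R is a widely cited textbook approximation, not any manufacturer's actual curve; real devices use empirical calibrations fit to human study data, and light absorbed by skin pigment sits in the baseline terms that the calibration must account for.

```python
# Minimal sketch of the ratio-of-ratios principle behind pulse oximetry.
# The linear calibration SpO2 = 110 - 25R is a textbook approximation;
# real devices use empirical curves fit to human calibration studies,
# and skin pigmentation affects the measured light absorption those
# curves must account for.

def spo2_estimate(ac_red: float, dc_red: float,
                  ac_ir: float, dc_ir: float) -> float:
    """Estimate blood oxygen saturation (%) from red/infrared signals.

    AC terms are the pulsatile (arterial) components of absorbed light
    at each wavelength; DC terms are the steady baseline, which includes
    absorption by skin, tissue, and melanin.
    """
    r = (ac_red / dc_red) / (ac_ir / dc_ir)  # the "ratio of ratios"
    return 110.0 - 25.0 * r                  # textbook linear calibration

# Example: R near 0.5 corresponds to roughly normal saturation.
print(spo2_estimate(ac_red=0.01, dc_red=1.0, ac_ir=0.02, dc_ir=1.0))  # ~97.5
```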
🧵3
In Dec '20, Dr. Sjoding from @UMich Medical School said: "I had no understanding that the pulse ox was potentially inaccurate & that I was missing #hypoxemia in a certain minority of patients." nytimes.com/2020/12/22/hea…
✒️ @RoniNYTimes
🧵4
Drs have said in articles throughout the #pandemic that they weren't aware of this issue, DESPITE it being documented in the medical literature since the '90s.
➡️➡️In 1990, Dr. Jubran & Dr. Tobin reported that pulse oximetry was almost 2½ X less accurate in Black patients. journal.chestnet.org/article/S0012-…
🧵5
📢Spread the News to the #Excoded to prevent #AIHarms:
Physicians suggest that people with darker skin:
🛑 Question their pulse oximeter results
🛑 Speak with their physicians ➡️➡️ especially if they feel poorly or see any drop in oxygen levels.
ICYMI:
Transgender Awareness doesn't end with a week.
THE TROUBLE WITH GENDER BINARY ALGORITHMS FOR FACIAL RECOGNITION & AI
(a thread) #TransAwarenessWeek #tdor2021
AI systems are trained to think like those who designed them — typically with a binary, cisnormative conception of gender.
So-called Automated Gender Recognition (AGR) is tech that attempts to automatically classify people as either male or female, a binary conception of gender. The input is either a picture, video, or social media post of someone.
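To make that "binary conception" concrete, here is a minimal illustrative sketch (all names hypothetical; this is not any vendor's API) of the structural problem: whatever the input, the classifier's output space contains only "male" and "female", so anyone outside that binary is misgendered by construction.

```python
# Illustrative sketch of the structural problem with Automated Gender
# Recognition (AGR): the output space is hard-coded to two classes, so
# every person MUST be labeled "male" or "female".
# All names here are hypothetical, for illustration only.

from typing import Literal

BinaryGender = Literal["male", "female"]  # the only outcomes the system allows

def classify_gender(image_features: list[float],
                    weights: list[float]) -> BinaryGender:
    """A stand-in for an AGR classifier: a linear score thresholded into
    exactly two bins. Real systems are deep networks, but the binary
    output space (the part being criticized) is the same."""
    score = sum(f * w for f, w in zip(image_features, weights))
    # No "nonbinary", "unknown", or "decline to infer" option exists:
    return "male" if score >= 0.0 else "female"

# Whatever the person's actual gender, the pipeline emits one of two labels.
print(classify_gender([0.2, -0.7, 0.1], [1.0, 0.5, -0.3]))  # "female"
```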
BREAKING: Last night during @60Minutes’ episode on facial recognition, the Black women who co-authored pioneering research on #AlgorithmicBias were completely erased.
Read what happened below, then take action by joining our fight to be heard ➡️ bit.ly/ajlnewsletter-…
While talking about how AI development often leaves out the marginalized (which leads to incredible harm), @60Minutes not only excluded @jovialjoy, @rajiinio, and @timnitGebru — but miscredited their seminal work on #AiBias.
Our founder @jovialjoy spent hours with @60Minutes producers before the show aired, built a custom demo for @andersoncooper, and recommended research to feature. 3/6
Instead, @60Minutes featured another study — calling it “groundbreaking” — even though the study itself explicitly cited both @jovialjoy's and @rajiinio's algorithmic bias work as its motivation.