ICYMI:
Transgender Awareness doesn't end with a week.
THE TROUBLE WITH GENDER-BINARY ALGORITHMS FOR FACIAL RECOGNITION & AI
(a thread)
#TransAwarenessWeek #tdor2021
AI systems are trained to think like those who designed them — typically with a binary, cisnormative conception of gender.
So-called Automated Gender Recognition (AGR) is tech that attempts to automatically classify people as either male or female, a binary conception of gender. "The input is either a picture, video, or social media post of someone."

towardsdatascience.com/towards-trans-…
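
To make the structural problem concrete, here is a minimal sketch of what an AGR-style classifier looks like. This is a hypothetical illustration, not any vendor's actual system; the architecture, labels, and input size are assumptions. The point is that the binary is baked into the output layer itself, so the model cannot return any answer outside it, no matter who is in the photo.

# Hypothetical sketch of an AGR-style binary classifier (illustrative only).
import torch
import torch.nn as nn

LABELS = ["female", "male"]  # the entire output space; no other answer exists

class BinaryAGRSketch(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1),  # toy feature extractor
            nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
            nn.Flatten(),
        )
        self.head = nn.Linear(16, len(LABELS))  # two logits: the binary is hard-coded here

    def forward(self, x):
        return self.head(self.features(x))

model = BinaryAGRSketch()
face = torch.randn(1, 3, 224, 224)                # stand-in for a face crop
label = LABELS[model(face).argmax(dim=1).item()]  # forces everyone into one of two boxes
print(label)

However accurate such a model gets on its two labels, an argmax over two logits always answers "female" or "male"; "neither", "both", or "ask the person" are unrepresentable.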
Trans people are [sometimes] stopped and searched at airports when the sex on their travel documents does not match their gender presentation. @imajeanpeace
law.unimelb.edu.au/news/caide/mac…
In the intro to their book Design Justice, AJL's Director of Research & Design @schock, who is nonbinary & trans, wrote about how their own experience of being constantly flagged as a risk by airport security scanners teaches us a larger lesson: tech design tends to reproduce inequality.
Systems that rely on so-called Automated Gender Recognition are dangerous. They are inaccurate, they are non-consensual, and they can result in harmful and potentially fatal outcomes, depending on how they’re used.

engr.washington.edu/news/article/2…
What can be done?

1. In general, so-called Automated Gender Recognition should not be used. The best way to learn about someone's gender is to ask them!
2. Check out @heathfoggdavis's book Beyond Trans: Does Gender Matter?

amazon.com/Beyond-Trans-D…
3. Have you been misgendered by an AI system? Share your story with us! ajl.org/connect/expose…
Learn more about Trans Day of Remembrance here: tdor.tgeu.org #TDOR2021
Finally, if you are an ally, please amplify trans voices and commit to concrete actions, especially donating to organisations that are led by and focused on Black trans folks.

Here is a list:
docs.google.com/document/d/1dd…

More from @AJLUnited

17 May
On Sunday night, @60Minutes aired an episode on facial recognition and #algorithmicbias, and excluded the groundbreaking work of prominent Black female AI scientists: @jovialjoy, @rajiinio, and @timnitGebru.

#CiteBlackWomen 1/6
Instead, @60Minutes featured another study — calling it “groundbreaking” — even though the study itself explicitly cited both @jovialjoy's and @rajiinio's algorithmic bias work as its motivation. 2/6
And it gets worse.

Our founder @jovialjoy spent hours with @60Minutes producers before the show aired, built a custom demo for @andersoncooper, and recommended research to feature. 3/6
17 May
BREAKING: Last night during @60Minutes’ episode on facial recognition, the Black women who co-authored pioneering research on #AlgorithmicBias were completely erased.

Read what happened below then take action by joining our fight to be heard ➡️ bit.ly/ajlnewsletter-…
While talking about how AI development often leaves out the marginalized (which leads to incredible harm), @60Minutes not only excluded @jovialjoy, @rajiinio, and @timnitGebru — but miscredited their seminal work on #AiBias.

The irony is too much to stomach. #CiteBlackWomen 2/4
This isn't new.

Far too often, we still face erasure.

Read more from our founder @jovialjoy who spent hours with @60Minutes developing the @AndersonCooper interview (and a lifetime uncovering bias in AI) ⬇️ #CiteBlackWomen #VoicingErasure 3/4
25 Apr
❓❔ What is #AlgorithmicBias ❔❓

💻 A short #CodedBias Thread 💻
1 - Some machines use algorithms like training guides to learn how to complete tasks as data comes in over time. #AlgorithmicBias #CodedBias
2 - These machines then use what they learn to make big decisions in people’s lives (see the sketch after this list), like:

➡️ who gets hired or fired.
➡️ who receives proper medical treatment.
➡️ who is targeted in police investigations.
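
To see the mechanism in miniature, here is a hypothetical sketch (invented data and model, not any real system) of how a model trained on skewed historical decisions learns to repeat the skew:

# Hypothetical illustration of algorithmic bias: a model trained on skewed
# historical hiring decisions reproduces the skew. All data here is invented.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 10_000
skill = rng.normal(size=n)          # what we'd like decisions to depend on
group = rng.integers(0, 2, size=n)  # 0/1 protected attribute (e.g. a demographic group)

# Historical decisions: same skill, but group 1 was hired far less often.
hired = (skill + rng.normal(scale=0.5, size=n) - 1.0 * group) > 0

# The model sees group membership as just another input feature.
X = np.column_stack([skill, group])
model = LogisticRegression().fit(X, hired)

# Equally skilled candidates from different groups get different scores.
candidates = np.array([[0.5, 0], [0.5, 1]])
print(model.predict_proba(candidates)[:, 1])  # group 1 gets a much lower hiring score

Simply deleting the group column does not cure this once other features correlate with group membership, which is why independent audits of deployed systems matter.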
