1 - Some machines use algorithms, like training guides, to learn how to complete tasks as data comes in over time. #AlgorithmicBias #CodedBias
2 - These machines then use what they learn to make big decisions in people’s lives, like:
➡️ who gets hired or fired.
➡️ who receives proper medical treatment.
➡️ who is targeted in police investigations.
3 - Sometimes, in the process of teaching a machine to make decisions, societal biases creep in and encode racism, sexism, ableism, or other forms of harmful discrimination.
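A minimal sketch of how this happens, using invented data (no real employer or system is modeled here): if past human decisions were biased against one group, a standard model trained on those decisions reproduces the bias through proxy features, even when the group itself is never given as an input.

```python
# Hypothetical illustration: synthetic hiring data with historically biased labels.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5000

group = rng.integers(0, 2, n)              # 0 = group A, 1 = group B (made up)
skill = rng.normal(0, 1, n)                # true qualification, identical for both groups
zip_code = group + rng.normal(0, 0.3, n)   # proxy feature correlated with group

# Historical labels: equally skilled candidates from group B were hired less often.
hired = (skill - 0.8 * group + rng.normal(0, 0.5, n)) > 0

X = np.column_stack([skill, zip_code])     # note: 'group' itself is NOT a feature
model = LogisticRegression().fit(X, hired)

pred = model.predict(X)
for g in (0, 1):
    print(f"group {g}: predicted hire rate = {pred[group == g].mean():.2f}")
# The model never sees 'group', yet it learns the historical bias through the
# proxy feature and recommends group B candidates less often.
```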
4 - Robert Williams, a 43-year-old Black man from a suburb of Detroit, was wrongfully arrested in front of his daughters and held for 30 hours after facial recognition software led to his misidentification. #AlgorithmicBias #CodedBias metrotimes.com/news-hits/arch…
5 - Daniel Santos, a teacher in Houston, was FIRED after an “automated assessment tool” didn’t count all of his caring, qualitative work with students, undervaluing his performance and labeling him a “bad teacher”. #AlgorithmicBias #CodedBias chron.com/news/houston-t…
6 - Some have said #AlgorithmicBias is just a technical problem and that, with better data, the issues will be fixed.
But how systems are used is just as important as how well they work!
7 - The more people know about and understand #AlgorithmicBias, the more we can upend the harm it creates.