Evan Greer
💻 Director @fightfortheftr 🎸#SpotifyIsSurveillance out @GetBetterRecs x @DonGiovanniRecs 💅 she/they, opinions mine. 🐘 https://t.co/R6N5dN1Co7

Sep 20, 2020, 8 tweets

It feels like part of the reason that mainstream tech discourse has latched on so much to the specific problem of bias in AI, especially facial recognition, is that people are uncomfortable questioning the validity of institutions like policing.

It's much easier and safer to say "This software might be biased and therefore police shouldn't use it until it works right" than it is to say "This software will help police perform the functions of policing faster and more efficiently, and that in and of itself is a bad thing."

The same could be said for corporations looking to use AI and things like face recognition for marketing, customer experience, etc. Yes, bias in these systems can exacerbate discrimination, but using software to extract ever more profit from humans is problematic from the start.

Most mainstream articles about facial recognition basically say something like "Privacy advocates have raised concerns that the software exhibits racial and gender bias." This is true, and those flaws are deeply concerning, especially given the way this software is being used RIGHT NOW.

But the reality is that facial recognition surveillance will still be used to enforce white supremacy if and when the algorithms improve and the bias issues are "addressed." Layering tech on top of inherently unjust systems simply automates and amplifies their injustice.

None of this is to diminish the absolutely crucial work being done by researchers and advocates exposing and documenting the ways that current facial recognition and other AI systems exhibit systemic racial and gender bias. That information is essential to having a real discussion.

But my point is that pundits and tech writers consistently using "bias" as shorthand for what is actually a much broader set of systemic problems misleads people into thinking that this is an issue that can be easily fixed by addressing the bias, retraining the software etc.

It's not hard to see parallels with the ways reformists approach systemic racism in policing: they call for more training, more investigations, more bureaucracy and safeguards, rather than recognizing that some things just need to be abolished. Facial recognition is one of em.
