Twitter's photo cropping algorithm has been shown to have a problem: it seems to prioritize displaying white faces over others. And Twitter isn't alone. (thread) theguardian.com/technology/202…
Zoom has a similar issue.
Even though Twitter's cropping feature isn't the same as face recognition, its algorithm still ends up prioritizing some users over others, a good example of how features that rely on "machine learning" often produce unexpected results.
Most face recognition software also has this problem. Earlier this year, in what should be a red flag to lawmakers, IBM, Amazon, and Microsoft admitted the harm that this technology causes. eff.org/deeplinks/2020…
Three of this year’s EFF Pioneer Award winners have done essential work on facial recognition’s higher error rates for people of color: Joy Buolamwini, Dr. Timnit Gebru, and Deborah Raji. eff.org/awards/pioneer…
Face surveillance erodes everyone’s privacy, chills free speech, and has an outsized negative impact on minority communities. Governments should not use these tools. Rather, they must face the facts about how damaging this technology is to the people they have a duty to protect.