Joanna Booth (@stillawake), 14 tweets
Fun facts from the STEM conference I helped moderate a couple of weeks ago:

women weren't used to test air bags, and that ended up killing women: on average they are shorter and sit closer to the wheel.
leevinsel.com/blog/2013/12/3…
"But a female dummy didn’t become a mandatory part of frontal crash tests until [2012]. For all this time, the average American guy stood for us all.

That may have had a substantial impact on women’s auto safety.
If airbags are designed for the average male, they will strike most men in the upper chest, creating a cushion for their bodies and heads. Yet small women might hit the airbag chin first, snapping their heads back, potentially leading to serious neck and spinal injuries.
In some cases, according to tests with female mannequins, small women were almost three times as likely as their average male counterparts to be seriously injured or killed. A study of actual crashes by the University of Virginia’s Center for Applied Biomechanics found that women wearing seatbelts were 47 percent more likely to be seriously injured than males in similar accidents."
Women weren't used to test AI either: their voices and faces weren't used to train facial and voice recognition systems:
eandt.theiet.org/content/articl…
And you can't deal with gender bias without also talking about racial bias:

"For the darkest-skinned women, the systems were so poor at determining gender that they may as well have been guessing at random (46. 8 to 46.8 per cent error rate)."
(@Eurocentrique trying to remember more examples)
"almost every digital assistant uses a female name and voice. Siri, Google Assistant, Cortana, and Alexa all reinforce the stereotype of the female administrator.
“It’s much easier to find a female voice that everyone likes than a male voice that everyone likes,” Stanford communications professor Clifford Nass tells CNN."
swaay.com/alexa-siri-sop…
'AI’s gender bias is [...] pervasive. The artificial intelligence sector is expected to grow from $21 billion to $190 billion between 2018 and 2025, and the employment demographic is overwhelmingly male.'
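As a quick back-of-the-envelope check on what that growth figure implies (assuming the 2018–2025 window is seven years of compounding; the calculation, not the quote, is mine):

```python
# What compound annual growth rate turns $21bn (2018) into $190bn (2025)?
start, end, years = 21e9, 190e9, 7
cagr = (end / start) ** (1 / years) - 1
print(f"implied compound annual growth: {cagr:.0%}")  # ~37% per year
```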
'AI’s diversity issues affect women as well as other gender minorities like transgender and non-binary individuals, and these diversity issues also continue beyond gender. “Cultural diversity is big too,” says Heather Knight, founder of Marilyn Monrobot Labs in New York City.
Racial underrepresentation in the tech world compounds issues for women from minority ethnic groups. Gender and racial bias in AI are significant enough to have an effect on the way the algorithms themselves are developed, which could have lasting consequences for society'