Big news: @jeremyphoward & I have moved to his home country of Australia (he is not a USA citizen & has been wanting to return for years). I’m excited about the move, although it is bittersweet 1/
The last year in the USA has been horrifying. I’m lucky in so many ways: we were able to isolate pretty strictly as a family of 3 (a privilege that many did not have) & our daughter thrived at home with us, although it was still hard to go for a year with... 2/
no in-person childcare; not seeing any of my friends; hearing regularly that mass preventable death is okay as long as it is mostly people with chronic illness (like me), the elderly, & BIPOC dying; and worry of not being able to access an ER or ICU 3/

These are not the only or even the primary reasons for our move, but they did help shape my thinking.

I have so many loved ones in the USA & I will certainly come back to visit. I still love San Francisco. 4/
I’m sad to leave my dream job as Director of the Center for Applied Data Ethics at USF. I will be joining the USF Data Institute Advisory Board to stay involved. 5/
I’m excited about Australia & what the future holds. We will be living in Brisbane. We are still in mandatory quarantine (on day 11 of 15). I’m looking forward to connecting with others in Queensland working on data ethics, algorithmic bias, surveillance, & more in the future 6/


More from @math_rachel

20 Feb
Calculating the souls of Black folk: predictive analytics in the child welfare system

powerful & informative talk by @UpFromTheCracks at @DataInstituteSF Center for Applied Data Ethics seminar series, video now online
If there was a material benefit from the family regulation system (child welfare system), middle class white people would be seeking it out for their kids.
The child welfare system is not biased, it is racist.

Racist in the Ruth Wilson Gilmore sense of the word: racism is a state-sanctioned and/or extralegal production & exploitation of group differentiated vulnerability to premature death.
19 Feb
Is your machine learning solution creating more problems than it solves? @JasmineMcNealy on how, in focusing narrowly on one problem, we may be missing many others. @DataInstituteSF CADE Seminar Series
21 States Are Now Vetting Unemployment Claims With a ‘Risky’ Facial Recognition System:

"legitimate claimants have also been rejected by the company’s machine learning & facial recognition systems — leading to massive delays in life-sustaining funds"

onezero.medium.com/21-states-are-…
Central considerations when designing algorithms:
- *Who* is the prototype?
- Continuous evaluation & auditing
- We need to normalize STOPPING the use of a tool when harm is occurring. We don't need to keep using a tool just because we've already started. @JasmineMcNealy
4 Feb
The founder of @gridai_ (@_willfalcon) has been sharing some falsehoods about fastai as he promotes the PyTorch Lightning library. I want to address these & to share some of our fast.ai history. 1/
Because our MOOC is so well-known, some assume fast.ai is just for beginners, yet we have always worked to take people to the state-of-the-art (including through our software library). 2/
I co-founded fast.ai w/ @jeremyphoward in 2016 & worked on it full-time until 2019. I've been focused on the USF Center for Applied Data Ethics for the past 2 yrs, so I can't speak to the current state as much, but I can share our history. 3/

30 Dec 20
In computational systems, we are often interested in unobservable theoretical constructs (eg "creditworthiness", "teacher quality", "risk to society"). Many harms are a result of a mismatch between the constructs & their operationalization -- @az_jacobs @hannawallach
A measurement model is "valid" if the theoretical understanding matches the operationalization. There are many ways validity can fail.
Some types of validity:
- content: does the measurement capture everything we want?
- convergent: matches other measurements
- predictive: related to other external properties
- hypothesis: theoretically useful
- consequential: downstream societal impacts
- reliability: noise, precision, stability
26 Dec 20
I made a playlist of 11 short videos (most are 7-13 mins long) on Ethics in Machine Learning

This is from my 2-hr ethics lecture in Practical Deep Learning for Coders v4. I thought these short videos would be easier to watch, share, or skip around

What are Ethics & Why do they Matter? Machine Learning Edition
- 3 Case Studies to know about
- Is this really our responsibility?
- What is ethics? @scuethics
- What do we teach when we teach tech ethics? @cfiesler

Software systems have bugs, algorithms can have errors, data is often incorrect.

People impacted by automated systems need timely, meaningful ways to appeal decisions & find recourse, and we need to plan for this in advance

23 Dec 20
Interested in improving diversity in AI, or in tech in general? I have done a bunch of research on this and have some advice 1/
First, what doesn’t work: shallow, showy diversity efforts (even if they are well-intentioned) aren’t just ineffective, they actively cause harm.

Spend time thinking through your strategy & making sure you can back it up 2/

medium.com/tech-diversity…
For example, if you start a “women & allies” email list and then fire a Black woman for being honest on it, it probably would have been better not to have the email list in the first place 3/

