David Gray Widder
Not what is normal, what is right? "Ethical AI" goes in scare quotes. (Un)censored sensor unplugger. Art: IG @davidthewid. PhD @ CMU; ex-NASA/MSR/Intel Labs.
Mar 29 8 tweets 2 min read
🚨 New paper, at CSCW'24!



We examine how existing teams discuss ethics, and how their power dynamics limit what ethical critique can be raised.

And we ask: can speculative futures games help expand these limits? (Hint: 🚫No, not directly)

🧵... arxiv.org/pdf/2403.19049…
[Image: title, authors, and abstract of the linked paper]

🃏 We observe 3 corporate teams and 1 activist group play a game designed to help ask: what could go wrong with AI?

🗣 After, we interview each person individually: what did they feel able to say? how did this differ from their teams' usual ethics discussions?
Oct 16, 2023 14 tweets 4 min read
🚨New paper! 🚨

On *incentives* in AI/ML communities, why AI is … the way that it is. :/

This work is an "origin story" of LLMs, as told by researchers who have been around since long before the public eye and ✨hype✨

And many are conflicted… 🧵

📄pdf: arxiv.org/abs/2310.07715
For example, interview participants spoke of how they're incentivized to produce marginal improvements to get papers accepted, like a 2% model accuracy increase.

Numbers go brrrrr…
Sep 21, 2022 7 tweets 1 min read
tip to new PhD students from a 5th year: *show your advisor tables*

it really doesn't matter what you put in the tables.

PhD advisors LOVE tables. you can put data in tables, and they love that, but don't stop there.

you can put action items, paper sections, RQs, related work, whatever. put it in a table. 100% success rate.
Sep 21, 2022 18 tweets 7 min read
⛓What do modular software supply chains mean for "Ethical AI"?

Developers release discrete modules, or compose existing modules into finished AI systems.

@dawnnafus and I show that this makes it hard to imagine or take responsibility for AI harm.

📄: arxiv.org/abs/2209.09780

[Image: screenshot of the paper's first page]

From interviews with developers building "fundamental" libraries, end-user products, & things in between, we show:

High in the "supply chain", harm feels remote and unimaginable.

Lower in the "supply chain", devs feel unable or unwilling to pay ethical debt accrued upstream.
Sep 19, 2022 7 tweets 3 min read
.@clegoues and I spent the last 45 mins in lively conversation, primarily arguing about what a software "bug" is.

Q: #SoftwareEngineering practitioners and researchers:
How would you define "bug"?

Please RT! Hot takes and informal defs are very welcome!
Apr 28, 2021 5 tweets 2 min read
there are research sensors in our new CMU offices:

microphones measuring sound (not voice), an 8x8 low-res infrared camera, and an accelerometer detecting doors closing and other vibrations.

this is opt-OUT, not opt-IN: data is collected & used for research, except the mic, which is opt-in.

one use case the faculty member in charge of the research suggested was tracking whether our offices are vacuumed by cleaning staff.

this seems like algorithmic management of low-paid employees to me.