Thread by @yonatanzunger: "This thread very much echoes my feelings: Wylie built a weapon, understanding what uses its buyers had in mind, and it did exactly what was […]"

24 tweets, 5 min read
This thread very much echoes my feelings: Wylie built a weapon, understanding what uses its buyers had in mind, and it did exactly what was intended. And he hasn't come to a full moral reckoning with that. 1/
I didn't come up in computer science; I used to be a physicist. That transition gives me a rather specific perspective on this situation: that computer science is a field which hasn't yet encountered consequences.
Chemistry had two reckonings, in the late 19th and early 20th centuries: first with dynamite, and then with chemical weapons. Physics had its reckoning with the Bomb. These events completely changed the fields, and the way people come up in them.
Before then, both fields were dominated by hope: the ways that science could be used to make the world a fundamentally better place. New dyes, new materials, new sources of energy, new modes of transport; everyone could see the beauty.
Afterwards, everyone became painfully, continuously aware of how things could be turned against everything they ever dreamed of.
I don't know the stories from chemistry as well. In physics, I can tell you that everyone, from their first days as an undergrad (or often before), encounters this and wrestles with it. They talk about it in the halls or late at night, they worry about it.
They occasionally even rap about it, like @acapellascience (a physicist, btw) did. (The lyrics are worth listening to carefully)
This isn't to say that physicists are all pacifists. The rift between Edward Teller and J. Robert Oppenheimer after the war was legendary, and both of them had very real reasons to believe what they did: Teller to love the Bomb, Oppenheimer to hate it.
(For those wondering: Teller was part of that generation of Central Europeans who saw exactly how bad things could get in so much detail. They were famous for their determination to make sure things were safe *at all goddamned costs.*
They were infamously not messing around, even though they took a wide range of approaches to it; consider that Edward Teller, John von Neumann, Henry Kissinger, and George Soros were all part of that.)
For a long time, it frightened me that biology hadn't yet had this moment of reckoning — that there hadn't yet been an incident which seared the importance of ethics and consequences into the hearts of every young scientist. Today, it frightens me more about computer scientists.
Young engineers treat ethics as a specialty, something you don't really need to worry about; you just need to learn to code, change the world, disrupt something. They're like kids in a toy shop full of loaded AK-47s.
The hard lesson which other fields had to learn was this: you can never ignore that for a minute. You can never stop thinking about the uses your work might be put to, the consequences which might follow, because the worst case is so much worse than you can imagine.
Even what Chris Wylie did is only the beginning. You hand authoritarian regimes access to modern data science, and what happens? You create the tools of a real panopticon, and what happens?
Those of you in CS right now: if you don't know if what I'm saying makes sense, pick up Richard Rhodes' "The Making of the Atomic Bomb." It's an amazingly good book in its own right, and you'll get to know both the people and what happened.
Think about this problem like SREs, like safety engineers. Scope your failure modes out to things involving bad actors using the systems you're building. Come up with your disaster response exercises.
If you can do it without wanting to hide under a table, you're not thinking hard enough. There are worse failure modes, and they're coming for you. And you will be on deck to try to mitigate them. //
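To make the "think like SREs" tweet above concrete, here is a minimal sketch of what such an abuse-case premortem might look like if you ran it as a game-day script. Everything in it is hypothetical illustration: the `FailureMode` fields, the example scenarios, and `run_premortem` are assumptions for the sketch, not anything from the thread.

```python
# A minimal sketch of an "abuse-case premortem": enumerate failure modes that
# assume bad actors rather than just bugs, and attach a response drill to each.
# All scenario text below is a made-up example.

from dataclasses import dataclass


@dataclass
class FailureMode:
    scenario: str    # what a bad actor does with the system you built
    worst_case: str  # the consequence if it succeeds
    drill: str       # the disaster-response exercise to rehearse now


ABUSE_CASES = [
    FailureMode(
        scenario="A buyer of the ad-targeting API builds psychographic voter profiles",
        worst_case="Covert, targeted political manipulation at national scale",
        drill="Rehearse detecting and revoking bulk API access; notify affected users",
    ),
    FailureMode(
        scenario="An authoritarian regime seizes the location-history store",
        worst_case="Identification and persecution of dissidents",
        drill="Rehearse data minimization: can the store be purged or rendered useless in time?",
    ),
]


def run_premortem(cases: list[FailureMode]) -> None:
    """Walk the team through each abuse case, the way SREs run a disaster game day."""
    for case in cases:
        print(f"Scenario:   {case.scenario}")
        print(f"Worst case: {case.worst_case}")
        print(f"Drill:      {case.drill}\n")


if __name__ == "__main__":
    run_premortem(ABUSE_CASES)
```

The point of the exercise is the list itself, not the code: if every entry reads like a bug report rather than a weapon in someone else's hands, the scoping is too narrow.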
Short postscript: As several people have pointed out, many fields of biology *have* had these reckonings (thanks to eugenics and the like), and civil engineering did as well, with things like bridge collapses in the late 19th century.
Civil engineering responded to this by developing codes of ethics and systems of professional licensure which shape it to this day. I've been wondering about this a lot, recently: whether we should be doing the same in CS.
That is, ethical codes with teeth, and licensing boards with the real ability to throw someone out of the profession, the way boards can in engineering, medicine, or law.
There's a very serious risk to this: such boards also create barriers to entry to a profession, even to the extent of creating artificial shortages of professionals to drive up wages (nudge nudge, AMA), and can thereby actively "de-diversify" a field.
For a field where diversity (in its deepest sense) is critical to assessing operational and ethical dangers, that itself could be disastrous; one would have to find a robust way to ensure that it didn't happen.
I've been wanting to talk to numerous people in this and related fields (e.g. @lizthegrey, @LeaKissner, @avflox, @anildash, and @amac, just to name a handful out of dozens) about this question: Is it time for enforceable ethics in CS?
And if so, how should we go about doing this, and who should we involve in the process?