Liz Fong-Jones @lizthegrey, 40 tweets, 15 min read
Next up: @sweetpavement on ethics and inclusion at design time, not at runtime. #WSC2018Conf
@sweetpavement Starting off by taking a deep breath, because we all need it. [ed: yeah, we all do.]

Rowan is an SRE at BuzzFeed; she joined because it provides reliable journalism on topics ranging from the Kardashians to Robert Mueller and Black Lives Matter. #WSC2018Conf
Their engineering culture expects that even when we choose between doing things right and doing things fast, the fast version is still ethical. #WSC2018Conf
Think about the fact that we carry cellphones that can report our location, or that Facebook knows what text you typed and then deleted. #WSC2018Conf
And yet pop culture says that engineers are insulated from ethical choices: they never have to explain anything beyond "it's too complicated, you wouldn't get it," and they're celebrated as rockstar tech ninjas who disrupt things. #WSC2018Conf
We need to promote equity and human dignity.

Oppenheimer said: "when you see something that is technically sweet, you go ahead and do it, and only argue what to do with it afterwards." #WSC2018Conf
These problems didn't start with Cambridge Analytica -- think Uber's God Mode, inclusion of rootkits on Sony CDs that sent listening habits back to Sony, etc. #WSC2018Conf
There are plenty of stories that tell us that we need a continuous and ethical design practice.

It costs us the trust of our users and long-term growth if we don't do this. #WSC2018Conf
But what are the benefits? Intellectual rigor which makes us smarter. It requires empathy to be a safe place to discuss our needs, which leads to more inclusive spaces. #WSC2018Conf
We need to prevent harms by prioritizing ethics. Protecting human dignity, rights, and potential demands more than a single simple answer. #WSC2018Conf
Let's talk about harassment: "don't feed the trolls" vs. "drag the haters."

We think problems are "too hard" because they can't be automated. #WSC2018Conf
Technology is not neutral. Twitter has facilitated both revolution and hatred. Groups for queer teenagers and abused women, as well as for white supremacists on Facebook.

Is it an inevitable balance? @sweetpavement doesn't think so. #WSC2018Conf
The absence of ethical awareness doesn't only impact Twitter. It impacts all of us in our daily work.

Example: the paperclip optimizer, as applied to ML. Think about the implications of auto-tagging your friends in birthday features; the same technology works on protest crowds too. #WSC2018Conf
We are contributing to entrenching bias and endangering people.

Like doctors and lawyers and civil engineers, we have life and death impacts upon people.

We need to not pursue quick profit. Write down your principles and treat them like a contract. #WSC2018Conf
Begin fresh every time when analyzing impacts of launched and unlaunched features, not just on new products. Tiny changes can have massive impact. #WSC2018Conf
Example: applications that block content unless there's a password or you get approval from a supervisor.

It's a public web filter. But we need to have conversations about their impact. Errors in them block educational content. #WSC2018Conf
Always ask how what we build can be misused by our worst enemies, says @sweetpavement

Ethics have to do with minimizing harm and promoting community and maintaining our social interactions. #WSC2018Conf
The first time you get hacked, or your platform is exploited to send 500 pizzas to someone's house, or someone uses it to show up at a user's house, all your good intentions vanish. #WSC2018Conf
We need to consider ethics early so that they don't become the crises we're concerned about.

Unaddressed ethical concerns become security and risk concerns. #WSC2018Conf
Take the example of the web filter. It logs every site it blocks. The librarians go through the logs and look for real educational content to add to the allowlist.

So they're saving it to a local database, and a single hacker can get in and find out who is trans or disabled #WSC2018Conf
and then weaponize it. So we need to prevent this before it happens.

We need to diversify perspectives at every stage. We need to *overrepresent* vulnerable groups to make sure we get their feedback. #WSC2018Conf
Always ask what kinds of users aren't in the room and seek out their voices.

Hear not just risks, but ethics behind user stories. And apply your mitigations broadly to as many people as you can. #WSC2018Conf
Vulnerability is critical on both sides. Hear things from the perspective of the marginalized. Realize that there's a power dynamic.

But honesty and candor can't be used as smokescreens for bullying. #WSC2018Conf
Reading same IEEE quote from earlier: to hold paramount the safety, health, and welfare of the public, to strive to comply with ethical design and sustainable development practices, and to disclose promptly factors that might endanger the public or the environment #WSC2018Conf
Algorithmic Justice League study: 34.7% of Black women were misclassified, compared to tenths of a percent of white men.

Black women were misclassified as Black men, AND the results were fed into police databases. Lethal consequences. #WSC2018Conf
ML is not a magical box. We need to make sure that we account for impacts.

Think about other cases: losing net neutrality would endanger freedom of access to information and leave people of lower socioeconomic status without information, e.g. from Wikipedia. #WSC2018Conf
Before things reach production, we need to account for the disproportionate ways new tech can affect marginalized people & communities e.g. through bias and structural violence. #WSC2018Conf
"If we have the ability to change the world, we have the ability to break the world. We need. to. be. responsible." --@sweetpavement #WSC2018Conf
Tech companies do poorly at communicating with stakeholders. Our EULAs aren't readable to anyone but lawyers.

We owe users well-formatted changelogs, for instance. They depend upon behavior of our products. #WSC2018Conf
To communicate transparently, we need to say what cases we support, what potential ethical concerns exist [disclosing responsibly without inciting abuse], how to mitigate risk, and accountability when there are vulnerabilities. #WSC2018Conf
Be accountable and transparent. It's the only path forward to regain trust. It demonstrates you want to protect rights and dignity, and will do better. #WSC2018Conf
Our users don't care whether you intended them no harm.

You have to accept responsibility that you made it possible to happen. Responsibility isn't blame. "Blame creates guilt, which creates the opposite of accountability." --@sweetpavement #WSC2018Conf
Decline to build the unethical and support those who declined.

"If I don't, someone else will" is a crap justification. Recent open letters and resignations demonstrate that accountability is not isolated or a fringe concern. #WSC2018Conf
Failing to adhere to ethical principles needs to have consequences.

Emotional and intellectual safety are important ways of supporting other people. Safety, not necessarily comfort; discomfort is a useful signal to us. #WSC2018Conf
We need to take risks, and decline to do things that are unethical. Whether it is declining a project, changing teams, or leaving a job.

We may even have to risk our visas and our residency in our countries. Many of us have built our lives around work. #WSC2018Conf
The outlay of time and effort early avoids the risk of crises later.

It's a bargain to consider ethics early compared to having to replace engineers down the line. #WSC2018Conf
We need to socially and materially support each other. Provide each other referrals. Help each other out with our fuck you funds. Not everyone can have one. #WSC2018Conf
Closing the talk: "Your scientists were so preoccupied with whether they *could*, they didn't stop to think if they *should*," from Jurassic Park.

We fetishize disruptiveness, yet "that quality got me kicked out of the Girl Scouts" --@sweetpavement #WSC2018Conf
Tech culture is broken. Culture is intentional, it doesn't happen just because good people showed up. It requires setting good rules and ethics.

"Technology is science made manifest for human use. We need to consider human rules to govern it." -@sweetpavement [fin] #WSC2018Conf
"Unethical technology is evil." We need to not have the conversation stop here. We can make a difference if we start trying. #WSC2018Conf