Live-tweet thread: 33 tweets, 7 min read.
@alexstamos is giving the keynote at USENIX Security 2019. He starts out nervous, saying he's not going to say anything new.

His title of "adjunct professor", where (as we know) "adjunct" is Latin for "not really a".

But why speak? Because we're not doing so hot.
This is very clear in the way the public treats us. Silicon Valley is a whole show devoted to making fun of us, and its tone has grown darker over time.

That's something we need to take responsibility for. There are fundamental problems with how we teach and do security.
Over the last 10 years, technology has revolutionized everything people do, but the way we treat people has not kept up.

We're way too in love with complexity. That's how we choose what to do and what to concentrate on.
The Stamos Hierarchy of the Actual Bad Stuff that happens online to real people:
The vast majority is abuse: people using products correctly but harmfully to hurt others.
The rest is InfoSec: breaking confidentiality, integrity, etc. Most of *that* is password issues.
When it comes to machines being broken into, most of *that* harm comes from patching issues. Equifax was caused by not patching. It's not that technically interesting.

The next biggest chunk is simple config errors, like leaving your S3 bucket open to the public.
... then old application vulnerabilities that we've all seen before (XSS, etc)

... then next up is USENIX stuff.
The top pixel is 0-days.
And a tiny little subpixel is side-channel attacks.

This is still good to look at! It will trickle down into more harm if we don't fix it.
But we should think about the ratio of time we're spending on that huge "abuse" bucket and that tiny little sub-pixel.
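[ Tweeter note: the open-S3-bucket rung above is concrete enough to sketch. Here's a minimal, hypothetical check for the classic mistake: a bucket policy granting read access to everyone. The function name and sample policy are made up for illustration; real scanners also look at ACLs, account-level public-access blocks, and Condition keys. ]

```python
import json

def policy_allows_public_read(policy_json: str) -> bool:
    """Flag bucket policies that grant s3:GetObject to everyone.

    Illustrative only: checks for an Allow statement with a wildcard
    Principal and an object-read action.
    """
    policy = json.loads(policy_json)
    for stmt in policy.get("Statement", []):
        if stmt.get("Effect") != "Allow":
            continue
        principal = stmt.get("Principal")
        is_public = principal == "*" or (
            isinstance(principal, dict) and principal.get("AWS") == "*"
        )
        actions = stmt.get("Action", [])
        if isinstance(actions, str):
            actions = [actions]
        if is_public and any(a in ("s3:GetObject", "s3:*", "*") for a in actions):
            return True
    return False

# Hypothetical world-readable policy of the kind behind many leaks:
open_policy = json.dumps({
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Principal": "*",
        "Action": "s3:GetObject",
        "Resource": "arn:aws:s3:::example-bucket/*",
    }],
})
print(policy_allows_public_read(open_policy))  # True
```

The point isn't the code; it's that this whole class of harm is mechanically detectable, and it still dominates real-world breaches.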
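[ Tweeter note: the "old application vulnerabilities" rung is equally unglamorous. A minimal, made-up sketch of reflected XSS and the standard fix — escaping user input before interpolating it into HTML — using Python's stdlib `html.escape`: ]

```python
import html

def render_greeting_unsafe(name: str) -> str:
    # BUG: user input interpolated straight into HTML -- classic reflected XSS.
    return f"<p>Hello, {name}!</p>"

def render_greeting_safe(name: str) -> str:
    # Escaping turns markup characters into entities before interpolation.
    return f"<p>Hello, {html.escape(name)}!</p>"

payload = "<script>alert(1)</script>"
print(render_greeting_unsafe(payload))  # the <script> tag survives intact
print(render_greeting_safe(payload))    # &lt;script&gt;alert(1)&lt;/script&gt;
```

We've known this fix for two decades; the gap is incentives and practice, not research.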
We have no good model for the relationships between people and governments and big tech companies.

This is an optimization problem, like fast-vs-correct-vs-cheap. In the space of speech control it's much more complicated: many more tradeoffs (a star, not a triangle), and poorly understood ones.
For example: the FTC consent decree on Facebook means that people can't scrape public data. At the same time, the FBI is scraping Facebook to stop white supremacists.

This is the sign of a fundamental societal disagreement.
This is a sign that tech companies are not being fundamentally honest about what's going on. As bad as the internet is, keeping it even this good takes many thousands of people working *very* hard.

Companies don't release numbers other than CSAI stats (which they're required to!).
NCMEC (the org that handles CSAI tips) got 18+ million tips last year. That's a *huge* number.

The reality is that a talk called "pediatric strangulation: an integrated response" has to be repeated multiple times at the Crimes Against Children conference because it's so popular.
The world is a terrible place. And tech companies aren't talking about it enough, so lawmakers can't understand the state of the Abuse world and thus can't make good tradeoffs about it.
We are not collectively learning from our mistakes. After the FSB hacked Yahoo, @alexstamos had to testify in front of Congress, plus all sorts of depositions. In a deposition, you can't plead the Fifth; you have to truthfully answer questions.

There were 1000s of hours of depositions.
That's a lot of investigation. At the end a lot of lawyers asked for $85 million. And that's what happened.

That's *all* that happened. Our society is set up to divvy up blame and make lawyers rich. We're not set up to make things better. That's why we keep having the same bugs.
And so every time there's a problem, the company spends a huge amount of time to figure out the root cause and how to fix it... and never shares that.

[ Tweeter note: that's because everyone's so damn scared of being sued ]
When there's a transportation issue, there's a report and investigation at the @NTSB and it's visible publicly so we can learn from it. For airplane issues, pilots are highly incentivised to self-report issues (legal immunity).
In security, people don't have that immunity. We don't talk about it.

We only talk about breaches, not vulnerabilities and near-misses because people are so scared of being sued.
Lessons from Yahoo:
1. If you aren't moving forward, you are drowning. Tech keeps moving and you have to keep up to have security.

2. Tech giants can be too responsive to Wall Street. If you have to make the numbers look good every quarter, then you have to do unhelpful things.
... Exec pay is pegged to stock price, and so companies end up being too responsive to Wall Street and not responsive enough to long-term concerns.
3. IV in --> phone off (don't send profanity-laden woozy emails)

[ Tweeter note: good life lesson ]
Successful companies think that their culture is why they succeeded, and they become rigid around it. This is really dangerous when you also venerate the decisions of the past. Facebook had a real problem with that: FB used to be the land of FarmVille, and that doesn't work well for elections.
Google and Facebook give huge access but are hugely centralized. And their execs think that the past decisions that got them there are the right decisions for the future, and that's not the case.
For example, a Wired article from yesterday talked about protecting "divas". That's something Google has given to the rest of us, and it's *bad*. It's poisonous.
The key to predicting harm is empathy.

The key to empathy is team diversity.

Look around the room. This is not America. It's definitely not representative of the world. And this is the group of people deciding what it means to be harmed or not.
We have huge blind spots because a lot of harm isn't consonant with our experience. We need to welcome in people with different experience.

#PEPR19 #soups19 and #ENIGMA are real models for this. #PEPR19 was the most gender-diverse conference that USENIX has ever run. Congrats!
Adjust incentives towards addressing harm, not towards complexity or headlines.

[Points at really great talks that cover really hard problems that we don't talk enough about -- older folks' experience, abuse, etc]
Academic CS needs to focus on supporting other disciplines. Software is eating the world, but CS is about CS. How are we going to support the other disciplines that depend on CS?
Take these lessons back to the classroom. A bunch of people in this area (including me, actually, I need to finish that) are writing classes based on what we learned the hard way. That will all be open-sourced.
The nerds inherited the Earth. We need to do better.
SOMEONE JUST STOOD UP AND CLAIMED THAT THINGS LIKE PATCHING AND PASSWORDS ARE EASY TO SOLVE.

[ If anyone thinks that this is easy, I would be happy to explain why it's not. ]
SOMEONE ELSE JUST CLAIMED ABUSE ISN'T HARD EXCEPT FOR SCALE AND WE JUST NEED TO HARNESS RESEARCH ON SCALE.

[ A lot of people-stuff is really hard. Alex is doing a good job of pointing out that what is right for people is not always the same as privacy. There are other goods. ]
[I don't want to be dismissive -- but these problems can be really hard, even at a one-off micro-scale. They are fundamentally decisions about human interactions which can be cross-cultural and you're often missing a lot of context. Pls learn why they're hard, then help us scale]
[ End of livetweet, tip your waitstaff, etc. ]
Thread by Lea Kissner.