As one of my last doctoral coursework presentations, I spent time talking to my colleagues about the ethical dilemmas surrounding offensive security tool release. The outsider input was fascinating. Here's a thread to share some of that... 1/
Now keep in mind, my colleagues here are primarily educators. K-12 and university teachers, administrators, educational researchers, and so on. A few industry-specific education people as well, but none from infosec like me. 2/
My goal was to present the issue, explain why it was an ethical dilemma, and collectively discuss ethical perspectives that could influence decision-making. I withheld any of my opinions to let them form their own but gave lots of examples of OSTs and their use. 3/
I presented 3 typical release options. Open release to the world, no release to anybody, or a limited release to trusted third parties. I think most followers here understand the basic pros and cons of those things. Each option transgresses some moral principle for most folks. 4/
Right off the bat, lots of comments about industry-based ethical standards. We don't have any that everyone is required to uphold, in part because we aren't required to have licenses. Things like medical boards (medicine) and the bar (law) establish these in other fields. 5/
Folks were surprised we don't have more concrete things to fall back on. Most educators have pretty specific ethics/honor codes from their institutions, regional boards, or states. 6/
The places where most of us are bound to any ethical code are the certifications we obtain or our workplaces. But those workplace ethics codes are unlikely to apply to this issue unless you work at an infosec company specifically. 7/
We also talked about individual responsibility. I asked if folks would feel responsible if they released a tool used to attack someone else. Most said yes. A key question that arose was the DEGREE of impact from the OST release as well as the CERTAINTY of its use. 8/
I also asked if folks would feel responsible if they developed an OST and did NOT release it, and companies got attacked by a similar adversary tool but could not stop/detect it. Most said no, although some said it depends on the relationship with those victim entities. 9/
Another theme was the notion of scale. Particularly, the level of effort to weaponize an OST versus effort to build protection/detection around it. Also, who gets left behind because of an inability to build defenses against the tool/technique? 10/
Some folks were initially surprised that many small businesses don't have dedicated security people. I referenced some average security salaries and most of them got it pretty quick after that (although that is just one facet of that issue). 11/
We did some exercises around common ethical perspectives. For example, most agreed that a utilitarian view (the most good for the most people) would lean towards the open release of OSTs if you're going with volume-based impact alone. 12/
In another viewpoint, most thought that Kant's Categorical Imperative (what is right for one is right for all) would dictate not releasing OSTs because of the universal principle of doing no harm. 13/
In a pragmatism-based perspective, most said it would be highly situational and you would make the call based on the impacts and perceptions that came from the release of similar OSTs in the past. 14/
Of course, all those perspectives have their drawbacks and most folks apply facets of multiple perspectives in their decision making (even if they don't realize it or know these names for them exist). 15/
We used something called the Lonergan/Baird method to analyze the decision. As part of that, I got a great question... "Do the professional benefits of releasing an OST create some bias?" -- Absolutely! 16/
Releasing OSTs is generally a good career move. It can lead to promotions, better jobs, raises, speaking gigs, consulting opportunities, and more. That can obviously create some biased decision-making. 17/
That's the value of some of these ethical decision-making frameworks. They help you think of topics through different lenses, even if you already know a lot about the topic. We used Lonergan/Baird, Goldman's Four Square, and Badaracco's 5 timeless questions. 18/
There were a lot of great questions I couldn't answer. Things about base rates of attacks, the proliferation of tools, and so on. Things we just don't have numbers on yet. I did reference Paul Litvak's excellent paper here: vb2020.vblocalhost.com/uploads/VB2020….
More of this! 19/
Towards the end, I asked if folks thought that practitioners should start from a place of open release unless they could justify otherwise, or start from a place of no release unless they could justify otherwise. It was about 60/40 in favor of starting from no release. 20/
Other takeaways...
- Nearly everyone thought this would be a case-by-case decision
- Everyone thought that this needs a lot more research
- Nearly everyone was surprised the industry doesn't have more universal ethical standards considering the nature of the work
21/
More takeaways...
- Some were surprised our industry doesn't require any licensure
- Many were surprised to learn that OSTs weren't regulated in some way legally
- Many were surprised to learn that OST release is often a decision made by individuals
22/
We identified some questions that could be answered on a case-by-case basis to make decisions regarding OST release. I combined the ideas and wrote those out here.
23/
There's a roadmap for a lot of research to be done that informs those decisions.
Also, that second-to-last question 🔥🔥🔥
24/
There you have it. Educators are a lot more risk-aware than you might think, but they also approach risk differently. I find that most perspectives are useful but few conclusions are, so do with that what you will.
25/
And for me? My thoughts are still that we need to do a lot more research to understand this topic. I wrote about that here: chrissanders.org/2020/07/resear….
I also created a Socratic framework for discussing OST release in educational environments here: chrissanders.org/2020/07/socrat….
26/
Overall, this was a fun and useful exercise and I learned a lot. At some point I'd love to teach an infosec ethics course to have more of these discussions around this topic and others -- it'd need to be live online or face to face though. There's a lot of nuance.
27/27
Although I had met Alan, I didn't know him well. However, his signature hangs on my wall as part of the SANS Difference Makers award he had a hand in awarding me in 2018. 1/
From what I know of him, he was a big part of making sure this award existed because he believed that we should use technology to make people's lives better, and a lot of his life was committed to that idea. I think that's a sentiment most of us can get behind. 2/
When we think of people whose contributions shaped the state of education in computer security, Alan is likely in the top portion of that list. When you consider the transformative power of education on people's lives, it's easy to see how many people he impacted in some way. 3/
It doesn't matter if you don't have a lot of teaching experience as long as you are well-spoken. I'll work with you and teach you principles of curriculum design and adult learning to help turn your expertise into effective learning content.
Here are some comments from a few of our course authors who I've worked with during this process so far.
I think one of the best 1-2 punches we've got going right now is our CyberChef course + our RegEx course. I consider both pretty necessary skills for analysts of multiple sorts (SOC, IR, and Malware RE).
CyberChef is maybe my most used tool in investigations these days other than the SIEM or a terminal. That course gives you a taste of regex but then the RegEx course makes you comfortable there. You also get a free copy of RegEx Buddy with that course.
You also get the strong 1-2 punch of Matt's Australian accent and Darrel's British accent 😍
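To illustrate why regex is such a core analyst skill, here's a minimal sketch (my own example, not taken from either course) of a common investigative task: carving indicators out of raw log text.

```python
import re

# A raw log line of the kind an analyst might triage.
log = "2023-01-05 10:42:11 conn from 203.0.113.7 to evil-domain.example over port 443"

# Naive IPv4 pattern: four dot-separated octet-like groups.
# A production pattern would also validate the 0-255 range per octet.
ipv4 = re.compile(r"\b(?:\d{1,3}\.){3}\d{1,3}\b")

print(ipv4.findall(log))  # ['203.0.113.7']
```

The same approach scales to extracting domains, hashes, and URLs across thousands of lines, which is where tools like CyberChef and a solid regex foundation pay off.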
Some of the work I'm most proud of from my time at Mandiant was pioneering the building of investigative actions *into detection signatures* as they were written. This had a profound impact across the detection and product teams, and made the tool so much more accessible.
We included two main components: a plain English version of the question that the analyst needed to answer, and the search syntax that would provide data to answer the question. It manifested in the UI as something I named "Guided Investigations".
This helped detection engineers write much better signatures because they had to think more deliberately about the consumer of the alerts (the analyst). It led to higher quality detection logic and clearer metadata, including rule descriptions.
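As a rough sketch of the idea, a signature carrying guided-investigation metadata might look something like this. All field names and the query syntax here are invented for illustration; the actual Mandiant implementation is not public.

```python
# Hypothetical shape of a detection signature with embedded
# "Guided Investigation" steps: each step pairs a plain-English
# question for the analyst with the query that retrieves the
# data needed to answer it.
signature = {
    "name": "Suspicious scheduled task creation",
    "detection_logic": 'process == "schtasks.exe" and cmdline contains "/create"',
    "guided_investigation": [
        {
            "question": "What user account created the scheduled task?",
            "query": 'index=endpoint process="schtasks.exe" | table user, cmdline',
        },
        {
            "question": "Has this host created other scheduled tasks recently?",
            "query": 'index=endpoint host={host} process="schtasks.exe" earliest=-7d',
        },
    ],
}

# An alert UI could render each question next to a one-click pivot:
for step in signature["guided_investigation"]:
    print(f"Q: {step['question']}")
```

Forcing the engineer to write these questions at signature-authoring time is what drives the quality improvement described above: it makes the analyst's needs explicit before the rule ships.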
A nice, concise table from the @codeorg State of CS Ed report this year showing the adoption of the 9 key CS policies at the individual state level.
How does your state rank there? Are you surprised by any of it?
Here are the 9 policies referenced in the table. The Code.org advocacy coalition recommends these things be in place at a state level to make computer science a fundamental part of the state's education system.
Basically, there's a lot more to getting CS in schools than the state govt telling districts to do it or making a requirement. There needs to be curriculum and goals, teachers need to be trained and certified (pre and in-service), and all that has to be paid for and coordinated.
There are a lot of ways that folks distinguish between blue team roles. My focus is on investigative work and cognitive skills, so I divide those roles into the mental model shown in this diagram. 1/
The primary characteristic that distinguishes these investigative roles is where each sits in the incident identification and response process. You might be familiar with that process under the acronym PICERL, but it appears in many forms: csrc.nist.gov/publications/d…. 2/
In the diagram, the functional portion of the PICERL process is at the top. Each role is listed below that with where it typically fits in relative to those phases. Preparation and Lessons Learned phases are excluded since those are pre and post-investigation steps. 3/