1/ I feel that any textbook on "Cybersecurity 101" should start with stories like this. Politics is more integral to cybersecurity than to other subjects.
2/ Ideally, cybersecurity should simply be about risk vs. reward, costs vs. benefits. Instead, it's become a moral fight. It's your moral duty to be secure even if you get no benefit from it. It's a moral transgression when you don't do what you are supposed to.
3/ Take Richard Clarke's quote that you "deserve" to be hacked if you spend less on security than free coffee for your employees -- "deserve" means "punished for your moral transgressions".
4/ Almost all security guides tell people to choose "strong passwords". No, that's not really a thing. It's a trope: the feeling that the flaw is moral weakness, and the fix is to be strong.
5/ The real thing about passwords is "don't reuse the same password across all your accounts". That's so important that you can pretty much forget every other recommendation about passwords as long as you remember that one thing. But it doesn't fit the morality trope.
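A minimal sketch of that one rule in practice (my own illustration, not from the thread): generate a distinct random password for each account using Python's standard `secrets` module. The site names are hypothetical.

```python
import secrets
import string

# The single password rule that matters: a different password for every account.
ALPHABET = string.ascii_letters + string.digits + string.punctuation

def new_password(length: int = 20) -> str:
    """Return a cryptographically random password of the given length."""
    return "".join(secrets.choice(ALPHABET) for _ in range(length))

# Hypothetical accounts, purely for illustration.
vault = {site: new_password() for site in ["bank.example", "mail.example", "forum.example"]}

for site, password in vault.items():
    print(site, password)
```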
6/ The term "technical debt" is almost always used incorrectly, because people believe "debt" is morally bad and must be avoided. Bean counters know that "debt" is capital and must be embraced.
7/ Despite people like Bruce Schneier spending so much time criticizing "snake oil" in the cybersecurity industry, it's still a fixture behind everything we do. Technical battles take a back seat to non-technical battles.
8/ No matter how good my product is, as a vendor I'm ultimately selling to a largely non-technical market where buyers can't tell the difference between my product and snake oil.
9/ Why, then, as a vendor should I ever invest in technical quality? That seems like a foolish strategy. Luckily for the industry, there are many vendors who nonetheless produce good products despite the foolishness of doing so.
10/ You don't know what happened with the notPetya or Mirai worms. That's because the political narrative of what people want to discuss has wholly displaced the technical evaluation of what happened.
11/ With notPetya, what people want you to know is that it was based on a vuln weaponized by the NSA. What you should care more about is the supply chain risk of autoupdate, and the flaw in Windows networks that leads to lateral movement via PsExec.
12/ With Mirai, what people want you to know is that IoT is insecure and we need political solutions mandating security. The reality is that since Mirai, over 10 billion IoT devices have been added to the Internet, while the problem Mirai exploited has decreased.
13/ I am good. Therefore, if you oppose me, then you must be bad -- unreasonable, criminal, with some sort of evil agenda.

It's what politicians claim when people criticize them ('fake news'). It's what vendors claim when researchers disclose vulns in their products.
14/ Almost everyone partakes in our increasingly politically polarized society, where it's less and less about the underlying arguments and statements and more and more about simply which side you are on. The same is true of vuln disclosure.
15/ That's the political basis behind the original story at the top of this thread. It's not simply "shooting the messenger" so much as that once you decide someone is against you, every action of theirs gets seen in a new light.
arktimes.com/arkansas-blog/…
16/ Is making a trivial change to a URL, such as changing "foo.php?articleId=5" to "foo.php?articleId=6", a violation of the CFAA's prohibition on "unauthorized access"? The answer depends upon politics.
17/ One of the core principles of cybersecurity is known as "Kerckhoffs's Principle", from the 1880s, describing cryptography. It says that encryption algorithms should be public so that everyone can pick them apart looking for flaws. The only secret is the password/key.
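A minimal sketch of the principle (my own illustration, assuming the open-source `cryptography` package): the Fernet scheme it implements is publicly specified and widely reviewed, so the only thing you keep secret is the key.

```python
# pip install cryptography
from cryptography.fernet import Fernet

# The Fernet algorithm is public (AES-CBC plus HMAC, openly specified);
# per Kerckhoffs's Principle, the key is the only secret.
key = Fernet.generate_key()
cipher = Fernet(key)

token = cipher.encrypt(b"attack at dawn")  # ciphertext is safe to expose
print(cipher.decrypt(token))               # only the key holder recovers the plaintext
```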
18/ This is the politics of control. When you keep the algorithm secret, you are in control. When there are flaws, it's okay if the enemy discovers them and uses them against you, because the enemy will keep this fact secret.
19/ What's not okay is when your own side discovers flaws in your encryption algorithm, exposing your mistakes to ridicule. For almost every company, the public learning of flaws in its security matters more than hackers learning of them.
20/ "I just want to be a firewall admin and not worry about politics". This isn't going to happen. You'll become a political football. Some want more rules shutting down things their political rivals in the company. Others want you to open things up to allow their stuff.
21/ So many technical terms aren't -- they are political terms. A good example is "defense in depth". It's a political term that means "I want a bigger budget".
22/ The original purpose of "defense in depth" meant removing security: take away security from the perimeter in order to put it in the depths. If you are using the term to justify why you need to remove security, then you are using it correctly.
23/ It's what happens when you have a fixed budget. If you want to spend more in one place, it necessarily means spending less somewhere else.
24/ How it's actually used is to justify a larger budget, to add another layer of security, to build out your empire, to have more control. Everyone is convinced that if they just had a larger budget, they could do more.
25/ Of course, every department in a company is convinced it needs a larger budget, so that part is nothing new. The thing about cybersecurity is the added argument that it's a moral imperative, that there's a need that transcends beancounting.
26/ Thus, the thing I want to teach students at the start of their careers is to not get sucked into the moral crusade. Cybersecurity is a fun technical challenge, the crusade will just drain your soul.
27/ "Infosec burnout" will happen. You'll be convinced you are on the right moral side of everything, and yet, you'll still lose the political battles. Your company will say "we take security seriously", so you think you are doing the right thing, but beancounters will stop you.
28/ Back in the early days of WiFi I was at a business conference. I asked the audience to raise their hands if WiFi security was their number 1 priority. The entire audience raised their hands.
29/ They were all liars. I then asked if they used WiFi, and they all raised their hands. I then asked if they thought their WiFi was secure -- only one person raised his hand (he used a VPN).
30/ If cybersecurity was their #1 concern, and they thought WiFi was insecure, then they wouldn't be using WiFi. QED.

What this shows is that no matter how much people tell you cybersecurity is important, it's not.
31/ Infosec burnout is what happens when all the evidence is on your side, and everyone tells you that you are right -- and yet you still lose the battle ... every ... single ... time.