Computer security is really, really important. It was important decades ago, when computers were merely how we ran our financial system, aviation, and the power grid. 1/
Today, as more and more of us have our bodies inside of computers (cars, houses, etc.) and computers inside our bodies (implants), computer security is *urgent*.
Decades ago, security practitioners began a long argument about how best to address that looming urgency. 2/
The most vexing aspect of this argument was a modern, cybernetic variant on a debate as old as the ancient philosophers - a debate that René Descartes immortalized in the 17th century.
You've doubtless heard the phrase, "I think therefore I am" (*Cogito, ergo sum*). 3/
It's from Descartes' 1637 *Discourse on the Method*, which asks the question, "How can we know things?" Or, expansively, "Given that all my reasoning begins with things I encounter through my senses, and given that my senses are sometimes wrong, how can I know *anything*?" 4/
Descartes' answer: "I know God is benevolent, because when I conceive of God, I conceive of benevolence, and God gave me my conceptions. A benevolent God wouldn't lead me astray. 5/
Thus, the things I learn through my senses and understand through my reason are right, because a benevolent God wouldn't have it any other way." 6/
I've hated this answer since my freshman philosophy class, and even though the TA rejected my paper explaining why it was bullshit, I *still* think it's bullshit. 7/
I mean, I'm a science fiction writer, so I can handily conceive of a wicked God whose evil plan starts with *making you think He is benevolent* and then systematically misleading you in your senses and reasoning, tormenting you for His own sadistic pleasure. 8/
The debate about trust and certainty has been at the center of computer security since its inception. When Ken "Unix" Thompson accepted the 1983 Turing Award, he gave an acceptance lecture called "Reflections on Trusting Trust":
It's a *bombshell*. In it, Thompson describes an evil compiler, one that inserts a back-door into any operating system it compiles, and a back-door-generator into any compiler it is asked to compile. 10/
Since Thompson had created the original Unix compiler - which was used to compile every other compiler and thus every other flavor of Unix - this was a pretty wild thought experiment, especially since he didn't outright deny having done it. 11/
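If it helps to see the shape of Thompson's trick, here's a toy sketch in Python. Every function name and string in it is a made-up illustration, not Thompson's code: a "compiler" that compiles honestly, except when it recognizes a login program or another compiler.

```python
# A toy model of the "trusting trust" trick. Here the "compiler" is just a
# function that transforms source text; every name and string is a
# hypothetical illustration, not Thompson's actual code.

def honest_compile(source: str) -> str:
    """Stand-in for a real compiler: pass the source through unchanged."""
    return source

def evil_compile(source: str) -> str:
    """Compile normally, except when handed two special programs."""
    if "def check_password" in source:
        # Target 1: back-door the OS's login routine so a secret password
        # always works.
        source = source.replace(
            "def check_password(pw):",
            "def check_password(pw):\n    if pw == 'open-sesame': return True",
        )
    if "def evil_compile" in source or "def honest_compile" in source:
        # Target 2: when asked to compile a compiler, smuggle this whole
        # trick into the output, so even a clean compiler source yields a
        # compromised compiler. (The real self-reproducing code is omitted.)
        source += "\n# ...self-reproducing back-door generator re-inserted here"
    return honest_compile(source)

login_source = "def check_password(pw):\n    return pw == stored_password"
print(evil_compile(login_source))  # the "compiled" login now accepts 'open-sesame'
```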
Trusting trust is still the most important issue in information security. Sure, you can run a virus checker, but that virus checker has to ask your operating system what files are on the drive, what data is in memory, and what processes are being executed. 12/
What if the OS is compromised?
Okay, so maybe you are sure the OS isn't compromised, but how does the OS know whether it's even running on the "bare metal" of your computer? 13/
Maybe it is running inside a virtual machine, and the *actual* OS on the computer is a malicious program that sits between your OS and the chips and circuits, distorting the data it sends and receives. 14/
This is called a "rootkit," and it's a deadass nightmare that actually exists in the actual world.
A computer with a rootkit is a brain in a jar, a human battery in the Matrix. 15/
You, the user, can ask the OS questions about its operating environment that it will answer faithfully and truthfully, and *those answers will all be wrong*, because the *actual computer* is being controlled by the rootkit and it only tells your OS what it wants it to know. 16/
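Here's a toy sketch of what that layering means in practice - the file names and functions are invented for illustration, not taken from any real rootkit: every answer flows up through the compromised layer, which quietly censors itself out.

```python
# A toy model of a rootkit's vantage point: every question the OS asks about
# the disk is answered by the layer below it, and that layer edits the answer.
# File names and functions are invented for illustration.

REAL_FILES = ["notes.txt", "vacation.jpg", "rootkit.sys"]

def real_disk_listing() -> list[str]:
    """What is actually on the disk."""
    return list(REAL_FILES)

def rootkit_disk_listing() -> list[str]:
    """What the rootkit lets the OS see: everything except itself."""
    return [name for name in real_disk_listing() if name != "rootkit.sys"]

def os_list_files() -> list[str]:
    """The OS faithfully reports whatever the layer below told it."""
    return rootkit_disk_listing()

def antivirus_scan() -> list[str]:
    """The virus checker faithfully scans whatever the OS reports."""
    return os_list_files()

print(antivirus_scan())  # ['notes.txt', 'vacation.jpg']: no rootkit in sight
```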
20 years ago, clever Microsoft engineers proposed a solution to this conundrum: "Trusted Computing." They proposed adding a second computer to your system, a sealed, secure chip with very little microcode, so little that it could all be audited in detail and purged of bugs. 17/
The chip would be securely affixed to your motherboard, so any attempt to remove it and replace it with a compromised chip would be immediately obvious to you (for example, it might encapsulate some acid in a layer of epoxy that would rupture if you tried to remove the chip). 18/
They called this the "Next Generation Secure Computing Base," better known by its code name, "Palladium." They came to the Electronic Frontier Foundation offices to present it. It was a memorable day:
My then-colleague Seth Schoen - @EFF's staff technologist and, at the time, the most technically sophisticated person to have been briefed on the technology without signing an NDA - made several pointed critiques of Palladium:
But his most salient concern was this: "what if malware gets into the trusted computing chip?" 21/
The point of trusted computing was to create a nub of certainty, a benevolent God whose answers to your questions could always be trusted. The output from a trusted computing element would be ground truth, axiomatic, trusted without question. 22/
By having a reliable external observer of your computer and its processes, you could always tell whether you were in the Matrix or in the world. It was a red pill for your computer. 23/
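One common way to build that external observer (not necessarily Palladium's exact design) is "measured boot": the trusted chip hashes each stage of the boot chain and compares it to known-good values. Here's a toy sketch, with invented stage names and images standing in for real firmware measurements.

```python
# A toy sketch of the "measured boot" idea behind trusted computing: a small,
# separate component hashes each boot stage and checks it against known-good
# values. Stage names and images are invented for illustration.

import hashlib

def measure(blob: bytes) -> str:
    """Hash a boot stage, the way a trusted chip records a 'measurement'."""
    return hashlib.sha256(blob).hexdigest()

# Pretend these are the binaries that actually got loaded at boot.
boot_stages = {
    "firmware":   b"firmware image v1.0",
    "bootloader": b"bootloader image v2.3",
    "kernel":     b"kernel image v5.15",
}

# Known-good hashes the trusted chip was provisioned with.
expected = {name: measure(blob) for name, blob in boot_stages.items()}

def attest(loaded: dict[str, bytes]) -> bool:
    """The trusted chip's verdict: did every stage match its known-good hash?"""
    return all(measure(blob) == expected[name] for name, blob in loaded.items())

print(attest(boot_stages))  # True: a clean boot
tampered = dict(boot_stages, bootloader=b"bootloader image v2.3 + implant")
print(attest(tampered))     # False: the implant changes the hash and gets caught
```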
What if it was turned? What if some villain convinced it to switch sides, by subverting its code, or by subtly altering it at the manufacturer?
That is, what if Descartes' God was a sadist who *wanted* to torment him? 24/
This was a nightmare scenario in 2002, one that the trusted computing advocates never adequately grappled with. In the years since, it's only grown more salient, as trusted computing variations have spread to many kinds of computer. 25/
The most common version is UEFI ("Unified Extensible Firmware Interface"), a separate OS often running on its own chip (or in a notionally "secure" region of your computer's processors) that is charged with observing and securing your computer's boot process. 26/
UEFI poses lots of dangers to users; it can be (and is) used by manufacturers to block third-party operating systems, which allows them to lock you into using their own products, including their app stores, letting them restrict your choices and pick your pocket. 27/
But in exchange, UEFI is said to deliver a far more important benefit: a provably benevolent God, one who will never lie to your operating system about whether it is in the Matrix or in the real world, providing the foundational ground truth needed to find and block malware. 28/
So it's a big deal that @Kaspersky has detected a UEFI-infecting rootkit (which they've dubbed a "bootkit") that they call #CosmicStrand, which can reinstall itself even after you reformat your drive and reinstall your OS:
CosmicStrand does some *really* clever technical things to compromise your UEFI, which then allows it to act with near-total impunity and undetectability. Indeed, Kaspersky warns that there are probably *lots* of these bootkits floating around. 30/
If you want a good lay-oriented breakdown of how CosmicStrand installs a wicked God in your computer, check out @dangoodin001's excellent @ArsTechnica writeup:
But despite its long tenure, CosmicStrand was only just discovered. 32/
That's because of the fundamental flaw inherent in designing a computer that its owners can't fully inspect or alter. 33/
If you design a component that is supposed to be immune from owner override, then anyone who compromises that component *can't be detected or countered by the computer's owner*. 34/
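To continue the toy attestation sketch above: once the attester itself is subverted, there is nothing beneath it for the owner to compare its answers against, so it can vouch for anything.

```python
# Continuing the toy sketch above: a compromised "trusted" component can simply
# return the expected answer no matter what actually booted, and, by design,
# there is no lower layer the owner can consult to contradict it.

def compromised_attest(loaded_stages: dict[str, bytes]) -> bool:
    """Always report a clean boot, regardless of what was loaded."""
    return True

tampered_boot = {"firmware": b"firmware image v1.0 + bootkit implant"}
print(compromised_attest(tampered_boot))  # True: the implant vouches for itself
```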
This is the core of a two-decade-old debate among security people, and it's one that the "benevolent God" faction has consistently had the upper hand in. 35/
They're the "curated computing" advocates who insist that preventing you from choosing an alternative app store or side-loading a program is for your own good. 36/
Because if it's possible for you to override the manufacturer's wishes, then malicious software may impersonate you to do so, or you might be tricked into doing so. 37/
This benevolent dictatorship model only works so long as the dictator is both perfectly benevolent and perfectly competent. We know the dictators aren't always benevolent. 38/
Apple won't invade your privacy to sell you things, but they'll take away every Chinese user's privacy to retain their ability to manufacture devices in China:
But even if you trust a dictator's benevolence, you can't trust in their perfection. 39/
Everyone makes mistakes. Benevolent dictator computing works well, but fails badly. Designing a computer that intentionally can't be fully controlled by its owner is a nightmare, because that is a computer that, once compromised, can attack its owner with impunity. 40/
ETA - If you'd like an essay-formatted version of this thread to read or share, here's a link to it on pluralistic.net, my surveillance-free, ad-free, tracker-free blog:
Tomorrow (July 30) at 3PM, I will be reading and signing my picture book "Poesy the Monster Slayer" at the @DarkDel booth at @MidsummerScream in #LongBeach, CA:
It's the last day to sponsor me for the @ClarionUCSD Write-A-Thon! I'm writing 10,000 words on my prison-tech thriller "The Bezzle" and raising scholarship money for the Clarion SF/F workshop, which I graduated from in 1992.