It's been a while since I've visited this topic, and with some vacation coming up I think I might want to dive into it some more. I would really like to find some kind of solution to this.
I'm going to dump some thoughts about approaches I've already tried in this thread.
What worked: Nothing; literate programming tools are terrible.
Failed Approach #3: Formal Modelling Tools
I've tried quite a few and I'm a fan; my main gripe is integration - most of the more powerful analysis tools are heavily wrapped up in themselves and don't lend themselves to composition or extension.
Failed Approach #4: Write my own Modelling Tool(s)
Why they failed: Same as #3 - the lack of reusable components means any non-trivial analysis either requires fully adopting an existing project to extend it or writing from scratch.
Why they failed: Fundamentally the wrong modelling approach - too hierarchical, and most lack expressive formalization.
Failed Approach #6: Mind Mapping
All mind mapping software is either clunky or proprietary with no hope of extensibility. There's also a lack of formalization.
Failed Approach #7: Tagged Papers in something like Zotero
Managing papers/references is a plus, but I fundamentally need something more than that. Think I got some basic extensibility working at some point but quickly abandoned it.
In my ideal tool I would be able to relate e.g. a set of papers about a problem / a formal model of the problem / an implementation to solve the problem / free-form notes about the problem all together - in addition to linking both those parts and the whole to others.
And at some level it needs to be executable / checkable. I'm less concerned with capturing knowledge than I am with further analysing a given problem space. Markdown isn't going to cut it.
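To make that concrete, here's a minimal, purely hypothetical sketch in Rust of how such a tool might relate artifacts. None of these types or names come from an existing tool; they're just an illustration of the "everything about a problem, linked and checkable" idea.

```rust
// Hypothetical sketch only - these types don't come from any real tool.
// They illustrate "relate papers / formal models / implementations / notes
// about a problem, and link problems to each other" as plain data.

use std::collections::HashMap;

#[derive(Debug)]
enum Artifact {
    // A paper reference, e.g. a DOI or a local path.
    Paper { title: String, reference: String },
    // A formal model plus the checker that can run it (e.g. TLC for TLA+).
    FormalModel { path: String, checker: String },
    // An implementation, e.g. a crate in a Rust workspace.
    Implementation { path: String },
    // Free-form notes.
    Note { text: String },
}

#[derive(Debug)]
struct ProblemSpace {
    name: String,
    artifacts: Vec<Artifact>,
    // Links to related problem spaces, by name.
    related: Vec<String>,
}

fn main() {
    let mut spaces: HashMap<String, ProblemSpace> = HashMap::new();
    spaces.insert(
        "example-problem".into(),
        ProblemSpace {
            name: "example-problem".into(),
            artifacts: vec![
                Artifact::Paper { title: "A survey".into(), reference: "papers/survey.pdf".into() },
                Artifact::FormalModel { path: "models/example.tla".into(), checker: "tlc".into() },
                Artifact::Implementation { path: "crates/example-sim".into() },
                Artifact::Note { text: "Open question: ...".into() },
            ],
            related: vec!["another-problem".into()],
        },
    );

    // The "executable / checkable" part would mean walking each problem space
    // and invoking the relevant checker or build for every artifact.
    for space in spaces.values() {
        println!("{}: {} artifacts", space.name, space.artifacts.len());
    }
}
```

The data structure is the easy part; the last step - actually invoking the checkers and builds over the linked artifacts - is the bit that the tools I've tried don't give me.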
I make do right now with a combination of folders, custom Rust workspaces, a mess of modelling files, and my own brain, but I continually hope that there exists a tool that is much closer to my ideal than that.
Most software is a composition of layers of fossilised snapshots of the organizational understanding of the problem those layers were originally designed to solve - but we rely on it as if it is continually maintained critical infrastructure.
One of those has to give.
What a system was designed to do, what it does, what it is supposed to do, and what people use it for are 4 mostly unrelated concepts.
Asbestos problems - where you've introduced a dependency to serve a purpose, but due to "unforeseen" danger it now needs to be completely replaced - suck. But competent organizations have continuous deployment pipelines and dependency tracing. At worst, at least you know of the danger.
My ongoing rage at papers that define an adversary strong enough to compromise any participant at will, but weak enough that they are incapable of arbitrary protocol violations.
"We permit the adversary to perform all of these actions, but we assume they will never lie to the other members about it"
"We assume the adversary is omnipotent in regards to the internal state of protocol participants, but they will definitely never introduce arbitrary network delays."
Many of the major trends in crypto right now are leading to the development of structures that are fundamentally aligned with anarcho-mutualism (community ownership and control, community credit).
You have to wipe off the icky layer of rentier capitalism settling on the surface.
I started programming simple real mode operating systems in my teens and it's fluctuated as a hobby for me over the course of the last 2 decades.
In between I've built hobby emulators, (dis)assemblers, fuzzers, compilers, and uncountable weird hybrids.
I'm going to assume you know at least one high-level programming language. If you don't, then you should learn one. Any one will do. People may tell you the choice matters; it doesn't.
The basic principles you will learn in one are transferable to others.
I spent my recent evenings writing an operating system in an assembly language that I also developed to compile to a custom bytecode that I also designed to run on a virtual machine that I also implemented.
A meditation on recursive complexity and what actually makes me happy.
It is completely useless. All that work, and you can only run a few commands, and one of them is QUIT.
I have never loved a piece of software more.
The kernel is 832 lines of custom assembly. ~300 of them are dedicated to embedding binary data like font bitmaps.
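For anyone curious what that layering looks like, here's a toy sketch - not my actual VM, the opcodes and names are invented for illustration - of the bytecode-interpreter part. An assembler lowers the custom assembly into ops like these, and the virtual machine just dispatches over them.

```rust
// Hypothetical toy VM, for illustration only: a minimal dispatch loop
// over a made-up stack-based bytecode.

#[derive(Debug)]
enum Op {
    Push(i64),
    Add,
    Print,
    Quit,
}

fn run(program: &[Op]) {
    let mut stack: Vec<i64> = Vec::new();
    for op in program {
        match op {
            Op::Push(v) => stack.push(*v),
            Op::Add => {
                let b = stack.pop().expect("stack underflow");
                let a = stack.pop().expect("stack underflow");
                stack.push(a + b);
            }
            Op::Print => println!("{}", stack.last().expect("empty stack")),
            // The one command every toy shell needs.
            Op::Quit => return,
        }
    }
}

fn main() {
    // In the real stack of layers, this program would come out of the
    // custom assembler rather than being written by hand.
    run(&[Op::Push(2), Op::Push(3), Op::Add, Op::Print, Op::Quit]);
}
```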
Encrypted communication tools should be designed such that devs *don't* have access to things like "where [and when] accounts are created, how [data] travels, which [messages] are fastest to spread"
Basically this. The underlying expectation that "responsible encryption" requires some kind of metadata surveillance to be safe seems to me to be a deeply flawed narrative that can only result in greater and greater privacy harms.