1. Sorry, I was too busy to reply, but here's my non-pornographic take :). I think physicists (egged on by popularizers) have deliberately mythologized the issues around these topics, usually by citing Feynman, Einstein, etc.
2. The larval Lagrangian, and its adult form the action (the integral of L dt), have nothing to do with thermodynamics a priori. They are only a prescription for generating equations from a compact algebraic form.
3. The action principle is subtle b/c it looks like, and is often represented as, a dynamical quantity,
but that's misleading. It's really "just" a generating functional. A generating function(al) is an algebraic form
which generates something when you hit it with an operation.
4. You can write actions/Lagrangians for all kinds of things, not just T-V. And the reason the generating function works for energy etc. is essentially differential geometry: the derivative operation seeks out gradient landscapes, things rolling in potential wells, etc.
5. So, when you hit the action with a variation of the dynamical variable, it generates the equation of motion
for that variable, by deconstructing the shape of how all the energy transactions add up.
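A minimal sketch of point 5 in sympy, assuming a harmonic oscillator L = T - V (the example and names are my choices, purely for illustration):

    import sympy as sp

    t = sp.symbols('t')
    m, k = sp.symbols('m k', positive=True)   # illustrative mass and spring constant
    x = sp.Function('x')                      # the dynamical variable x(t)

    # The compact algebraic form: L = T - V
    L = sp.Rational(1, 2)*m*x(t).diff(t)**2 - sp.Rational(1, 2)*k*x(t)**2

    # "Hitting" the action with a variation of x(t) generates the
    # Euler-Lagrange equation: d/dt(dL/dx') - dL/dx = 0
    eom = sp.diff(L, x(t).diff(t)).diff(t) - sp.diff(L, x(t))
    print(sp.simplify(eom))   # k*x(t) + m*Derivative(x(t), (t, 2)), i.e. m x'' = -k x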
6. When you hit the action with a variation of space, it generates momentum. And the continuity of variables in space becomes a proxy for the conservation or continuity of momentum.
7. When you hit the action with a variation of time, it generates the total energy. And the continuity of the variables in time implies the conservation of energy. So variables are not allowed to change discontinuously (e.g. at a boundary) else energy is not conserved.
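In standard notation, a sketch of the bookkeeping behind points 6 and 7, for a Lagrangian L(q, q-dot, t):

    p = \frac{\partial L}{\partial \dot q}, \qquad \frac{dp}{dt} = \frac{\partial L}{\partial q} = 0 \quad \text{(spatial translation invariance)}

    E = \dot q \, \frac{\partial L}{\partial \dot q} - L, \qquad \frac{dE}{dt} = -\frac{\partial L}{\partial t} = 0 \quad \text{(time translation invariance)}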
8. It's beautiful, but not magical. Note that the total energy is not completely defined, because it's an arbitrary accounting parameter for transactions, fixed only up to a constant offset.
9. Now, this last part gives you a clue about any system, from a single material particle to a gas or bulk thermodynamic system. And this brings us to the macroscopic matter of *energy distribution* and histograms.
10. Entropy began with some mysticism, as a way of getting energy equations to balance in heat engines. It vaguely represented the amount of energy that had become unavailable to convert into work, because it was too evenly distributed. You need a rich|poor boundary to do work.
11. A detail in passing: entropy doesn't have the dimensions of energy, and heat is not an exact function of the state variables, so temperature was found to act as an effective integrating factor (dS = δQ/T), turning entropy into the "non-free" energy TdS. Continuity of temperature gives equilibrium.
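In symbols (the standard bookkeeping): heat is not an exact differential, but 1/T makes it one:

    dS = \frac{\delta Q_{\mathrm{rev}}}{T}, \qquad dU = T\,dS - p\,dV

so TdS is the energy locked up in the "too evenly distributed" form.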
12. The brilliant Boltzmann discovered that you could get the right answer for this fraction by counting the various ways the system could move energy around (its degrees of freedom).
12a. Pretty clever, and he invented a new model of entropy based on counting all the microscopic states compatible with a bulk state, thus inventing statistical mechanics.
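That counting model is the famous

    S = k_B \ln W

where W is the number of microscopic arrangements compatible with the observed bulk state.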
13. Later, Shannon (of information theory) showed that entropy is actually just a summary characteristic and generating function for any probability distribution, and von Neumann found similar uses in QM.
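Shannon's version, for any distribution p_i:

    H = -\sum_i p_i \ln p_i

which reduces to Boltzmann's S/k_B = \ln W when all W states are equally likely (p_i = 1/W).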
14. Cleverly, if you vary the entropy with respect to the probabilities, holding the expectation value of some variable constant, you derive the Boltzmann distribution when that variable is energy: exp(-E/kT). That's because entropy also acts as a generating function(al).
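A sketch of that variation, using Lagrange multipliers \alpha, \beta for the two constraints (normalization and fixed \langle E \rangle):

    \frac{\partial}{\partial p_i}\Big[ -\sum_j p_j \ln p_j - \alpha\Big(\sum_j p_j - 1\Big) - \beta\Big(\sum_j p_j E_j - \langle E\rangle\Big) \Big] = 0
    \quad\Rightarrow\quad p_i = \frac{e^{-\beta E_i}}{Z}, \qquad Z = \sum_j e^{-\beta E_j}

with \beta = 1/k_B T fixed by the energy constraint.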
15. I hope some of this is helpful. It's important to separate statistical averages (foliation) from underlying causal phenomena. They are related, but the statistical tail doesn't wag the dog. Boundaries are the way to do work from statistics: pistons, & wagging puppydog tails.
1. Here is a short weekend thread about the social brain hypothesis, to which I have been fortunate to make a small contribution from the Promise Theory of trust.
2. The Social Brain Hypothesis is the idea that the human neocortex evolved in response to the advantages of living in groups and managing social relationships. The evidence for this is quantitative, but it remains controversial due to ingrained moral and religious philosophy.
3. Interest in the cognitive abilities of animal groups traces back to observations by H. Jerison, who noted that animals fall into classes associated with their brain-to-body mass ratio. The correlation was not perfect, because what matters is brain mass relative to what's expected for the body mass.
1. It's config management time, so here is a thread about configuration languages in IT. Config languages are not unique to IT; they exist in all areas of science and engineering to describe layout or "state".
2. In physics, we talk about bodies and particles at coordinates or trajectories in an arena of space and time (sometimes described as a graph). In IT, for space we have computers, directories, and files or databases. For time we have the UTC clock or we have transaction numbers.
3. Issues like portability, consistency, and redundancy are described by ideas like translational invariance and symmetries in the mathematical sciences. We tend not to speak of these concepts in computer science, because there is a culture of taking machines for granted.
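A toy sketch of the idea, in Python rather than any real config language (the file name and contents are invented): a config is a declaration of desired state at some coordinates, plus an engine that converges the observed state toward it.

    import os

    desired_state = {
        "/tmp/example.conf": "port = 8080\n",   # hypothetical coordinate and value
    }

    def converge(state):
        for path, content in state.items():
            # Act only where the observed state differs from the declared one
            current = open(path).read() if os.path.exists(path) else None
            if current != content:
                with open(path, "w") as f:
                    f.write(content)

    converge(desired_state)   # idempotent: running it again changes nothing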
1. Segal's Law: "If you have one clock, you know what time it is. If you have more than one, you're not sure." An eyebrow later, everyone gets this. Yet, in distributed computing, we don't always stop to think about what it means for time-sensitive operations.
2. Your phone app knows your latest tweets, but if you check a web browser too, you're no longer sure. Probably the phone and the browser are directed to different servers that are not quite in sync. It might take minutes to see the same up-to-date content in the browser.
3. Suppose you ask, "What's my bank balance right now?" A "balance" is a temporal snapshot. When is "now"? For whom, and where? At first we tend to think of only one person looking at the data, but payments are made asynchronously, at different times, in different locations.
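A toy Python sketch of Segal's Law for data (all names and numbers invented): two replicas of a balance that apply a shared log at different times.

    log = []   # shared, append-only transaction log

    class Replica:
        def __init__(self):
            self.balance = 0
            self.applied = 0   # how much of the log this replica has seen

        def sync(self):
            # Replicas catch up lazily, each on its own schedule
            for amount in log[self.applied:]:
                self.balance += amount
            self.applied = len(log)

    phone, browser = Replica(), Replica()
    log.append(100)   # a payment arrives
    phone.sync()      # only the phone has caught up
    print(phone.balance, browser.balance)   # 100 0 -> two clocks, two answers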
1. This is a thread about Quantum Mechanics and how we interpret it. The way QM is presented bothers me from time to time (at least when I have time to think about it). Indeed, time plays the key role in the whole issue.
2. This is probably not an evil conspiracy (unless there is!!), but perhaps people get stuck because of the history, the Bohr doctrine, general stubbornness, or hiding behind mathematics that doesn't hold all the answers. Anyway, here goes.
3. A short and oversimplified parody of the way QM is presented is this: classical means any system described by properties at rest, e.g. a coin showing either heads or tails, but never both at the same time. Quantum is "everything everywhere all at once" and can't be understood!
1. Let's think more about multi-client consistency. Data ledgers are shared transaction logs used as systems of record (monolithic or distributed). IT treats data sharing as a technical infrastructure problem, but it's really about scale-dependent behavioural design. (N=46)
2. Ledgers keep track of goods, services, and payments in buying and selling. The approach is used in many other scenarios too. It seems like keeping account should be easy. We've been doing it for millennia. Someone pays, someone delivers, but it's not just data: it's a process.
3. Whether money or barter, we add each exchange to an unfinished list, because closure isn't instantaneous or atomic. Each time you decide whether you have enough to pay or not. Money is just a networking technology. Some packets exchanged are small, some large.
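A minimal Python sketch of that idea (names invented): the ledger is an append-only list of exchanges, and a "balance" is not stored anywhere; it's a summary obtained by replaying the process.

    ledger = []

    def record(frm, to, amount):
        ledger.append((frm, to, amount))   # old entries are never mutated

    def balance(party):
        credits = sum(a for _, to, a in ledger if to == party)
        debits = sum(a for frm, _, a in ledger if frm == party)
        return credits - debits

    record("alice", "bob", 50)
    record("bob", "alice", 20)
    print(balance("bob"))   # 30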
1. This is the second thread about databases and distributed systems (still a headache for enterprises). Part 2: thinking about data consistency using car parking/rental as an analogy. Replication is used for backups and also to argue in favour of "digital twins". (N=55)
2. Suppose we think of a database roughly as a parking/rental service for data, reading is borrowing and writing is parking. We have short stay parking, long stay parking, valet parking etc. See if this helps us to demystify data "consistency".
3. "Consistency" ultimately means that what we see follows a pattern calibrated to some reference source of truth. We can speak about i) self-consistency of a model or process, ii) consistency of observations with a model, iii) consistency of copies with an source orgin.