Here is what I’ve learned + common misconceptions.
These two are different things, and much confusion arises when authors don’t specify which type of entropy they are talking about.
But the informational entropy of the unshuffled one is zero, while that of the shuffled one is much higher.
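A quick sketch of this, taking a deck of cards as the shuffled object (my assumption for illustration; the point holds for any shuffled/unshuffled pair):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits of a discrete distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Unshuffled deck: you know the exact order, so one outcome has probability 1.
print(shannon_entropy([1.0]))  # 0.0 bits

# Perfectly shuffled deck: all 52! orderings are equally likely,
# and the entropy of a uniform distribution over N outcomes is log2(N).
uniform_entropy = math.log2(math.factorial(52))
print(round(uniform_entropy, 1))  # ~225.6 bits
```

Same physical cards either way; what changes is your uncertainty about the arrangement.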
E.g. a more ‘ordered’ solid at a higher temperature can have higher entropy than a more ‘disordered’ liquid at a lower temperature.
With more energy input, molecules can occupy more energy states (kinetic, potential, vibrational, rotational, etc.)
The more spread out the distribution, the less predictable each event is, and the more space is required to communicate a sequence of events.
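The spread-vs-space link can be checked directly with a toy four-outcome distribution (the numbers below are my own illustrative choice):

```python
import math

def entropy_bits(probs):
    # Average number of bits per event needed by an optimal code.
    return -sum(p * math.log2(p) for p in probs if p > 0)

peaked = [0.97, 0.01, 0.01, 0.01]   # one outcome dominates
uniform = [0.25, 0.25, 0.25, 0.25]  # maximally spread out

print(entropy_bits(peaked))   # low (~0.24 bits): the sequence is highly compressible
print(entropy_bits(uniform))  # 2.0 bits: every event needs the full 2 bits
```

The peaked source is cheap to transmit because "the usual thing happened" carries almost no information; the uniform one surprises you every time.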
Entropy is ALWAYS defined subjectively because it’s a measure of uncertainty.
But if you know only an aggregate property of a system (say, its temperature), it makes sense to ask about entropy, because entropy tells you what you DON’T know about the system.
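One way to see "entropy = what the aggregate hides" is a tiny toy system (my own example, not from the thread): four coins, where the microstate is the exact head/tail sequence and the macrostate you measure is just the head count.

```python
import math
from itertools import product

# Toy system: 4 coins. A microstate is the exact sequence of H/T;
# the macrostate (the aggregate you observe) is just the number of heads.
microstates = list(product("HT", repeat=4))

# Knowing only the macrostate "2 heads", these microstates remain possible:
compatible = [m for m in microstates if m.count("H") == 2]
print(len(compatible))             # 6 microstates fit the aggregate
print(math.log2(len(compatible)))  # ~2.585 bits of missing information
```

If you knew the full microstate, the count of compatible states would be 1 and the missing information would be zero, mirroring the unshuffled-deck case.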
It’s about uncertainty. Over time, you become more uncertain about a specific system because things in it interact with each other (and you can’t keep track of everything).
I’m diving further into entropy in the coming days. It’s a fascinating subject and a cause of many misconceptions.
I definitely want to avoid thinking in analogies :) invertedpassion.com/thinking-in-an…