1/ I’ve been trying to understand ENTROPY for the last couple of days.

Here is what I’ve learned + common misconceptions.
2/ There are two types of entropy: thermodynamic entropy and informational entropy.

These two are different things, and a lot of confusion happens when authors don’t specify which type of entropy they are talking about.
3/ For example, the thermodynamic entropy of a shuffled card deck is exactly the same as that of an unshuffled one.

But the informational entropy of an unshuffled deck is zero, while that of a shuffled deck is much higher.
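
To put a rough number on “much higher”: assuming a well-shuffled deck is equally likely to be in any of its 52! orderings (an idealization, not something stated above), a quick Python sketch gives about 226 bits.

```python
import math

# A known (unshuffled) ordering has probability 1, so its informational
# entropy is 0 bits. A well-shuffled deck, assumed uniform over all 52!
# orderings, has entropy log2(52!).
orderings = math.factorial(52)
bits = math.log2(orderings)

print("Unshuffled (known ordering): 0 bits")
print(f"Shuffled: ~{bits:.0f} bits")  # ~226 bits
```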
4/ The idea of ‘entropy is disorder’ is misleading (and is getting removed from textbooks).

E.g. a more ‘ordered’ solid at a higher temperature can have higher entropy than a more ‘disordered’ liquid at a lower temperature.
5/ Thermodynamic entropy is simply a measure of the number of ways energy can be stored in a system.

With more energy input, molecules can occupy more energy states (kinetic, potential, vibrational, rotational, etc.).
6/ So with higher temperature, there is more uncertainty about how exactly energy is stored in that system, hence higher entropy.
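
This “count the ways” picture is essentially Boltzmann’s relation from statistical mechanics, where W is the number of microstates, i.e. the distinct arrangements of the system’s energy that are consistent with what you know about it:

```latex
% Boltzmann's relation: more accessible microstates (larger W)
% means higher entropy S. k_B is Boltzmann's constant.
S = k_B \ln W
```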
7/ Similarly, when a gas expands, there is more uncertainty about the positions of the molecules (and hence more ways energy can be stored in the system, and more uncertainty about how it is stored). So as volume increases, entropy increases.
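
As a concrete sketch of the volume effect: for an ideal gas expanding at constant temperature, the standard result is ΔS = nR·ln(V_final/V_initial). The numbers below (1 mol, volume doubling) are purely illustrative.

```python
import math

R = 8.314  # gas constant, J/(mol*K)

def delta_S_isothermal(n_moles, v_initial, v_final):
    """Entropy change of an ideal gas expanding at constant temperature:
    dS = n * R * ln(V_final / V_initial)."""
    return n_moles * R * math.log(v_final / v_initial)

# Doubling the volume of 1 mol of ideal gas: each molecule has more
# possible positions, so more uncertainty and higher entropy.
print(f"{delta_S_isothermal(1.0, 1.0, 2.0):.2f} J/K")  # ~5.76 J/K
```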
8/ Informational entropy is a measure of how ‘expensive’ it is to communicate a sequence of events generated from a probability distribution.

The more spread out the distribution, the less predictable each event is, and the more space is required to communicate a sequence of events.
9/ The more ‘peaky’ the distribution is, the more likely you are to get certain events, and the less total space is required to communicate a sequence of events (you can encode more efficiently).
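
A small sketch of 8–9 using Shannon’s formula H = −Σ p·log2(p), which gives the average bits per event under an optimal code (the example distributions are made up for illustration):

```python
import math

def shannon_entropy(probs):
    """Average bits per event under an optimal code: H = -sum(p * log2(p))."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

spread_out = [0.25, 0.25, 0.25, 0.25]  # every event equally likely
peaky      = [0.97, 0.01, 0.01, 0.01]  # one event dominates
certain    = [1.0, 0.0, 0.0, 0.0]      # outcome known in advance

print(shannon_entropy(spread_out))  # 2.0 bits/event -- most expensive to communicate
print(shannon_entropy(peaky))       # ~0.24 bits/event -- encodes efficiently
print(shannon_entropy(certain))     # 0.0 bits -- nothing left to say (see 13 below)
```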
10/ E.T. Jaynes linked the two measures of entropy by suggesting that informational entropy gives rise to thermodynamic entropy in physics: the state of the molecules (velocity, position) is a random variable, and temperature and pressure are summary statistics of those probability distributions.
11/ This link is not widely accepted, and unless you understand the nuances, for all practical purposes thermodynamic entropy has a very different meaning than informational entropy.
12/ Precisely speaking, it doesn’t make sense to ask what the entropy of a system is.

Entropy is ALWAYS defined subjectively because it’s a measure of uncertainty.
13/ If you know everything precisely about a system, its entropy is ZERO no matter how disorderly it may seem.

But if you only know an aggregate number for a system (say, its temperature), it makes sense to ask about its entropy (because it tells you what you DON’T know about the system).
14/ Also the second law of thermodynamics is not that disorder will increase or things will crumble.

It’s about uncertainty. Over time, you become more uncertain about a specific system because things in it interact with each other (and you can’t keep track of everything).