What's "free energy"? I don't mean energy you get for free. I mean the concept from physics: roughly, energy that you can use to do work.

More precisely, free energy is energy that you can use to do work at constant temperature. But why the fine print?

(1/n)
A red-hot rock has a lot of energy due to the random motion of its molecules. You can't do anything with this energy if the rock is in an equally red-hot furnace. You can if you put it in contact with something colder: you can boil water, make steam and drive a piston.

(2/n)
The thermal energy in a red-hot rock can't do work in an environment at the same temperature. So this energy is not "free energy".

But if the rock is moving, it has "free energy". You can do work with this energy - even in an environment at the same temperature!

(3/n)
Amazingly, there's a formula for free energy, which turns it into a precise and useful concept. It's

F = <E> - TS

where <E> is the system's expected energy, T is its temperature and S is its entropy.

(Experts will now start to complain, but I know what I'm doing.)

(4/n)
Why do I say "expected" energy?

"Expected" means "average" or "mean". We're actually doing probability theory here, since our rock (or whatever) may have randomly moving parts. Concepts like "temperature" and "entropy" also involve probabilities.

(5/n)
What's the basic idea of

F = <E> - TS ?

I like to say: free energy is the energy minus the energy due to being hot. The "energy due to being hot" is temperature times entropy.

(6/n)
But what's really going on here? In which situations does "free energy" make sense?

It's very general. We can define free energy whenever we have a finite set X with a probability distribution p and a real-valued function E on it, and a number T called "temperature".

(7/n)
We can define the "entropy" S of a probability distribution p on a finite set X. It's

S = -sum p(i) log(p(i))

where we sum over all points i of X. This is biggest when p is smeared-out and flat, smallest when p is zero except at one point. It measures randomness.

(8/n)
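To make this concrete, here's a minimal sketch in Python (the three-point distribution and the function name are just illustrations), using the natural log:

import math

def entropy(p):
    # S = -sum_i p(i) log(p(i)); terms with p(i) = 0 contribute
    # nothing, since x log(x) -> 0 as x -> 0.
    return -sum(q * math.log(q) for q in p if q > 0)

print(entropy([1/3, 1/3, 1/3]))  # flat: log(3) ~ 1.0986, the biggest
print(entropy([1.0, 0.0, 0.0]))  # all on one point: 0.0, the smallest

Any base of logarithm works here; changing it just rescales S by a constant.
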
We can also define the "expected value" of any function E: X -> R when we have a probability distribution p on a finite set X. It's

<E> = sum p(i) E(i)

where we sum over all points i of X. This is just the average value of E, weighted by the probability of each point.

(9/n)
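The same kind of sketch in Python, with a made-up energy function on a three-point set:

def expected_value(p, E):
    # <E> = sum_i p(i) E(i): the average of E, weighted by probability.
    return sum(q * e for q, e in zip(p, E))

print(expected_value([0.5, 0.3, 0.2], [0.0, 1.0, 2.0]))
# 0.5*0.0 + 0.3*1.0 + 0.2*2.0 = 0.7
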
So, now you know the definition of the "free energy"

F = <E> - TS

for any number T, any real-valued function E: X -> R and any probability distribution p on any finite set X.

Learning why this is so great takes longer! You need to learn what you can do with it.

(10/n)
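One thing you can already do is compute it. Putting the two sketches together (the numbers are still made up; the formula is the one above):

import math

def free_energy(p, E, T):
    S = -sum(q * math.log(q) for q in p if q > 0)  # entropy
    avg_E = sum(q * e for q, e in zip(p, E))       # expected energy <E>
    return avg_E - T * S                           # F = <E> - TS

p = [0.5, 0.3, 0.2]  # a probability distribution on a 3-point set
E = [0.0, 1.0, 2.0]  # a real-valued function ("energy") on that set
print(free_energy(p, E, T=1.0))
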
If you want to learn a tiny bit more about why free energy is important, try Section 1 of this paper:

https://arxiv.org/abs/1311.0813 

Here Blake Pollard and I quickly explain why a system in equilibrium at some temperature T will minimize its free energy.

(11/n, n = 11)
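If you want to check that minimization numerically: the minimizing distribution at temperature T turns out to be the Boltzmann distribution, where p(i) is proportional to exp(-E(i)/T). Here's a sketch comparing it with two other distributions, using the same made-up energies as above:

import math

def free_energy(p, E, T):
    S = -sum(q * math.log(q) for q in p if q > 0)
    return sum(q * e for q, e in zip(p, E)) - T * S

E, T = [0.0, 1.0, 2.0], 1.0

w = [math.exp(-e / T) for e in E]    # Boltzmann weights exp(-E(i)/T)
boltzmann = [x / sum(w) for x in w]  # normalize to a distribution

for p in (boltzmann, [1/3, 1/3, 1/3], [1.0, 0.0, 0.0]):
    print(p, free_energy(p, E, T))   # the Boltzmann one gives the smallest F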