
Thank you for your interesting question, which led me to quite a nice little result. I couldn't find a nice thermodynamic meaning for the cross entropy, but I did find one for a related quantity called the Kullback-Leibler divergence. As far as I know, this is a new result.

The cross entropy between two probability distributions is defined as
$$
H(P, Q) = -\sum_i p_i \log q_i.
$$
These two probability distributions should both refer to the same set of underlying states. Normally in thermodynamics we think of a system as having only one probability distribution, which represents (roughly) the range of possible states the system might be in at the present time. But systems can change over time. So let's imagine we have a system (with constant volume) that's initially in equilibrium with a heat bath at a temperature $T_1$. According to the usual principles of statistical mechanics, its state can be represented by the probability distribution
$$
p_i = \frac{1}{Z_1} e ^{-\beta_1 u_i},
$$
where $\beta_1=1/T_1$ (I've set Boltzmann's constant equal to 1 for clarity) and $Z_1$ is a normalisation factor called the partition function. The $u_i$ are the energy levels of the system's permitted microscopic states.
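As a minimal numerical sketch of this distribution (the three energy levels below are made up for illustration; they are not from the question):

```python
import numpy as np

def boltzmann(u, T):
    """Boltzmann distribution p_i = exp(-u_i / T) / Z, with k_B = 1."""
    weights = np.exp(-np.asarray(u, dtype=float) / T)
    return weights / weights.sum()  # dividing by Z normalises the weights

u = [0.0, 1.0, 2.0]        # illustrative energy levels (arbitrary units)
p = boltzmann(u, T=1.0)    # equilibrium distribution at T1 = 1
# lower energy levels are more probable, and the probabilities sum to 1
```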

Now let's imagine we pick up our system and put it in contact with a heat bath at a different temperature, $T_2$, and let it come to equilibrium again. Since no work has been done, all the $u_i$ values will be unchanged and we'll have a new distribution that looks like this:
$$
q_i = \frac{1}{Z_2} e ^{-\beta_2 u_i},
$$
where $\beta_2 = 1/T_2$.

It's a standard result from statistical mechanics that
$$
S_2 = H(Q) = \log(Z_2) + U_2\beta_2.
$$
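This identity is easy to check numerically. A hedged sketch, again with illustrative energy levels of my own choosing:

```python
import numpy as np

u = np.array([0.0, 1.0, 2.0])    # illustrative energy levels (k_B = 1)
T2 = 2.0
beta2 = 1.0 / T2
Z2 = np.exp(-beta2 * u).sum()    # partition function at T2
q = np.exp(-beta2 * u) / Z2      # equilibrium distribution at T2

S2 = -(q * np.log(q)).sum()      # Gibbs/Shannon entropy H(Q)
U2 = (q * u).sum()               # mean energy under Q
# the standard identity S2 = log(Z2) + beta2 * U2 holds to machine precision
assert np.isclose(S2, np.log(Z2) + beta2 * U2)
```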
Substituting $q_i$ into the cross entropy formula gives $H(P,Q) = \log(Z_2) + \beta_2 U_1$, where $U_1 = \sum_i p_i u_i$ is the mean energy under $P$. Solving the identity above for $\log(Z_2)$ and substituting, we have
$$
H(P,Q) = S_2 - \beta_2(U_2 - U_1) = S_2 - \frac{U_2 - U_1}{T_2}.
$$
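The same kind of numerical check works for this formula too (energy levels and temperatures are illustrative assumptions, not values from the question):

```python
import numpy as np

u = np.array([0.0, 1.0, 2.0])                 # illustrative energy levels
T1, T2 = 1.0, 2.0
b1, b2 = 1.0 / T1, 1.0 / T2
p = np.exp(-b1 * u) / np.exp(-b1 * u).sum()   # equilibrium distribution at T1
q = np.exp(-b2 * u) / np.exp(-b2 * u).sum()   # equilibrium distribution at T2

H_pq = -(p * np.log(q)).sum()                 # cross entropy H(P, Q)
S2 = -(q * np.log(q)).sum()                   # entropy of Q
U1, U2 = (p * u).sum(), (q * u).sum()         # mean energies under P and Q
# cross entropy equals S2 - (U2 - U1) / T2, as derived above
assert np.isclose(H_pq, S2 - b2 * (U2 - U1))
```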
Physicists are, generally speaking, afraid of any quantity that has entropy units, and if they see one they like to multiply it by a temperature in order to make it look like an energy. If we multiply this by $T_2 = 1/\beta_2$ we get
$$
T_2H(P,Q) = T_2S_2 - \Delta U.
$$
It's possible that this might have a nice thermodynamic interpretation in terms of something like the maximum amount of work that we can extract from doing this transformation under a particular set of circumstances --- but if it does then I haven't seen it just yet. The expression looks tantalisingly like a change in free energy ($\Delta U - T\Delta S$), but it's not quite the same.

However, we can get a much more interesting result if we note that in information theory, the Kullback-Leibler divergence (aka information gain) is often seen as more fundamental than the cross entropy. The KL-divergence is defined as
$$
D_{KL}(P\|Q) = \sum_i p_i \log\frac{p_i}{q_i} = H(P,Q)-H(P),
$$
which in our case is equal to
$$
S_2 - S_1 - \frac{U_2-U_1}{T_2} = \Delta S - \Delta U/T_2.
$$
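We can verify this identity numerically as well, in the same illustrative toy setting as before (three made-up energy levels, arbitrary temperatures):

```python
import numpy as np

u = np.array([0.0, 1.0, 2.0])                 # illustrative energy levels
T1, T2 = 1.0, 2.0
p = np.exp(-u / T1) / np.exp(-u / T1).sum()   # equilibrium at T1
q = np.exp(-u / T2) / np.exp(-u / T2).sum()   # equilibrium at T2

D_kl = (p * np.log(p / q)).sum()              # KL divergence D(P || Q)
S1, S2 = -(p * np.log(p)).sum(), -(q * np.log(q)).sum()
U1, U2 = (p * u).sum(), (q * u).sum()
# total entropy change: system (S2 - S1) plus bath (-(U2 - U1) / T2)
assert np.isclose(D_kl, (S2 - S1) - (U2 - U1) / T2)
```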
This is much more interesting than the result for the cross-entropy, because it does have a clear thermodynamic interpretation. When we put the system in contact with the second heat bath, its entropy changes by $\Delta S$, and the entropy of the heat bath changes by $-\Delta U/T_2$. (This is because entropy is heat divided by temperature - an amount $\Delta U$ leaves the system, so $-\Delta U$ enters the heat bath.) So the KL-divergence is just the total change in entropy after we put the system in contact with the new heat bath. I'm quite excited about this because I didn't know it before, and I don't think anyone else did either!

We can even take this a bit further. Let's imagine putting a heat engine in between the system and the second heat reservoir, so that we try to extract some useful work from the flow of heat that takes place as the system and the heat bath equilibrate. If we do this the total change of entropy becomes $\Delta S + (-\Delta U - W)/T_2$. By the second law this has to be non-negative, which means that $W\le T_2\Delta S - \Delta U$.

Now, if we do that physicist thing of multiplying $D_{KL}$ by $T_2$, it becomes $T_2\Delta S - \Delta U$, which is the value for the maximum work that we just calculated. So while the thermodynamic meaning of the cross-entropy isn't clear to me, the KL-divergence does seem to have a nice interpretation in terms of work.

Great. It is exactly the answer that I was looking for.
–
Emanuele LuzioApr 21 '12 at 14:03

I have a doubt. If $D_{KL}$ represents the maximum work we can extract from two thermodynamic baths, why is it not symmetric? Namely, $D_{KL}(Q\|P) \ne D_{KL}(P\|Q)$.
–
Emanuele LuzioNov 7 '13 at 16:11

@EmanueleLuzio it's because the heat baths don't play symmetrical roles: we put the system first in contact with one and then the other. If we did it in the opposite order the maximum work would be $T_1D_{KL}(Q\|P)$ instead.
–
NathanielNov 7 '13 at 17:03

Sorry, but I don't understand. You are telling me that if I put a system at $T_1$ in contact with a bath at $T_2$, the work I can extract is different from putting a system at $T_2$ in contact with a bath at $T_1$?
–
Emanuele LuzioNov 7 '13 at 17:29

@EmanueleLuzio in general yes, it can be different. It's because the waste heat from the heat engine has to go into the heat bath. The temperature of the waste heat bath determines the maximum efficiency of the engine, which is different in the two cases.
–
NathanielNov 8 '13 at 2:35