Creationism: the one instance where it's considered perfectly normal behaviour to brutally beat a patch of grass where twenty years ago, there used to be a horse.

Monday, June 23, 2008

Thermodynamics for Creationists

Okay, so in a recent discussion, the second law of thermodynamics came up. Honestly, I can't believe that I'm having to point this out again, but the truth of the matter is that the second law of thermodynamics in no way prohibits evolution. All the second law of thermodynamics states is that the total entropy of the universe must increase over time.

So let's start with the basics. What is entropy?

The statistical-mechanical definition of entropy is S = k*ln(multiplicity).

Okay, what is multiplicity?

Let's start with a simple conceptual example, which, surprisingly enough, turns out to have a real-world thermodynamic analogue: the two-state paramagnet.

Let's say you flip four coins.

How many possible ways are there for it to come up all tails? 1.
How many ways are there for it to come up with exactly one heads? 4.
How many ways for two heads? 6.
How many ways for three heads? 4.
And for all four heads? 1.

Those numbers are what are called multiplicities.

Let's look at that again. Define a microstate as a specific, fully-detailed state. That is, tails-heads-tails-tails would be one microstate; heads-tails-tails-tails would be a different microstate; but they're both part of the same macrostate (one heads). The fundamental assumption of statistical mechanics is that, within an isolated system, all accessible microstates (all tails, one heads, and so on, spelled out coin by coin) are equally probable. So, assuming that your coins are fair, every microstate is exactly as likely as every other.

In other words, multiplicity is effectively a measure of probability: the multiplicity of a macrostate divided by the total number of microstates of the system gives the probability of that macrostate.
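The coin-flip multiplicities are just binomial coefficients, so they're easy to check in code. Here's a quick sketch in Python (my choice of language; the post itself contains no code):

```python
from math import comb

N = 4  # four coins, as in the example above

# Multiplicity of the "n heads" macrostate is the binomial
# coefficient C(N, n): the number of microstates (orderings)
# that produce exactly n heads.
multiplicities = [comb(N, n) for n in range(N + 1)]
print(multiplicities)  # [1, 4, 6, 4, 1]

# There are 2^N equally likely microstates in total, so the
# probability of a macrostate is its multiplicity divided by 2^N.
total_microstates = 2 ** N
probabilities = [m / total_microstates for m in multiplicities]
print(probabilities[2])  # two-heads macrostate: 6/16 = 0.375
```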

Back to our definition of entropy.

S = k*ln(multiplicity)... The logarithm is a monotonic function, so the bigger the multiplicity, the bigger S will be. k is Boltzmann's constant.

This is what entropy essentially is. The reason we say it tends to increase is that, as long as the fundamental assumption of statistical mechanics holds, higher-entropy states correspond directly to higher-probability states. That is, the high-entropy states are literally the most likely states of reality. The key point here, however, is total entropy.
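To see how strongly the high-entropy macrostates dominate, here's a short Python sketch using the S = k*ln(multiplicity) definition above (the 100-coin comparison is my own illustration, not from the post):

```python
from math import comb, log

k_B = 1.380649e-23  # Boltzmann's constant, in J/K

def entropy(multiplicity):
    # S = k * ln(multiplicity), as defined above
    return k_B * log(multiplicity)

# For 100 fair coins, compare the 50-heads macrostate with the
# 10-heads macrostate: the higher-multiplicity (higher-entropy)
# macrostate is enormously more probable.
s_50 = entropy(comb(100, 50))
s_10 = entropy(comb(100, 10))
print(s_50 > s_10)  # True: more microstates -> higher entropy
print(comb(100, 50) // comb(100, 10))  # many trillions of times more likely
```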

Systems can easily take an entropy drop in one area so long as there's an at-least-as-large entropy increase elsewhere. In fact, that's pretty much exactly how your refrigerator works.

First, consider two systems, one with multiplicity A, the other with multiplicity B. What's the total multiplicity of the combined system? A*B, right?

So the total entropy would be the sum of the two entropies, since ln(A*B) = ln(A) + ln(B).
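That additivity is just a property of logarithms, and it's trivial to verify (the specific multiplicities here are arbitrary examples of mine):

```python
from math import log, isclose

# Two independent systems with multiplicities A and B have combined
# multiplicity A*B; the log turns that product into a sum, which is
# why entropies add.
A, B = 6.0, 4.0  # arbitrary example multiplicities
print(isclose(log(A * B), log(A) + log(B)))  # True
```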

Accept that so far?

Great. Now, let's define temperature. Temperature is not actually defined as being proportional to the energy of an object; that it often behaves that way is a nontrivial fact about temperature. Temperature nowadays is defined thermodynamically as the reciprocal of the partial derivative of entropy with respect to energy — 1/T = ∂S/∂E — with volume and number of particles (and external magnetic field and other such things, when relevant) held constant.

Now, what does that actually mean? It means an object for which a slight change in energy produces a huge change in entropy (in the same direction) has a low temperature, while one for which a big change in energy produces only a small change in entropy has a high temperature, right? This also works backwards: if two objects, one hot and one cold, each gain or lose the same amount of energy, the cold one will undergo a greater change in entropy than the hot one. That's pretty much the definition of the term. Again, if you're having trouble with this, I recommend you pick up a good thermodynamics text, such as Kittel and Kroemer's "Thermal Physics." I don't know which edition it's in now, but you should be able to find it on Amazon.
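To make the definition concrete, here's a Python sketch that estimates temperature from 1/T = ∂S/∂E. It uses an Einstein solid as a toy model; the model choice and the specific numbers are my assumptions for illustration, not the post's:

```python
from math import comb, log

def S(N, q):
    # Entropy (in units of k) of an Einstein solid with N oscillators
    # and q energy quanta; its multiplicity is C(q + N - 1, q).
    return log(comb(q + N - 1, q))

def temperature(N, q):
    # 1/T = dS/dE, approximated by a finite difference of one quantum.
    # Returns T in units of (one energy quantum / k).
    dS = S(N, q + 1) - S(N, q)
    return 1.0 / dS

N = 50
# The low-energy (cold) solid gains much more entropy per quantum of
# added energy than the high-energy (hot) one, so its temperature is lower.
print(temperature(N, 10) < temperature(N, 100))  # True
```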

So place those two objects together. What's going to happen? Well, the most probable states are those that correspond to the highest entropy, right?

So... if energy goes from the cold object to the hot object, a lot of entropy will be lost by the cold object, while only a little entropy will be gained by the hot object: a net decrease in total entropy.

In the other direction, there'd obviously be a net increase in entropy. This is why energy flows from hot to cold. And, as I mentioned, it's a nontrivial fact.
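This hot-to-cold argument can be checked numerically with a toy model: two Einstein solids exchanging one quantum of energy (the model and the numbers are my choices for illustration):

```python
from math import comb, log

def S(N, q):
    # Entropy (in units of k) of an Einstein solid with N oscillators
    # and q energy quanta: ln of its multiplicity C(q + N - 1, q).
    return log(comb(q + N - 1, q))

N = 50
q_cold, q_hot = 10, 100  # the cold solid holds less energy

before = S(N, q_cold) + S(N, q_hot)
# Move one quantum of energy from the hot solid to the cold one...
hot_to_cold = S(N, q_cold + 1) + S(N, q_hot - 1)
# ...and, for comparison, one quantum the "wrong" way.
cold_to_hot = S(N, q_cold - 1) + S(N, q_hot + 1)

print(hot_to_cold > before)  # True: hot-to-cold raises total entropy
print(cold_to_hot < before)  # True: cold-to-hot would lower it
```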

All of this is to point out the obvious: local decreases in entropy are not only possible, they happen routinely in nature. Take crystal formation, for example, or the operation of your refrigerator. All the second law of thermodynamics states is that the total entropy of the universe has to go up. It doesn't matter if the entropy of a living being, or even of the entire planet Earth, drops, as long as there's an at-least-as-large increase in entropy elsewhere.


Me.

I'm a 20-something post-doc with the Center for Cardiovascular Research in Honolulu, Hawaii. I study the transcriptional effects of HIF-1 in cardiomyocytes and my work centers around developing a novel mathematical model describing the downstream effects of HIF-1. I've been interested in the ID/Creationism/Evolution debate for some time now, and I thought it was about time that I put some of my thoughts out there in a way that people would actually consider reading.