How is entropy defined mathematically? This is a question I have had all the way since 11th grade. I remember that when we were doing Alchemy in high school, the teacher said there is a mathematical definition for entropy. But since I knew he was an alchemist and not a mathematician, I never asked. I am sure it is hard; I assume it has to do with Boltzmann's constant, maybe?

December 26th 2007, 06:11 PM

Jhevon

Quote:

Originally Posted by ThePerfectHacker

How is entropy defined mathematically? This is a question I have had all the way since 11th grade. I remember that when we were doing Alchemy in high school, the teacher said there is a mathematical definition for entropy. But since I knew he was an alchemist and not a mathematician, I never asked. I am sure it is hard; I assume it has to do with Boltzmann's constant, maybe?

you did alchemy?

(i don't know the mathematical definition of entropy. i learnt about it in a general chemistry class, and this was basically what was said. it seems too frivolous to be mathematically defined.)

December 26th 2007, 06:17 PM

CaptainBlack

Quote:

Originally Posted by ThePerfectHacker

How is entropy defined mathematically? This is a question I have had all the way since 11th grade. I remember that when we were doing Alchemy in high school, the teacher said there is a mathematical definition for entropy. But since I knew he was an alchemist and not a mathematician, I never asked. I am sure it is hard; I assume it has to do with Boltzmann's constant, maybe?

There are at least two types of entropy: thermodynamic entropy and Shannon's information entropy. You will find the expressions for both on the Wikipedia page for entropy, the first under "Microscopic definition of entropy" and both under "Entropy and information theory".
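For convenience, here are the two expressions in standard notation (k_B denotes Boltzmann's constant):

S = k_B ln(Ω), the microscopic (Boltzmann) definition, where Ω is the number of microstates consistent with the macroscopic state, and

H(X) = -Σ_x p(x) log_2 p(x), Shannon's entropy of a discrete random variable X with probability mass function p.

The two are formally related: the Gibbs entropy S = -k_B Σ_i p_i ln(p_i), with p_i the probability of microstate i, has exactly Shannon's form up to the constant. As a quick illustration of the Shannon formula (a minimal sketch, not from the thread), in Python:

import math

def shannon_entropy(probs):
    # Entropy in bits of a discrete distribution:
    # H = -sum of p * log2(p), with the convention 0 * log(0) = 0.
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))  # fair coin: 1.0 bit
print(shannon_entropy([0.9, 0.1]))  # biased coin: about 0.47 bits

So Boltzmann's constant does appear, but only in the thermodynamic version; Shannon's definition is a pure function of the probabilities.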

RonL

December 27th 2007, 04:15 AM

colby2152

Quote:

Originally Posted by CaptainBlack

There are at least two types of entropy: thermodynamic entropy and Shannon's information entropy. You will find the expressions for both on the Wikipedia page for entropy, the first under "Microscopic definition of entropy" and both under "Entropy and information theory".

Shannon's information entropy is one of the most brilliant constructions in communication theory, and his channel capacity theorem is just as powerful. When communication engineers design their codes, they know they can never cross what is known as the Shannon limit :(
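For reference (this expression is not in the original post), the limit referred to here comes from Shannon's noisy-channel coding theorem: every channel has a capacity C, and reliable communication is possible at any rate below C but at no rate above it. For a band-limited channel with additive white Gaussian noise it takes the Shannon–Hartley form

C = B log_2(1 + S/N) bits per second,

where B is the bandwidth in hertz and S/N is the signal-to-noise power ratio.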

Shannon : Electrical Engineering :: Newton : Physics

Anyway, the point of my post is that there is a nice story about why it is called entropy, and about its connection with physical entropy:

Quote:

Originally Posted by Wikipedia (quoting Claude Shannon)

My greatest concern was what to call it. I thought of calling it 'information,' but the word was overly used, so I decided to call it 'uncertainty.' When I discussed it with John von Neumann, he had a better idea. Von Neumann told me, 'You should call it entropy, for two reasons. In the first place your uncertainty function has been used in statistical mechanics under that name, so it already has a name. In the second place, and more important, no one really knows what entropy really is, so in a debate you will always have the advantage.'