Entropy in everyday life

Well, is disorder "needed" for order? I suppose in the sense that the concept of order needs to be defined relative to its opposite.

To my way of thinking, anyway, it would seem difficult to define order without also defining disorder, and perhaps vice versa. Both are necessary for balance/equilibrium. What are your thoughts on that?

I see no reason why a system can't be perfectly ordered, with zero entropy.

It might be simpler to think of a concrete example, such as a deck of cards.
When the cards come out of the pack they are in a specific order...

Hang on... that's not right...

What constitutes "order" in a deck of cards is arbitrary. We say it's ordered because we consider the numbers to be important. But that's subjective.
Objectively speaking, A23 is exactly as "ordered" as 3A2.
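To make that point concrete, here is a small sketch (a toy 3-card deck, so every arrangement can be listed) showing that under a fair shuffle each specific arrangement is a single, equally likely microstate. The probabilities give no reason to call one arrangement more "ordered" than another; that label is something we impose.

```python
import math
from itertools import permutations

# Toy 3-card deck so every arrangement can be enumerated.
deck = ("A", "2", "3")
arrangements = list(permutations(deck))  # 3! = 6 microstates

# Under a fair shuffle, each specific arrangement is equally likely.
p = 1 / math.factorial(len(deck))

for arr in arrangements:
    print("".join(arr), "probability:", p)

# "A23" and "3A2" are each one microstate out of six; nothing in the
# probabilities marks one of them as more "ordered" than the other.
```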

I think we have two options: we can look at entropy as a measure of disorder within a system, or as a measure of the energy dispersal in the system. It can be subjective or objective, depending on what you're assessing. My idea of disorder (using the messy house example) might be different from another person's.

The popcorn example would be objective.

Am I right/wrong?

A measure of energy dispersal is the most accurate way to think of entropy, to my understanding. There is a link between that and the degree of disorder of the system measured in other ways, e.g. its physical arrangement (solid vs. liquid, and so on).

So, if entropy is basically a measure of energy dispersal, how is it also a measure of the uncertainty of a system? Or do we assume that energy dispersal automatically leads to uncertainty? We've discussed that it's a measure of the amount of energy unavailable to do work; would that equate to the uncertainty (disorder)? That is what confuses me, I think.
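One way to connect the two pictures is Shannon's definition, under which entropy literally *is* the uncertainty: the average number of bits needed to pin down the exact state. A minimal sketch using the deck-of-cards example (an information-theoretic illustration, not a thermodynamic calculation):

```python
import math

def shannon_entropy_bits(probs):
    """Shannon entropy H = -sum(p * log2(p)): average uncertainty in bits."""
    return sum(-p * math.log2(p) for p in probs if p > 0)

# A deck whose exact order you already know: one state, probability 1.
print(shannon_entropy_bits([1.0]))  # 0.0 -- no uncertainty, zero entropy

# A well-shuffled deck: 52! equally likely orderings, so H = log2(52!).
h_shuffled = math.log2(math.factorial(52))
print(h_shuffled)  # roughly 226 bits of uncertainty
```

On this view, "energy dispersal" and "uncertainty" are two readings of the same count: the more ways energy (or card order) can be spread out, the more states are compatible with what you know, and the higher the entropy.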

exchemist said:
A measure of energy dispersal is the most accurate way to think of entropy, to my understanding. There is a link between that and the degree of disorder of the system measured in other ways, e.g. its physical arrangement (solid vs. liquid, and so on).

There is no uncertainty. That is a mathematical concept.

From the perspective of the three-dimensional physical Universe, all energy states have always existed, hence the "uncertainty" becomes certain.

There is a cycle of all energy states. All energy states are part of the cycle.

From the extreme heat energy of galaxies and quasars to the extreme cold of the Cosmic Web (which we can't see, because it gives off no electromagnetic wave energy, no light waves).

What constitutes "order" in a deck of cards is arbitrary. We say it's ordered because we consider the numbers to be important. But that's subjective.
Objectively speaking, A23 is exactly as "ordered" as 3A2.

I'm confused... Is order/entropy defined subjectively?

Click to expand...

Suppose you know the order of a deck of cards; then the cards are shuffled and placed face down. Both the known ordering and the shuffled one have an entropy.

What you know is subjective: you know the initial ordering, and you know that there is a new ordering about which you have no information.

Right. Which suggests entropy (at least this version of it) is arbitrary.

I could
1] take an ordered deck of cards,
2] declare it in perfect order (because today I happen to like sequential human numbers, and also happen to like spades more than hearts),
3] shuffle it
4] calculate the increase in entropy of the new state
and then
5] declare that the new state is in perfect order (because it spells out my birthday) and that entropy is magically zero again.

Sorry, "entropy is arbitrary" is not the proper phrase; the proper phrase would be: entropy is dependent on the property (or properties) of interest.
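That dependence on the property of interest can be sketched with Boltzmann-style counting, S = log2(W), where W is the number of exact orderings compatible with the description you chose. A toy example with two red and two black cards (the card names are just illustrative):

```python
import math
from collections import Counter
from itertools import permutations

# Toy deck: two red (R) and two black (B) cards.
deck = ("R1", "R2", "B1", "B2")
microstates = list(permutations(deck))  # 4! = 24 exact orderings

# Property of interest #1: the exact ordering.
# Each description matches exactly one microstate: W = 1, S = log2(1) = 0.
print("exact ordering: S =", math.log2(1), "bits")

# Property of interest #2: only the colour pattern, e.g. "RRBB".
# Now several microstates share each description, so W > 1 and S > 0.
patterns = Counter("".join(card[0] for card in m) for m in microstates)
for pattern, w in sorted(patterns.items()):
    print(pattern, "W =", w, "S =", math.log2(w), "bits")
```

Same deck, same shuffles; only the description changed, and the entropy changed with it.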

Well, many people are saying there can be only one version.

Yes, that's acceptable except the part that says "entropy is magically zero", because the entropy in 4] is 'between' the cards in 1] and in 3].

It's what happens at 3] that is actually the more interesting part. This is where, before you go to 4] and look at the order of the deck, you expect to see some 'randomness'. You don't expect the cards to be in the same order after shuffling; you don't expect to see a repeating pattern either.

In information theory, a message with unexpected information has more "content" than one with expected information. It's a bit back-to-front.
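That "back-to-front" feeling has a precise name: surprisal, I = -log2(p). The less probable the message, the more information it carries. A quick sketch (the probabilities here are made-up examples):

```python
import math

def surprisal_bits(p):
    """Information content, in bits, of an outcome with probability p."""
    return -math.log2(p)

# A message you almost expected carries very little information...
print(surprisal_bits(0.9))    # ~0.15 bits

# ...while a rare, unexpected one carries a lot.
print(surprisal_bits(0.001))  # ~10 bits

# A fair coin flip sits in between: exactly 1 bit.
print(surprisal_bits(0.5))
```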

Except that randomness is arbitrary too.

For all we know, the 52 cards now happen to count out the first 52 digits of pi ... "in perfect order".