I know that the Tsallis entropy $S_q$ is called a nonextensive information measure in the sense that if $P$ and $Q$ are two independent probability distributions then $S_q(P\times Q)=S_q(P)+S_q(Q)+(1-q)S_q(P)S_q(Q)$. My question is: what is meant by nonextensive statistical physics, and what is its connection with Tsallis entropy maximisation?
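(For concreteness, the pseudo-additivity above can be checked numerically; here is a minimal Python sketch, where the particular distributions and the value of $q$ are arbitrary illustrative choices.)

```python
# Tsallis entropy S_q(P) = (1 - sum_i p_i^q) / (q - 1), defined for q != 1.
def tsallis(p, q):
    return (1 - sum(x**q for x in p)) / (q - 1)

# Two independent distributions and their product distribution P x Q.
P = [0.2, 0.3, 0.5]
Q = [0.6, 0.4]
q = 1.7
PxQ = [pi * qj for pi in P for qj in Q]

lhs = tsallis(PxQ, q)
rhs = tsallis(P, q) + tsallis(Q, q) + (1 - q) * tsallis(P, q) * tsallis(Q, q)
print(abs(lhs - rhs) < 1e-12)  # pseudo-additivity holds exactly (up to rounding)
```

The identity is exact for product distributions, since $\sum_{ij}(p_i q_j)^q = (\sum_i p_i^q)(\sum_j q_j^q)$.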

Hi Ashok, welcome to Physics Stack Exchange! On this site each individual question should be posted separately, so I removed your second item from this post. I encourage you to post it as a separate question.
–
David Z♦Sep 19 '11 at 5:28

1 Answer

Normally entropy is seen as an extensive thermodynamic coordinate, i.e. proportional to the amount of substance: taking a larger amount of gas in the "same state", the volume (and the entropy) scales up proportionally while the pressure stays the same. So for simple systems, the entropy of a system that is a combination of two systems is the sum of the individual entropies. Here it is less (for $q>1$). This means that not the whole direct product of the states of A and the states of B is accessible to the combined system.
Consider a state space of 3x3 pixels for a particle A alone: 9 states. Let B be an identical system: also 9 states. Now put A and B in the same 3x3 pixel space. If they are also allowed to sit in the same pixel, the combined system has 9x9 = 81 states. But suppose they interact such that they cannot sit in the same pixel: 9x9 - 9 = 72 states. So in the second example the entropy is not additive under combination of systems, while in the first it is...
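(A quick way to see this breakdown is to compute the Boltzmann entropy $S=\ln\Omega$ for the pixel example above; a Python sketch, with $k_B=1$ for simplicity:)

```python
from math import log, isclose

omega_A = 9                # 3x3 pixel grid, single particle A
omega_B = 9                # identical system B
S_A, S_B = log(omega_A), log(omega_B)

# Non-interacting case: every pair of single-particle states is allowed.
S_free = log(omega_A * omega_B)          # ln 81
print(isclose(S_free, S_A + S_B))        # entropy is additive

# Hard-core interaction: the 9 doubly-occupied pixels are forbidden.
S_int = log(omega_A * omega_B - 9)       # ln 72
print(S_int < S_A + S_B)                 # entropy is strictly sub-additive
```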

Thanks @propaganda. I understand roughly; I will take some more time to understand it better, as I am not a physics student. I accept your answer though.
–
AshokJan 27 '12 at 8:34


I know that the papers with the "nonextensive/Tsallis entropy" keyword say the same things that you do, but it makes no sense, @propaganda. There isn't any "new kind of statistical mechanics" here. The entropy is exactly additive in the particles if they're perfectly non-interacting (ideal gas), but in almost all systems they are interacting (real gases, liquids, or anything else), and this is normally taken into account when calculating the proper good old logarithm-based entropy. Various additivities break down due to the interactions; however, there is no evidence that the Shannon entropy itself breaks down...
–
Luboš MotlFeb 1 '12 at 6:36


Also, what I find misleading about the non-additive formula for $S(P\times Q)$ is that it pretends that $S(P\times Q\times R)$ is then automatically determined. Effectively, the formula introduces some "two-body-interaction-like" modifications of the entropy but pretends that there isn't any three-body or higher-body interaction. This is just meaningless. And the more complex interactions, which are really relevant for the very-many-body entropy, do matter. At most, this business may inspire one to find some new probability distributions, but there are infinitely many others, too.
–
Luboš MotlFeb 1 '12 at 6:40