One way to think about temperature and a small system is to allow the small system to exchange energy with a much larger system (i.e. a reservoir of energy). The temperature of the reservoir is well defined. The reservoir is so large that, as the small system draws energy from the reservoir, the temperature of the reservoir remains unchanged.

Thus, the small system fluctuates from one quantum state to another. The probability for the small system to be in any one of its quantum states, of course, follows the Boltzmann distribution.

This picture is what we mean by saying "a system is held at a fixed temperature". In this sense, we can hold an arbitrarily small system at a fixed temperature.
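The picture above can be sketched numerically: give a small system a few energy levels and let a reservoir fix kT, and the state probabilities follow from Boltzmann weights. A minimal sketch (the two-level energies and function name are invented for illustration):

```python
import math

def boltzmann_probs(levels, kT):
    """Probability of each quantum state of a small system
    held at temperature kT (in energy units) by a reservoir."""
    weights = [math.exp(-E / kT) for E in levels]
    Z = sum(weights)  # partition function
    return [w / Z for w in weights]

# Hypothetical two-level system: energies 0 and 1 (arbitrary units)
p = boltzmann_probs([0.0, 1.0], kT=1.0)
# The lower level is more probable; the probabilities sum to 1.
```

However small the system, the distribution is well defined as long as the reservoir temperature is.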

I have collected my notes on temperature and the Boltzmann distribution, along with several other topics, in Statistical Mechanics. In these notes, I have deliberately treated the subject as an experimental science: all quantities must be measurable, including the number of quantum states of an isolated system. I'm particularly happy with the section on temperature. Several students have kindly mentioned to me that they appreciate this attitude.

Henry, Zhigang's explanation of temperature is spot on. However, if you want another "intuitive" notion of the temperature of, say, a nanotube (which you quoted), then you may think of temperature as a measure of the velocity, or kinetic energy, of the atoms. In fact, this notion is used to compute the temperature of a molecular dynamics ensemble. At the level of each atom, all you can calculate is its kinetic energy; once you have an ensemble, statistical mechanics provides a route to compute the so-called temperature.

When N is small, say N = 100, which is possible with today's technology, the classical statistical mechanics adopted in molecular dynamics, which calculates the temperature from the velocity of each atom, does not make sense.

Sure, for a very small number of atoms, stat mech may not be the solution, but I am speaking of physical intuition. While concepts such as "hotness" are somewhat useless at that scale, relating temperature to the degree of motion of an atom provides an intuitive understanding of what temperature means there.

Yes, you are right. Therefore, the temperature computed from the sum of atomic kinetic energies, as in many classical molecular dynamics calculations, cannot be used quantitatively, but only qualitatively, for physical intuition.

But it should make sense: as the molecules/atoms vibrate or move around, they do so because they have velocity. And when we talk about molecular dynamics, won't we say KE = (1/2)mv², where KE = (3/2)NkT (for a 3D system, k being Boltzmann's constant)?
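That relation is indeed how an instantaneous temperature is extracted in classical MD: sum the kinetic energies and invert KE = (3/2)NkT. A minimal sketch (function name and toy numbers are mine):

```python
def kinetic_temperature(masses, velocities, k=1.380649e-23):
    """Temperature of N classical point particles from the relation
    KE_total = (3/2) N k T, i.e. T = 2 KE_total / (3 N k).
    masses in kg, velocities as (vx, vy, vz) tuples in m/s."""
    ke = sum(0.5 * m * (vx**2 + vy**2 + vz**2)
             for m, (vx, vy, vz) in zip(masses, velocities))
    return 2.0 * ke / (3.0 * len(masses) * k)

# Toy numbers (k set to 1 so the arithmetic is visible):
# one particle of mass 1 with speed 1 gives KE = 1/2 and T = 1/3.
T = kinetic_temperature([1.0], [(1.0, 0.0, 0.0)], k=1.0)
```

As discussed above, for small N this number fluctuates strongly and is only a qualitative guide.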

In teaching a course last spring, I updated my notes on temperature. The central aim of the notes is to reconcile the empirical notion of temperature and its statistical meaning. I hope the notes help.

I read your lecture notes on “Statistical Mechanics” posted at http://imechanica.org/node/288. They are very interesting and useful to me. I can understand the effort in reconciling the empirical notions and their statistical meaning.

However, I feel that your notes are not complete, or in other words not self-contained, in giving statistical meaning. The “quantum state” is the fundamental concept in your notes on isolated systems, temperature, probability, entropy, free energy, etc. Yet the “quantum state” itself is undefined.

The fundamental postulate is also intriguing. It says that a system isolated for a long time is equally probable to be in any of its quantum states. That our world is described by quantum states may be regarded as a law itself.

These are excellent questions, but generally useful answers do not exist. They belong to nonequilibrium phenomena.

For one class of nonequilibrium phenomena, the system is not in equilibrium as a whole, but the system can be divided into small elements, each element being in a state of equilibrium. One can follow the thermodynamics of irreversible processes to analyze how the system as a whole approaches a state of equilibrium.

A classical example is conduction of heat in a solid. While the entire body is not in a state of equilibrium, each small element of the body is in its own state of equilibrium. It is not meaningful to talk about the temperature of the entire body, but one can talk about the temperature of each element of the body, and apply the equilibrium thermodynamics to every element. One then describes the conduction of heat by using a kinetic law (i.e., Fourier's law).

As another example, in my course on advanced elasticity, I describe nonequilibrium phenomena in gels.

We may have entered the muddy water of the philosophy of science, on which I'm not qualified to comment, but I agree with Zhenyu: if a notion has no measurable consequence, you need not bother to define it. If a notion does have experimental consequences, it might be possible to define a quantity through those consequences.

The notion of hotness can be quantified through the experimental consequence of thermal contact.

In the case of two macroscopic systems, we can talk about the temperature of either system.

In the case of a small system in contact with a heat reservoir, however, we do have a well defined way to characterize the system: the Boltzmann distribution.

In the derivation of the Boltzmann distribution for a small system, the system and the reservoir are in thermal contact, exchanging energy. All other modes of interactions between the system and the reservoir are blocked: no exchange of molecules, of volume, or anything else.

This blocking requirement works fine for macroscopic systems.

However, for a nanoscale system the energy changes are associated with volume changes, which cannot be ignored as they can in macroscopic systems.

Therefore, the Boltzmann distribution needs to be modified to characterize nanoscale systems.

Dear Henry: I wish this were my definition. It is the definition in textbooks, for example, in Kittel and Kroemer, and in Landau and Lifshitz. A while back I wrote an enthusiastic short review of K-K. It was that book that made me "see" the subject.

I'm not entirely sure why you insert "average" in the above definition.

I'm imagining a system that can exchange energy with the rest of the world, while all other modes of interaction are blocked. Thus, the system is not an isolated system. However, when the energy of the system is held at a constant value U, the system becomes an isolated system, and has a certain number of quantum states Ω. As the energy changes, the number of quantum states changes as a function of the energy, Ω(U). This function characterizes a family of isolated systems.

For example, an electron and a proton form a hydrogen atom. The two-particle system can absorb energy by receiving photons. Thus, the system may be viewed as a family of isolated systems. According to quantum mechanics, the function Ω(U) takes the following values:

Ω(-13.6 eV) = 2, Ω(-3.39 eV) = 8, Ω(-1.51 eV) = 18, ...

For a large system, the discreteness in energy is small compared to the total energy, so that U may be treated as a continuous variable, and Ω(U) may be treated as a continuous function. In this case, an analysis of thermal contact leads to the definition of temperature: 1/T = d(ln Ω)/dU, everything else being fixed.
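For the hydrogen-atom values quoted above, the energies are discrete, so a derivative is not strictly defined; still, a finite-difference sketch of 1/T = d(ln Ω)/dU shows how the definition is used. This is only illustrative, with the Ω(U) values taken from the list above:

```python
import math

# Hydrogen-atom values quoted above: energy (eV) -> number of quantum states
omega = {-13.6: 2, -3.39: 8, -1.51: 18}

def inv_temperature(u1, u2):
    """Finite-difference version of 1/T = d(ln Omega)/dU,
    in units of 1/eV (temperature measured in energy units)."""
    return (math.log(omega[u2]) - math.log(omega[u1])) / (u2 - u1)

# 1/T estimated between the first two levels: ln(8/2) / 10.21 eV
beta = inv_temperature(-13.6, -3.39)
```

For a macroscopic system the level spacing is tiny, the difference quotient converges, and T becomes a well-defined continuous function of U.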

Thank you, Henry, for initiating this thread of discussion. As you pointed out, the subject is really hard. I'm not sure if I have got everything correct. Please do point out if you find that I'm doing things wrong.

Consider two isolated systems A1 and A2. For system A1, the number of quantum states is N1; for system A2, it is N2.

To simplify the quantum mechanics calculation, we assume A1 is a cubical box with side length a containing a single electron, and likewise A2 is a cubical box with side length a containing a single electron.

Now bring A1 and A2 together by putting the two boxes side by side. The combined system A = A1 + A2 occupies a space of a × a × 2a. Solving the Schrödinger equation for two electrons confined in box A (not easy!) gives the number of quantum states N.

Obviously, N does not equal N1 × N2.

The fundamental assumption in statistical mechanics that N = N1 × N2 is violated!

You put your finger on an important point. In an introductory discussion of thermal physics, we analyze systems in weak interaction, so that N = N1 × N2 holds. As I discussed in my notes on temperature, physically this means that all the interfacial states are negligible.

In your example, you have two electrons, so that the "interfacial states" are all you get, not negligible. In such a case, we have no reason to assume N = N1 × N2.

But this is not the end of the game. Any time you have an interfacial phenomenon, you will violate N = N1 × N2, and you will introduce excess quantities.
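The weak-interaction counting rule can be checked in a few lines: when state counts multiply, entropies (logarithms of the counts, in units of k) add. The state counts below are made up:

```python
import math

# For two weakly interacting systems, each quantum state of A1 can pair
# with each quantum state of A2, so the counts multiply: N = N1 * N2.
N1, N2 = 100, 250                     # hypothetical state counts
N = N1 * N2

S1, S2 = math.log(N1), math.log(N2)   # entropies in units of k
S = math.log(N)
# Multiplicative state counts make entropy additive: S == S1 + S2.
```

When interfacial states matter, as in the two-electron example, N deviates from N1 × N2 and S picks up an excess term.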

I'll keep it short (lest I be off the mark). The additivity of entropy, S = S1 + S2, assumes extensivity. For systems that exhibit long-range interactions (a gravitational field is an example), the notion of entropy has been generalized to describe such non-extensive systems. Such entropic measures (those of Rényi, Tsallis, and others are well known) lead to non-extensivity and the presence of an additional term: S = S1 + S2 + (1 - q) S1 S2.

Tsallis statistics, as it has come to be known, is purported to be a generalization of Gibbs-Boltzmann statistics. Tsallis's entropic measure is non-extensive when q is not equal to 1 (q = 1 reduces to Shannon's measure). Tsallis entropy has received much attention (for and against) in the physics literature and also in areas outside it (networks, turbulence, finance, etc.), and it is used to describe many complex systems (edge of chaos) that exhibit power-law distributions. An article that appeared in Science summarizes the gist of it all; Tsallis's original article appeared in 1988. So, instead of the Shannon entropy measure, use of Tsallis's non-extensive entropic measure in Jaynes's maximum-entropy formalism has been pursued.
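The pseudo-additivity rule mentioned above can be verified directly for two independent systems, using the standard Tsallis form S_q = (1 - Σ p_i^q)/(q - 1). The distributions below are made up:

```python
def tsallis_entropy(p, q):
    """Tsallis entropy S_q = (1 - sum p_i^q) / (q - 1), in units of k."""
    return (1.0 - sum(pi**q for pi in p)) / (q - 1.0)

# Pseudo-additivity for independent systems A and B:
#   S_q(A+B) = S_q(A) + S_q(B) + (1 - q) * S_q(A) * S_q(B)
q = 2.0
pA = [0.5, 0.5]
pB = [0.25, 0.75]
pAB = [a * b for a in pA for b in pB]  # joint distribution, A, B independent

lhs = tsallis_entropy(pAB, q)
rhs = (tsallis_entropy(pA, q) + tsallis_entropy(pB, q)
       + (1 - q) * tsallis_entropy(pA, q) * tsallis_entropy(pB, q))
# lhs and rhs agree; for q = 1 the extra term vanishes and S is additive.
```

The extra (1 - q) term is exactly the non-extensive correction to S = S1 + S2 discussed above.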

As discussed above, you can hold an arbitrarily small system at a fixed temperature by letting the small system exchange energy with a heat reservoir. In doing experiments with a nanotube, perhaps a holder will allow the nanotube to exchange energy with the environment.

I'm not familiar with how calculations are carried out, but at least conceptually and in experiments, nanotubes and such present no fundamental challenge to statistical thermodynamics.

Let two nanotubes, one at temperature T and the other at the same temperature T, be brought into thermal contact by putting them together. After reaching thermal equilibrium, the final temperature of the system is not T.

Therefore nanotubes, and other nanoscale systems, present a serious challenge to statistical thermodynamics.

Reading through this line of discussion, I will have to agree with Zhigang (and some others), in the sense that there is no need even to be bothered by how the temperature should be defined for truly isolated systems, be they classical/macroscopic or quantum (such as those containing only a very few elemental entities). Such systems, treated typically using microcanonical ensembles, are fully described by physical quantities such as the total (and conserved) energy, the volume, etc., with no specification of the temperature.

T claims its physical identity when a system is exchanging energy with its outside world, but by then the system is no longer isolated.

Thermodynamics at the nanoscale is an indispensable issue, in the face of the rapid miniaturization of technical devices such as computer chips, the trend to utilize quantum mechanical systems, and the response of biological cells in nano-environments.

Currently, I think it is not clear to what extent the thermodynamic concepts can be extended to the nanoscale and which limits should be taken into account.

Although this interesting discussion, which I have tried to follow, is not in my particular field, the current point has reminded me of a paper published by the American Physical Society in 2002.

This is the paper, just in case you have not seen it:

Wang et al., "Experimental Demonstration of Violations of the Second Law of Thermodynamics for Small Systems and Short Time Scales", Physical Review Letters 89, 050601 (2002).

Here is another one:

Carberry et al. "Fluctuations and Irreversibility: An Experimental Demonstration of a Second-Law-Like Theorem Using a Colloidal Particle Held in an Optical Trap", Physical Review Letters 92, 140601 (2004).

Dear Alper, thanks for pointing out the interesting paper of Wang et al. I copied the first paragraph of their paper and pasted it here. I also provided a useful link for "Loschmidt's paradox".

Inventors and engineers endeavor to scale down machines and engines to nanometer sizes for a wide range of technological purposes. However, there is a fundamental limitation to miniaturization as small engines are not simple rescaled versions of their larger counterparts. If the work performed during the duty cycle of any machine is comparable to thermal energy per degree of freedom, then one can expect that the machine will operate in ‘‘reverse’’ over short time scales. That is, heat energy from the surroundings will be converted into useful work allowing the engine to run backwards. For larger engines, we would describe this as a violation of the second law of thermodynamics, as entropy is consumed rather than generated. This has received little attention in the nanotechnology literature, as there was no quantitative description of the probability of entropy consumption in such small engines. The only thermodynamic statement available was the second law itself, stating that, for large systems and over long times, the entropy production rate is necessarily positive. Even the foundations of statistical mechanics were unsettled as thermodynamicists questioned how the second law of thermodynamics could be reconciled with reversible microscopic equations of motion. Loschmidt’s paradox (http://en.wikipedia.org/wiki/Reversibility_paradox) states that in a time reversible system, for every phase-space trajectory there exists a time-reversed antitrajectory. As the entropy production of a trajectory and its conjugate antitrajectory are of identical magnitude but opposite sign, then, so the argument goes, one cannot prove that entropy production is positive.

Dear Alper, thanks for pointing out the interesting paper of Carberry et al. I copied the abstract of their paper and pasted it here.

The puzzle of how time-reversible microscopic equations of mechanics lead to the time-irreversible macroscopic equations of thermodynamics has been a paradox since the days of Boltzmann. Boltzmann simply sidestepped this enigma by stating ‘‘as soon as one looks at bodies of such small dimension that they contain only very few molecules, the validity of this theorem [the second law of thermodynamics and its description of irreversibility] must cease.’’ Today we can state that the transient fluctuation theorem (TFT) of Evans and Searles is a generalized, second-law-like theorem that bridges the microscopic and macroscopic domains and links the time-reversible and irreversible descriptions.

Prof. Zhigang Suo points out that temperature and energy have the same dimensions. This is an insightful statement, but requires certain qualifications to be made.

1. The equivalence between the two concepts is obtained after assuming many prior assumptions. It might be interesting to trace the route through which the equivalence is obtained.

First, a primarily kinetic description of the system is assumed. The system is taken to primarily consist of moving point-masses; let's call them particles. The motion of the particle is taken to remain restricted to a certain finite region of space. The total number of particles in the region may be taken to stay unchanged over any length of time. The collisions with the bounding surface usually are taken to be "perfectly" elastic.

The reason the particle is taken as a point-mass is that by doing so, the internal structure of the particle can be treated as of no consequence to the statements made within the framework of the theory. The same assumption also does away with any theoretical complications arising due to the spatial extents of the mass, e.g., complications due to moments, directions in collisions, etc.

Note that we talk about the mass and energy *of* the system. In other words, we do make meaningful statements about the system in and by itself, i.e., even if there is nothing outside of it. Thus, something outside of the system is not necessary to impart meaning to a given system itself or to any of its properties. In particular, no external reservoir is necessary to define temperature of the system.

The system here is identified as the sum total of the masses and their motions within a particular spatial region, together with the boundaries of the region and the special properties of the boundary such as the perfectly elastic collisions they give rise to.

Provided conditions such as above are met, even the entire universe can be taken, in an abstract sense, to be an isolated system. In any case, it's worth noting that every isolated system is actually an abstraction.

2. If the kinetic theory of matter (e.g. the atomic hypothesis of Dalton) is applied to a thermodynamically macroscopic system, then, completeness requires that every macroscopic property have some description in the kinetic-theoretic terms as well.

The kinetic-theoretical concept equivalent to temperature in thermodynamics turns out to be the *kinetic* energy of particles (e.g. atoms).

Note, only the kinetic energy is involved here, potential energy in any of its forms is not. The reason is that the kinetic theoretical abstraction primarily omits any form of interaction between the constituent particles making up the system.

Now, the above manner is not how actual atoms or subatomic particles behave. The small broken pieces of matter such as atoms (and, electron, protons, neutrons, even quarks...) do interact with the other small pieces. But this fact is against the spirit of the kinetic theory.

The interaction between particles is the fundamental reason why real gases deviate from the ideal gas law. It also is one fundamental reason why the temperature<->energy equivalence has only a limited, i.e. qualified, validity.

3. Note, temperature is primarily a property of the system--not of its constituents.

However, because the theory assumes no interactions among the constituents, the simple additivity rule can be adopted. Hence, it is possible to linearly subdivide a given system into a number of constituent *systems.*

The constituent systems may conveniently be built so as to isolate a single particle, e.g. an atom.

It is then that one may ascribe a temperature to a single constituent, e.g., an atom.

4. Note, by the premises of kinetic theory, it really does not matter whether the system description carries one particle or N particles.

Even a subsystem may carry any number of particles. If the decomposition is not in integral terms, it's even permissible to think of a subsystem as carrying a fractional number of particles.

The kinetic abstraction itself is silent on these issues and its structure is equally applicable to all such possibilities.

5. Let's apply the above principles to a concrete case. Consider a moving car. It is possible to think of an abstract system consisting of this car and nothing else in the world.

If this abstract system is taken to exchange no matter or energy with anything else, then it can be taken as an isolated system. (Note, isolated systems are necessarily abstractions.)

Inasmuch as the car may be represented as a point-mass in an abstract system, it is also permissible to ascribe a "temperature" to that system.

Note, the actual hotness of the car (say due to its running engine), or the possibility of measuring it using a thermometer, has been summarily ignored right in the act of modeling it as a point-mass. Hence, its real temperature (say 27 degrees Celsius) does not at all enter into the discussion. The temperature of the system we mean here is the one which the car possesses by virtue of its motion--i.e. out of its kinetic energy.

Finally, note that the issue is not really microscopic vs. macroscopic, or QM vs. classical, etc. The issue is: can you meaningfully speak of something in terms of the kinetic theory or not, that's all.

6. If a principle that correlates the kinetic energy of a nanotube (say due to vibrations of the ionic cores and waves in the lattice gas) with something external can be discovered, then, even an experimental measurement of the temperature of the nanotube will be possible.

So, the question "Can you measure the temperature of a nanotube?" is not one ruled out by basic physical principles. It is to be answered through developments in experimental science.

7. To conclude:

- The description of temperature in terms of energy is not on the same footing as its description as a sensory quality.

- The temperature<->energy equivalence is not an unqualified one. Indeed, this equivalence denotes a high-level abstraction.

- Temperature is a property of the system, primarily, not of the point-masses. But the premise of composition allows us to ascribe temperature to individual entities--by assuming smaller systems encompassing the individual entities.

- Accordingly, even a single atom, electron, quark, etc. can be taken to possess a temperature. The procedure is meaningful in the above context.

- Yet, since such a description involves high-level abstractions, a certain amount of restraint must be exercised when using the term "temperature," especially when scientists make statements to lay people.

For example, the statement that a temperature of a billion K was reached in a laboratory can be careless if there is no explanatory context provided. In today's technology, such a temperature is reached only for a single constituent particle. But lay people take it to mean as if a whole assembly palpable at the gross scale was heated to that temperature. To not take the care to spell out the explanatory context is to mislead the public.

- It is alright to think of the temperature of nanoparticles even if the particles are made of only a few hundred atoms.

Dear Ajit: Thank you very much for your comments on my notes on temperature. I wish to reiterate a few points.

In the notes, I adopted the following definition of temperature T:

1/T = change in the logarithm of the number of quantum states divided by the change in the energy of the system, everything else being fixed.

This temperature has the same dimension as energy.

To preserve the notion of hotness, any monotonically increasing function of T can equally well be called the temperature.

In particular, the SI asks you to rescale the above T by a linear factor, such that the triple point of pure water is by definition 273.16 K (exact). The conversion factor k (known as the Boltzmann constant) can then be determined experimentally. See the NIST website for details of the definition of the kelvin as a unit of temperature.

Thus, k is a factor to convert two units of hotness, just like 2.54 is the factor to convert inch to centimeter. (If I were Boltzmann, I'd be amused to be reduced, without my knowledge, to a conversion factor!)
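Treating k as a pure unit-conversion factor, as described above, looks like this in code (the value of k is the CODATA figure in J/K; function names are mine):

```python
K_B = 1.380649e-23  # Boltzmann constant, J/K

def kelvin_to_joules(T_kelvin):
    """Express a temperature in energy units (joules)."""
    return K_B * T_kelvin

def joules_to_kelvin(T_joules):
    """Express a temperature in kelvin."""
    return T_joules / K_B

# Room temperature, ~300 K, in energy units: about 4.14e-21 J.
room = kelvin_to_joules(300.0)
```

The conversion is exactly analogous to multiplying by 2.54 to go from inches to centimeters; no physics changes, only the unit of hotness.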

Just because two quantities have the same unit does not mean they are equivalent. I don't think we should say that temperature and energy are equivalent. The above definition of T makes their nonequivalence quite clear.

As outlined in my notes on temperature, the above definition does lead to experimental procedures to measure temperature, as well as to measure the number of quantum states.

It's nice that you reproduced the pertinent part of your notes for convenience of the reader's reference.

Frankly, though, I cannot understand why you wouldn't say the two are equivalent once the appropriate context is specified. (Not necessarily the one I mentioned, but let's assume that a proper context has been given....)

IMHO, any time the physical dimensions of two quantities are identical, and the two concepts also refer to the same or similar phenomena, then the cognitive policy of theoretical integration requires that one also try and see the way in which the two concepts may be related, even equivalent.

In saying "equivalent," one still keeps the two concepts separate. The concepts still do not become "interchangeable" or "identical." Yet, there are contexts where the theoretically similar concepts are much more than being merely "analogous."

The parallels between the kinetic theory and thermodynamics (or particles and fields, speaking more broadly) are one such context where, IMHO, it's perfectly OK to say "equivalent". On the other hand, the out-of-plane displacements of a soap membrane and the potential of an electrostatic field could never become "equivalent"--they must remain "analogous."

The basic test, I would suppose, is that in drawing theoretical parallels between two theories describing the same sphere of physical phenomena, if the two concepts go on to produce exactly the same physical effects, then, within the scope of the comparison, the two concepts should be regarded as equivalent--on the grounds that their "valence" (or capacity to produce effects) was equal.

But I suspect by now all this hair-splitting must have become positively boring, so let me stop!

It's just that your observation about the identity of the dimensions was really fine and got me thinking... It's a very fundamental point you made in your notes, and it's worth investing some thinking about... It's something like Planck's constant (having units of the physical "action") or the physical meaning of the constancy of speed of light (which Feynman asks in his Lectures to ignore through scaling, somewhat similar to the way you note Boltzmann's constant as a constant of scaling).

Before closing, let me add a word: I am new to iMechanica, having been here for only about a month or so. But it has been my pleasure to be a member here. Sometimes one does write at length. (In my case, most of the time!) Then, there are those other times when one feels like taking a shot in the dark, to be not very precise. I can understand that... In any case, the informality of a blog site precisely is its strength! I am happy to have this chance to write more informally than in a paper, but more formally (and for a wider audience) than in a personal talk...

So, my special thanks go to you and also to others (say the moderators) for taking this initiative...

I too initially found the example amusing but for a somewhat different reason--out of the fact that despite having clearly spelt out interchangeability as different from equivalence in my *immediately* above post, you still ask the reader of your site to consider an example of interchangeability as if it were an example of equivalence.

So, initially, I was amused.

But soon later on, I took a more serious view of it. This is the second time within a week or so that a superfluous post has been added after my entry, mandating nothing but reiterations. That too from people one would ordinarily take to be responsible, understanding and mature. I ignored the occurrence the first time, but now I must speak out.

It is this necessity to reiterate which leads me to suspect if people here consider having the last post as a prestige issue of sorts or what!

If so, I wouldn't mind being as speedy in leaving this forum as I was in joining it.

So what if it's Harvard, Stanford, or Cambridge or any other university out there! Truth, in my mind, is above all those institutions, let alone a very small community that is iMechanica. When truth becomes so easily dispensable as the click of a button and an addition of an entry just to have it as a last entry (on the lines of having the last laugh of sorts), I would have nothing to do with that kind of a forum--even if the sponsoring institution made a rather large song and dance about "VERITAS" in the logo. I could do better not to post at such a site. After all, I do have a personal Web site of my own, and Yahoo!, msnbc.com and other newsmakers *have* been following my case mindlessly for years too! So, if the whole matter is to keep praising publicly some people and to indirectly drive some others to a position that, to a casual reader, would show a lack of temperament on those others' part, then I know this is routine in Internet, but I would have none of it.

Once again, note, mindlessly adding entries without recognizing the contents of the previous posts is happening for the second time in just one week for posts made by me. Once again, ask yourself if you are *not* giving an example of interchangeability in the name of equivalence.

Do let me know, publicly, if I should discontinue to be a member here.

That said, in the meanwhile, coming back once more, and from my side, for the last time, to the same physics as I explained right above:

If you really are consistent in holding the meanings of the terms, there is indeed no contradiction in the statement you offered at all. So, there is no reason for amusement. Consider this rephrasing of your statement (on the lines of equivalence, but not interchangeability, which simply cannot be done, as I already said):

"When two systems are in thermal contact, the temperature of the system which has the higher level of kinetic energy goes down whereas the temperature for the system which has a lower level of kinetic energy goes up."

Tell me, what is wrong with this? And was it so hard to think up this example, especially after having gone through my fairly explanatory post?

I have benefited from this thread of discussion. If anything that I posted has ever offended you, please accept my sincere apology. It is wholly unintentional. This message is not intended as the last word on anything. I simply feel that I should respond to your requests.

In online discussions, our intentions, unaided by facial expressions but quickened by a push of a button, may not be clear to each other. Such is the limitation of this mode of communication. One may recall the same situation in early days of emails. By now most of us have accepted that emails are less formal than snail mails.

We may think of iMechanica as an infinite hallway, where we can "meet" colleagues and discuss anything. People can express their thoughts freely, superfluous or profound or anything in between, just as in a conversation in a real hallway.

I have enjoyed your posts, including those in this thread. If you judge that the benefit of such a communication to you outweighs posts that you judge to be superfluous, then I would sincerely urge you to stay.

Now let us return to the topic of our discussion: temperature and energy. I think I see where you come from. Correct me if I have misinterpreted what you are trying to say.

For a classical gas, the temperature is proportional to average kinetic energy of particles. Indeed, for a classical gas, the temperature and the average kinetic energy are equivalent and interchangeable.

I have added the word average. I think that you would agree the word average is significant here. It makes the temperature an intensive quantity. The total kinetic energy is an extensive quantity. Thus, in thermal contact, the sum of the energies in the two systems is constant, but the sum of the temperatures of the two systems need not be constant. In equilibrium, the energies of the two systems need not be equal, but the temperatures of the two systems are equal. Even in this special case of classical gas, the temperature is not equivalent or interchangeable with total energy.

The statement you made between quotation marks will be correct if you replace the word "system" by the phrase "classical gas", and replace the phrase "level of kinetic energy" by "average kinetic energy".
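The intensive/extensive distinction above can be checked with a toy calculation for classical monatomic gases: in thermal contact the total kinetic energy (3/2)·n·k·T is conserved, so the common final temperature is the particle-weighted average of the initial temperatures. The numbers are invented:

```python
def equilibrium_temperature(n1, T1, n2, T2):
    """Two classical monatomic ideal gases in thermal contact.
    Conserving total kinetic energy (3/2) n k T and equalizing T gives
    T_final = (n1*T1 + n2*T2) / (n1 + n2)."""
    return (n1 * T1 + n2 * T2) / (n1 + n2)

# Energy is extensive (it sums); temperature is intensive (it averages).
T_final = equilibrium_temperature(100, 300.0, 300, 400.0)  # 375.0 K
```

Note that the final temperature lies between the two initial values and is not their sum, exactly the property that distinguishes temperature from total energy.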

A generally applicable definition of temperature is (I apologize to repeat this again):

1/T = change in the logarithm of the number of quantum states divided by the change in the energy of the system, everything else being fixed.

This definition makes it clear that temperature involves two distinct quantities: the energy of a system, and the number of quantum states of the system. Both quantities depend on the size and the kind of the system (a piece of cheese or a glass of wine). Yet when two systems reach thermal equilibrium, the temperatures of the two systems are equal. Thus, temperature has this property that energy does not have.
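The definition 1/T = Δ(ln Ω)/ΔE can be evaluated directly for a toy model. The sketch below uses an Einstein solid (N oscillators sharing q energy quanta), an illustrative choice not taken from the discussion, with k_B = 1 and the energy quantum set to 1.

```python
from math import comb, log

def omega(n_osc, q):
    """Number of microstates of an Einstein solid: n_osc oscillators, q quanta."""
    return comb(q + n_osc - 1, q)

def inverse_temperature(n_osc, q):
    """Central difference of ln(omega) with respect to energy,
    in units where k_B = 1 and one quantum = 1."""
    return (log(omega(n_osc, q + 1)) - log(omega(n_osc, q - 1))) / 2.0

# For a large solid this numerical derivative should approach the
# analytic result 1/T = ln(1 + N/q).
N, q = 500, 1000
print(inverse_temperature(N, q), log(1 + N / q))
```

The same function works for N = 3 oscillators, which is the sense in which the definition applies to arbitrarily small systems; only the fluctuations about the result grow.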

Your apology was not necessary for a matter like this, and an apology wouldn't have been sufficient by itself either! But forget it! The way I look at it, when there is that sincerity to your words, *that* is enough by itself.

For the time being, I have decided to stay with the forum. Yet, I simply cannot be light-hearted about it; not yet. One man doesn't make a forum.

-----

There is my experience of Americans, who often are mere power lusters but packaged in smooth or civilized appearances. Then, there are those arrogant and suppressive Harvard graduates and the companies they run that one does come across. I can cite Mr. Manu Parpia, the man, and Geometric Software, and 3D PLM Software, India, the companies he runs as a CEO. Something similar can be said for Stanford graduates too. I can directly cite Sandip Johri, V/P of Hewlett-Packard, a batch-mate of mine at COEP, and a personal friend *once upon a time*. Other examples where my dignity or my job applications or similar matters have suffered injustice or ill-treatment at the hands of Americans can be cited rather easily too. For example, at the hands of the IIT Bombay alumni in the San Francisco Bay Area (such as Mr. Jaggi Ayyangar), IITians who now are American citizens, people whose power-lust sometimes exceeds that of the American-borns themselves. (One route of harassment used by all such Americans--born in the USA or otherwise--has been by en-masse buying the staff and advertisement space in media, esp. Times of India, Pune edition. The reader may ask Dileep Padagonkar, the then editor, for more details. I am sure he knows me.)

If the reader thinks I have gone too far, he is requested to know more about my competence and skills by visiting my Web site: http://www.JadhavResearch.info, and then note that over the last 6.5 years, I have earned any money at all for precisely 5 months and a few days--not even for a half year (full six months). 6 years is a long time. People finish PhDs and buy houses in that time. Further, I have summarily been denied all my chances in CAE, despite all that hue and cry you undoubtedly hear about India's talent et cetera. (If in doubt, ask the Governor of the State of Virginia in the USA.) And this condition of mine is despite (or may be, because of) Harvard graduates. If this is not suppression, what else do you think would be?

All such factors do play a role in deciding how I should proceed with respect to anything being run by Americans or in America, the more prestigious, the more cautious I must be. (And no, all this writing about America and Americans is *not* a blanket moral certificate of better behavior by other nations or people.)

The above is not an explanation; it is as brief but direct an indication as I can at all give. (This was about third revision to make it further brief, and I cannot make the matter more condensed without losing the personal flavor about it--and then, whose decision and life is it anyways that I am talking about, here?)

-----

Coming to the physics proper. Let me just note some thoughts without explaining them. Apparently, these points will need separate articles.

-- Any extensive property can be turned into an intensive one; simply divide by mass.

-- Your "definition" of temperature has the following roots. Start with the definition of entroy dS = deltaQ/T and Boltzmann's equation: dS = -k ln (w). (delta Q, for inexact differential.) Substitute, rearrange, and scale to remove Boltzmann's constant, k. Then, bring in the interpretation of w as in accordance with a particles approach, here and by your post, quantum states. Note, if the resulting equation is to be taken as giving us a *definition* of temperature then one already is very much within the kinetic-theoretical abstraction.

-- I note that you wish to differentiate classical gas and collection of quanta. I think this insistence is understandable but not really necessary. However, it will take a few more papers just to get going. I will publish them after my PhD thesis is done. Decades of QM developments have got it wrong, and I don't at all blame you personally (or anyone else) for honestly believing so. If anything, I appreciate your concern and the clarity of your expression.

-- I will also explain later more on equivalence in a separate post.

-- I plan not to post for a while. I plan to take my PhD thesis to a better stage of completion and then come back. May be, one week later or so.

Before closing, one more point, a clarification. This point is no reflection on Zhigang or admin or any other member at all but just a noting. Since some part of this post of mine indeed is dramatic, I want to note that if in future there arise some issues (say legal issues) for the iMechanica management out of this post, I will be only happy to leave the forum--but I will not on my own retract it. (My attitude is, and would be: "It moves!")

The quotation by Thomas Jefferson was not on “American approach on getting things done”, but on his views of mathematics (he was enthusiastic about mathematics).

A complete quotation should be:

"…… Having to conduct my grandson through his course of mathematics, I have resumed that study with great avidity. It was ever my favorite one. We have no theories there, no uncertainties remain on the mind; all is demonstration and satisfaction."

Laser excitation of a crystal produces a local source of thermal energy, composed of quantized lattice vibrations (phonons). Phonon imaging can be used to examine the propagation and scattering of high-frequency phonons in crystals.

It is interesting to think about this: human beings knew about temperature and thermodynamics long before they discovered quantum mechanics. But now they have to define temperature based on quantum states.

I found all your discussions very stimulating and interesting, so I would like to share with you some of my thoughts, and possibly stir the water even more...

There have been similar discussions on whether thermodynamics (an equilibrium theory) can be applied to small systems of finite size (such as nanostructures), because they are in principle "metastable" (relative to bulk). For example, one intriguing question is whether a finite system (a few atoms) ever melts. If it does, what is its melting temperature?

Temperature, like any other thermodynamic state variable, is a statistical mean (ensemble average) of a certain thermodynamic measure. Independent of system size, such an average can always be performed (as suggested by Zhigang and others). One concern is that fluctuations become increasingly large as the size decreases, so the deviation from the mean value increases, which sometimes makes the analysis questionable. For example, temperature fluctuations make the melting point ill defined.

Technically, in fact, some other thermodynamic quantities are even harder to define for small systems than temperature is. As discussed here, one may always calculate temperature from the Boltzmann distribution of quantum states or by averaging over the kinetic energy of atoms (as in MD simulations), but the volume of a finite system is somewhat "arbitrary", since the position of its boundary is not definite.

So, extending the question beyond temperature, one asks whether thermodynamics can be applied to a small system that can never be truly "equilibrated" due to large fluctuations. Nevertheless, it has been used in practice (in many, many MD simulations), possibly for the following reasons: (1) even though metastable, nanostructures have lifetimes so long that they can be considered at "equilibrium"; (2) in many cases the fluctuations are not large enough to confuse the issue; (3) there is no better way to treat such systems, so we do the best we can for now...
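The growth of fluctuations as systems shrink can be illustrated numerically. The sketch below (a minimal illustration under standard assumptions, not from the discussion) samples the total kinetic energy of an ideal classical gas held at fixed temperature and shows the relative fluctuation falling off like sqrt(2/(3N)):

```python
import random
import statistics
import math

def relative_energy_fluctuation(n_particles, n_samples=1000, seed=1):
    """Sample the total kinetic energy of n_particles ideal-gas particles
    (units with k_B T = 1) and return std(E)/mean(E).
    Canonical-ensemble theory predicts sqrt(2 / (3 N))."""
    rng = random.Random(seed)
    totals = []
    for _ in range(n_samples):
        # 3 momentum degrees of freedom per particle, each contributing v^2 / 2
        e = sum(0.5 * rng.gauss(0.0, 1.0) ** 2 for _ in range(3 * n_particles))
        totals.append(e)
    return statistics.stdev(totals) / statistics.mean(totals)

for n in (10, 100, 1000):
    print(n, relative_energy_fluctuation(n), math.sqrt(2 / (3 * n)))
```

At N = 10 the energy wanders by roughly a quarter of its mean value, which is why quantities defined through such averages become questionable at that scale.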

I'm not sure entropy and temperature are concepts for a continuum. Let us look at the entropy first. For an isolated system, the entropy is simply the logarithm of the number of quantum states of the system:

S = logΩ.

For example, a hydrogen atom can be an isolated system, and we can speak of its entropy. The values of its entropy are listed elsewhere in this thread.

A more general definition seems to belong to Gibbs. I described this definition in my notes on entropy. As I remarked in my response to Sukumar a while back, one can define entropy simply as a property of any probability distribution. Thus, one can speak of the entropy of rolling a die, or entropy of whether tomorrow will rain, for example.

For an isolated system, the fundamental postulate says that every quantum state is equally probable. Then Gibbs's definition is reduced to S = logΩ. This much I did in my notes. What I didn't have time to do in the notes was to develop this line of reasoning into a useful idea. In any case, the entropy has nothing to do with continuum.
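Gibbs's definition referred to above, the entropy of any probability distribution, reduces to S = log Ω when all Ω outcomes are equally probable. A minimal sketch (the die probabilities are illustrative):

```python
from math import log

def gibbs_entropy(probs):
    """S = -sum p_i ln p_i over outcomes with nonzero probability."""
    return -sum(p * log(p) for p in probs if p > 0)

# A fair six-sided die: uniform distribution over 6 outcomes,
# so the entropy reduces to ln(6), as for an isolated system.
fair_die = [1 / 6] * 6
print(gibbs_entropy(fair_die))  # ln(6), about 1.79

# A loaded die is less uncertain, so its entropy is lower.
loaded_die = [0.5, 0.1, 0.1, 0.1, 0.1, 0.1]
print(gibbs_entropy(loaded_die) < gibbs_entropy(fair_die))  # True
```

Nothing in the definition refers to a continuum, or even to physics: only a probability distribution is needed.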

We have talked about temperature extensively in this thread of discussion. In particular, one way to hold a small system at a fixed temperature is to allow it to exchange energy with a reservoir of energy. In this picture, one does not speak of the temperature of the system, but speaks of the fluctuation of the system among various quantum states.

As an ancient example of a small system, consider a vacancy in a crystal lattice. We hold the vacancy to a fixed temperature simply by holding the crystal at a fixed temperature. The vacancy can trade energy with the crystal. We can then speak of various states this vacancy is in, e.g., whether it is energetic enough to move to another lattice site.

Perhaps a key to applying thermal physics to small systems is to carefully define an excess quantity. For it is not obvious how to privatize energy to a vacancy. Which part of the energy belongs to the crystal, and which part of the energy belongs to the vacancy?

Although temperature and entropy can be defined for a small system (even for a single electron or a single atom), in practice statistical mechanics is used as a bridge between the behavior of macroscopic matter and the laws of nature governing the microscopic dynamics of its constituents.

I checked on developments in nanotechnology, and found that the world's smallest thermometer uses a single carbon nanotube filled with liquid gallium. (My initial question and argument seem funny here :-)

Liquid gallium is used in the nanothermometer because its volume changes linearly with temperature - the column rises and falls in the tube at a consistent rate as the tube is exposed to different temperatures. It maintains this consistency between 50 and 500 °C.

The instrument is so sensitive that it can measure the temperature changes that occur when small groups of molecules react with each other.
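As a sketch of how such a linear calibration works: given the column height at two known temperatures, any reading maps linearly onto the 50-500 °C range. The heights below are invented for illustration; only the linear range comes from the report.

```python
def gallium_temperature(height, h_at_50, h_at_500):
    """Linear calibration for a hypothetical nanothermometer reading:
    column height maps linearly onto the reported 50-500 C range."""
    return 50.0 + (height - h_at_50) * (500.0 - 50.0) / (h_at_500 - h_at_50)

# Hypothetical readings: column at 120 nm corresponds to 50 C, 480 nm to 500 C.
print(gallium_temperature(300.0, 120.0, 480.0))  # 275.0
```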

Physicists have made a bizarre discovery: the concept of temperature is meaningless in some tiny objects. Although the concept of temperature is known to break down on the scale of individual atoms, research now suggests that it may also fail to apply in rather larger entities, such as carbon nanotubes.

The blossoming field of nanotechnology relies on being able to manipulate materials that are made from just a few thousand atoms. Carbon nanotubes, for example, are tiny cylinders that could be used to make miniature electronic devices.

Ortwin Hess from the University of Surrey, Guildford, UK, and colleagues say that if you took the temperature at one end of a 10-micrometre nanotube, it would not necessarily be the same as the temperature at the other end, no matter how long the tube was left to reach thermal equilibrium. Such a nanotube is about as long as a sheet of paper is thick.

I read the first parts of the book Thermal Physics by C. Kittel at the suggestion of Prof. Suo. I post this comment to explain what I learned from that book.

First, thermal contact plays a significant role in the definitions of temperature and entropy. For two systems, we can obtain the total degeneracy of all the accessible configurations after thermal contact. If the number of particles in at least one of the two systems is very large, the number of total configurations can be replaced by the number of states in the most probable configuration. Only in this case is the additivity of the entropy valid.

As defined in the lectures of Prof. Suo: 1/T = change in the logarithm of the number of quantum states divided by the change in the energy of the system, everything else being fixed. Temperature is introduced to describe the equilibrium state of two systems under thermal contact. Note that this equilibrium state is just the most probable configuration. The formula for T is likewise derived from the maximum of the total degeneracy of all accessible configurations. In this sense, we can think of T as corresponding to the most probable configuration. However, states other than the most probable configuration can be neglected only when the number of particles in at least one of the two systems is very large. If the two systems are both small, then many states other than the most probable configuration occur, and these are not represented by the temperature.

Thus, for a small system with only a few atoms, we can define its temperature by letting it contact a very large system. However, when we bring two small systems into contact, how can we obtain the final temperature of the combined system, even if we know the temperatures of both before contact? If we bring them into contact with a large system, this may destroy the states of the real systems and simply force them to the temperature of the large system itself.
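The sharpening of the most probable configuration with system size can be checked by direct enumeration. The sketch below uses two Einstein solids as a model (an illustrative choice; the discussion above is more general) and computes how much probability sits near the equal energy split:

```python
from math import comb

def omega(n_osc, q):
    """Microstates of an Einstein solid: n_osc oscillators sharing q quanta."""
    return comb(q + n_osc - 1, q)

def contact_distribution(n_a, n_b, q_total):
    """Probability of each split of q_total quanta between solids A and B in
    thermal contact (fundamental postulate: all microstates equally likely)."""
    weights = [omega(n_a, qa) * omega(n_b, q_total - qa)
               for qa in range(q_total + 1)]
    total = sum(weights)
    return [w / total for w in weights]

# Fraction of probability within 10% (of the total energy) of the equal split:
# modest for small systems, dominant for larger ones.
for size in (3, 30, 300):
    q_total = 2 * size
    p = contact_distribution(size, size, q_total)
    near = sum(pi for qa, pi in enumerate(p) if abs(qa - size) <= 0.1 * q_total)
    print(size, round(near, 3))
```

For two three-oscillator solids, the most probable split carries only about a fifth of the probability, which is exactly why a single temperature describes two small systems in contact so poorly.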

I am not sure whether my understanding is reasonable, but I hope it can be of some help for this topic.

For an isolated ergodic Hamiltonian system with kinetic energy quadratic in the momenta (all standard assumptions), there is a clear definition of the temperature based on the derived result of equipartition. This does not require the notion of thermal contact for the definition, and an entropy for such a system can be defined (non-extensive) to be consistent with the macroscopic thermodynamic relation between reciprocal temperature and the derivative of the entropy w.r.t energy.

Now, how well this definition corresponds with the 'level of hotness' we physically perceive, and think we measure and understand as temperature, I have not decided in my mind with certainty (I think the question may be undecidable), but there is a clear definition to hang on to without involving thermal contact.

From having skimmed over this voluminous thread, it seems that many here do not like this definition (most must be aware of it since this is a standard notion?).

I summarize some basic statistical mechanics in the microcanonical ensemble, all learnt from the book by Berdichevsky referred to in the paper.

I recommend the book highly to students of mechanics - Berdichevsky, being a student of Sedov's and having degrees in both solid and fluid mechanics, writes like a mechanician, and after a long search I finally found a book where the fundamentals are laid out clearly - no philosophy. He also makes the connection between statistical mechanics in the microcanonical and canonical distributions in the limit of large N, and there are nice connections about when fluctuations become important, a fact that can be captured by working in the microcanonical setting.

Some of the Manchester foundation year students will join in this discussion. Welcome!

Foundation year project title: Thermodynamics of nanoscale small systems

This project explores the thermodynamics of nanoscale systems and its possible applications. For a system down to the scale of relatively few (say fewer than 100,000) atoms, which is possible with today's technology, one cannot even define the temperature. Is this true or not? This is a bizarre discovery: some concepts that hold for large systems, like temperature, are meaningless in some tiny objects.

I skimmed over much of this discussion, but I would suggest you read up on the work of the pioneers in this field, e.g. Terrell Hill, Ali Mansoori, and possibly Gian Beretta's thermodynamics textbook and his work on quantum thermodynamic variables for single-particle systems, if I remember correctly. Here's a starter page:

The purpose of this paper is to point out (and to use) the relations of certain statistical concepts with "statistical" thermodynamics.

(A) It is observed that Gibbs's "canonical distribution" of energy is precisely what statisticians have later labeled a "distribution of the exponential type". It follows that a rigorous treatment of the canonical law can be based upon the concept of "sufficiency", which is thereby related to the physical idea of "thermal equilibrium" and to the "zero-th principle of thermodynamics". In other words, the theory of physical fluctuations can be based upon "principles" very similar to those of the "phenomenological", or "classical, non-statistical" thermodynamics. Naturally, our results will be less detailed than those of statistical mechanics. However, the foundations of the latter theory still raise a host of unanswered problems, and it seems good in the meantime to show that the less powerful phenomenological theory has a wider scope than is commonly thought (see also [15]). The possibility of a purely phenomenological approach to statistical thermodynamics is not in itself a new idea. A procedure somewhat similar to ours has indeed been long ago suggested in Szilard's admirable, but very difficult and neglected, paper [18]--not to be confused with his [19]. Of course, Szilard used a quite different vocabulary; but, with hindsight, one may now say that he has co-invented the concept of sufficiency with R. A. Fisher; by showing that, under certain regularity conditions, Gibbs's canonical law is the only probability distribution with a single scalar sufficient statistic, Szilard also anticipated the results of G. Darmois [2], B. O. Koopman [10] and E. J. G. Pitman [16], but was partly anticipated by Poincare [17].

(B) The second thesis of the paper is independent of Szilard, and concerns the concept of temperature. For systems with a canonical energy, the temperature is the parameter of the Gibbs distribution; as such it is undefined for isolated systems with a determined energy. However, it is necessary to generalize the concept of temperature to isolated systems. Several definitions have been proposed and, although they all safely converge mutually for the usual very large systems, the temperature remains mathematically ambiguous for small isolated systems; it also becomes physically meaningless. We shall show that the temperature for systems-in-isolation should be viewed as a statistical estimate of the parameter of a conjectural canonical distribution, from which the presently isolated system may be presumed to have once been drawn. This interpretation explains the nature of the ambiguity of the concept of temperature; it also meets the actual practice of physicists; finally, some of the a priori conditions, which the physicists impose upon their "estimators", turn out to correspond to the statistical conditions of consistency, unbiasedness, and efficiency. Physicists also use two very interesting variants of consistency and unbiasedness, which we shall study under the names of "self-consistency" and "self-unbiasedness". The most commonly used temperature, due to Ludwig Boltzmann, turns out to be the maximum likelihood estimator. In summary, we hope to show that it is a great pity that mathematical and physical statistics should have developed largely independently of each other, while using the same concepts. By combining the rigor of modern statistics with the intuitive vigor of thermodynamics, both should be served well. However, as things stand, the mathematical statistician should not hope to unearth in the literature of physics any result as yet unknown to him. An important open problem suggested by this paper is the following. When sufficiency and estimation are defined in the most general terms, it seems that one should also be able to generalize the scope of thermodynamics.
However, an approach such as that of P. R. Halmos and L. J. Savage [5] could not be applied to thermodynamics without substantial restrictions, as we shall show in Section 7. It remains to study these restrictions in greater detail, before one can assert that a non-void generalization of thermodynamics is possible. The problem is addressed to both mathematicians and physicists. We shall strive to reduce to the minimum the detailed knowledge of physics required to read this paper. If the reader's appetite for information about thermodynamics has been awakened, he could do no better than to make use of references [11] and [20].