Tag Info

When you say "why aren't things being destroyed", you presumably mean "why aren't the chemical bonds that hold objects together being broken". Now, we can determine the energy it takes to break a bond - that's called the "bond energy". Let's take, for example, a carbon-carbon bond, since it's a common one in our bodies.
The bond energy of a carbon-carbon ...
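To put numbers on this, here is a quick back-of-the-envelope comparison (the 347 kJ/mol C-C bond energy is a standard textbook value, used here purely for illustration):

```python
# Compare a typical C-C bond energy to the thermal energy scale k_B*T.
# The 347 kJ/mol figure is a standard textbook value, used only for illustration.
AVOGADRO = 6.022e23          # 1/mol
K_B = 1.381e-23              # J/K

bond_energy_j = 347e3 / AVOGADRO   # J per bond, ~5.8e-19 J (~3.6 eV)
thermal_energy_j = K_B * 300       # J at room temperature, ~4.1e-21 J

ratio = bond_energy_j / thermal_energy_j
print(f"bond energy / k_B T ~ {ratio:.0f}")   # ~139: thermal kicks rarely break bonds
```

The large ratio is the point: a typical thermal fluctuation carries far too little energy to snap a covalent bond.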

Let's suppose you have a Rubik's cube that's made of a small number of atoms at a low temperature, so that you can make moves without any frictional dissipation at all, and let's suppose that the cube is initialised to a random one of its $\sim 2^{65}$ possible states. Now if you want to solve this cube you will have to measure its state. In principle you ...
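For scale, the bit count behind that $\sim 2^{65}$ figure can be checked from the cube's known configuration count (a quick illustrative computation, not part of the original argument):

```python
import math

# Number of reachable configurations of a 3x3x3 Rubik's cube (known exact count).
N_STATES = 43_252_003_274_489_856_000

bits = math.log2(N_STATES)           # information needed to pin down the state
print(f"log2(states) = {bits:.2f}")  # ~65.2 bits, hence the ~2^65 in the text

# Minimum Landauer cost to erase that measurement record afterwards, at temperature T:
K_B = 1.381e-23  # J/K
T = 1.0          # K ("low temperature", as in the thought experiment)
print(f"erasure cost >= {bits * K_B * T * math.log(2):.2e} J")
```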

The resolution of Maxwell's demon paradox is mostly understood to come through Landauer's principle, and it is one of the most compelling applications of information science to physics. Landauer's principle asserts that erasing information from a physical system always requires performing work, and in particular requires at least $$k_B T \ln(2)$$
of ...
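To get a feel for the size of this bound, here is a quick evaluation at room temperature (illustrative numbers only):

```python
import math

K_B = 1.381e-23  # Boltzmann constant, J/K
T = 300.0        # K, room temperature

# Landauer bound: minimum work to erase one bit of information at temperature T.
landauer_per_bit = K_B * T * math.log(2)
print(f"k_B T ln 2 = {landauer_per_bit:.2e} J per bit erased")  # ~2.87e-21 J
```

The bound is many orders of magnitude below the switching energies of real electronics, which is why it took so long for it to matter in practice.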

The partition function is strongly related to a very useful tool in probability theory called the moment generating function(al) of the probability distribution.
For any probability distribution $p$ of some random variable $X$, the generating function $\mathcal{M}(z)$ is defined as being:
\begin{equation}
\mathcal{M}(z) \equiv \langle e^{zX}\rangle
...
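As a sanity check on the definition, for a standard normal $X$ the closed form is $\mathcal{M}(z) = e^{z^2/2}$; a Monte Carlo estimate (an illustrative sketch, not from the answer) reproduces it:

```python
import math
import random

# Monte Carlo estimate of the moment generating function M(z) = <e^{zX}>
# for a standard normal X, compared against the known closed form e^{z^2/2}.
random.seed(0)
samples = [random.gauss(0.0, 1.0) for _ in range(200_000)]

z = 0.5
mgf_estimate = sum(math.exp(z * x) for x in samples) / len(samples)
mgf_exact = math.exp(z**2 / 2)     # = e^{0.125} ~ 1.133

print(f"estimate {mgf_estimate:.3f} vs exact {mgf_exact:.3f}")
```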

$\hbar$ does not need to appear in classical statistical mechanics. You are free to replace it with any quantity with units of angular momentum, say $\hbar_{\mathrm{C}}$. As long as this is chosen smaller than the size you can experimentally probe (i.e., as long as you don't ask questions of the theory that contain structure on this length scale or below) ...

A footnote on Wikipedia actually explains this.
The $h$ appearing in the partitioning of the phase space is not Planck's constant, it is simply the size of a phase space cell in which we do not distinguish single states anymore. It's initially arbitrary. Since Planck's constant provides a natural scale below which we should not apply classical thinking, it ...
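One way to see the arbitrariness concretely is the Sackur-Tetrode entropy of an ideal gas: changing the cell size $h$ only shifts $S$ by an additive constant, so entropy differences are unchanged. A sketch (the helium mass and particle number are illustrative choices):

```python
import math

K_B = 1.381e-23   # J/K
M = 6.6e-27       # kg, roughly a helium atom (illustrative)
T = 300.0         # K
N = 1.0e22        # number of atoms (illustrative)

def sackur_tetrode(V, h):
    """Ideal-gas entropy with phase-space cell size h (not necessarily Planck's h)."""
    lam3 = (h**2 / (2 * math.pi * M * K_B * T)) ** 1.5   # thermal de Broglie volume
    return N * K_B * (math.log(V / (N * lam3)) + 2.5)

H_PLANCK = 6.626e-34
H_OTHER = 1.0e-30   # a different, arbitrary cell size

# The entropy *difference* on doubling the volume is the same for any choice of h:
dS_planck = sackur_tetrode(2e-3, H_PLANCK) - sackur_tetrode(1e-3, H_PLANCK)
dS_other = sackur_tetrode(2e-3, H_OTHER) - sackur_tetrode(1e-3, H_OTHER)
print(dS_planck, dS_other)   # both equal N * k_B * ln 2
```

Only the absolute value of $S$ remembers the choice of $h$; anything measurable (a difference) does not.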

Here's an intentionally more conceptual answer: Entropy is the smoothness of the energy distribution over some given region of space. To make that more precise, you must define the region, the type of energy (or mass-energy) considered sufficiently fluid within that region to be relevant, and the Fourier spectrum and phases of those energy types over that ...

Unitarity of quantum mechanics prohibits information destruction. On the other hand, the second law of thermodynamics claims entropy to be increasing. If entropy is to be thought of as a measure of information content, how can these two principles be compatible?
I don't think there's anything inherently quantum-mechanical about this paradox. The same ...

Frustrated total internal reflection is an optical phenomenon. It's such a close analogue to quantum tunneling that I sometimes even explain it to people as "quantum tunneling for photons". But you can calculate everything about it using classical Maxwell's equations.

The reasoning in the question is correct. If you have a box with gas particles placed in one half of the box but otherwise uniformly random and with random velocities, then it is overwhelmingly likely that its entropy will increase with time; but if you reverse the velocities, you will still have randomly distributed velocities, and the same argument will apply. By time ...

The partition function contains so much information because it is directly related to the free energy,
$$F = - k_B T \ln(Z) \, .$$
The physical assumption behind considering $F$ as a thermodynamic potential is that the statistics of the system are described by the canonical ensemble.
In turn, the applicability of the canonical ensemble is a direct ...
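As a concrete illustration (a two-level system, chosen here for simplicity), $F = -k_B T \ln Z$ reproduces the usual identities $S = -\partial F/\partial T$ and $U = F + TS$ numerically:

```python
import math

K_B = 1.0   # work in units where k_B = 1
EPS = 1.0   # energy gap of a two-level system (illustrative)

def free_energy(T):
    Z = 1.0 + math.exp(-EPS / (K_B * T))   # states at energies 0 and EPS
    return -K_B * T * math.log(Z)

# Check S = -dF/dT (finite difference) and U = F + T*S against the
# direct canonical-ensemble average <E>.
T = 2.0
dT = 1e-6
S = -(free_energy(T + dT) - free_energy(T - dT)) / (2 * dT)
U = free_energy(T) + T * S
p_excited = math.exp(-EPS / T) / (1.0 + math.exp(-EPS / T))
print(U, EPS * p_excited)   # the two values agree
```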

$h$ factor
The factor of $1/h^{3N}$ is a total hack.
The integral over phase space has dimensions, whereas $Z$ only makes sense if it's dimensionless.
The $h$ factors are there to make $Z$ dimensionless.
Suppose you have a system with only one particle in one dimension.
Then the integral in phase space goes over one position variable, $dq$ and one momentum ...
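The bookkeeping can be checked on a one-dimensional classical harmonic oscillator (an example chosen here for concreteness): the two Gaussian integrals together carry dimensions of action, and dividing by $h$ leaves the pure number $k_B T/\hbar\omega$.

```python
import math

# Classical partition function of a 1D harmonic oscillator:
#   Z = (1/h) * Int dq dp exp(-beta*(p^2/2m + m w^2 q^2/2))
# Each Gaussian integral is analytic; the product has dimensions of action,
# and the 1/h cancels them, leaving the dimensionless number k_B T / (hbar w).
H = 6.626e-34
HBAR = H / (2 * math.pi)
K_B = 1.381e-23
M = 1.0e-26        # kg (illustrative mass)
W = 1.0e12         # rad/s (illustrative frequency)
T = 300.0          # K

beta = 1.0 / (K_B * T)
gauss_p = math.sqrt(2 * math.pi * M / beta)            # momentum integral
gauss_q = math.sqrt(2 * math.pi / (beta * M * W**2))   # position integral
Z = gauss_p * gauss_q / H

print(Z, K_B * T / (HBAR * W))   # identical dimensionless numbers
```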

There is quite a big controversy these days about the correct definition of the entropy in the microcanonical ensemble (the debate between the Gibbs and Boltzmann entropy), which is closely related to the question.
Everyone agrees that the correct definition of the density matrix is given by
$$\rho(E)=\frac{\delta(E-H)}{\omega(E)},$$
where $H$ is the ...

Here is a proof following Ojima, "Lorentz Invariance vs. Temperature in QFT", Letters in Mathematical Physics (1986) Vol. 11, Issue 1 (1986) 73-80. The first two pages of the paper are available for free here, but the website wants money for more of the paper. (Click the orange "Look Inside" button if the paper doesn't open automatically.) Fortunately, the ...

Standard Monte Carlo samples the canonical (NVT) ensemble. So it maintains constant temperature but the potential energy is free to fluctuate - both up and down.
This will only seem odd if you incorrectly imagine the equilibrium state of a system to correspond to that with the minimum energy. The equilibrium state is actually determined by the minimum free ...
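A minimal Metropolis sketch (a single particle in a harmonic well, purely illustrative) shows the potential energy fluctuating both up and down while the temperature stays fixed:

```python
import math
import random

random.seed(1)
T = 1.0                           # temperature, in units where k_B = 1
energy = lambda x: 0.5 * x * x    # single particle in a harmonic well (illustrative)

x, samples = 0.0, []
for _ in range(50_000):
    x_new = x + random.uniform(-1.0, 1.0)
    dE = energy(x_new) - energy(x)
    # Metropolis rule: downhill moves always accepted, uphill moves accepted
    # with probability e^{-dE/T} -- so the energy fluctuates in both directions.
    if dE <= 0.0 or random.random() < math.exp(-dE / T):
        x = x_new
    samples.append(energy(x))

mean_E = sum(samples) / len(samples)
print(f"<E> ~ {mean_E:.3f}")   # close to T/2 = 0.5, as equipartition predicts
```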

The entropy of a system is the amount of information needed to specify the exact physical state of a system given its incomplete macroscopic specification. So, if a system can be in $\Omega$ possible states with equal probability then the number of bits needed to specify in exactly which one of these $\Omega$ states the system really is in would be ...
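In symbols, the bit count is $\log_2 \Omega$ and the thermodynamic entropy is $k_B \ln \Omega$; a toy check:

```python
import math

K_B = 1.381e-23  # J/K

omega = 2**10    # toy system with 1024 equally likely microstates
bits = math.log2(omega)      # bits needed to single out the actual microstate
S = K_B * math.log(omega)    # the same information, in thermodynamic units

print(bits, S)               # 10.0 bits, and S = 10 * k_B * ln 2
```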

In terms of the temperature, the entropy can be defined as
$$
\Delta S=\int \frac{dQ}{T}\tag{1}
$$
which, as you note, is really a change of entropy and not the entropy itself. Thus, we can write (1) as
$$
S(x,T)-S(x,T_0)=\int_{T_0}^{T}\frac{dQ(x,T')}{T'}\tag{2}
$$
But, we are free to set the zero-point of the entropy to anything we want (so as to make it convenient)1, ...
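For a concrete case (heating an ideal gas at constant volume, an example chosen here for illustration), $dQ = C_V\,dT$ and (1) integrates to $\Delta S = C_V \ln(T_1/T_0)$; numerical quadrature agrees:

```python
import math

C_V = 1.5        # heat capacity in units of n*R (monatomic ideal gas, constant V)
T0, T1 = 200.0, 400.0

# Numerically integrate dS = dQ/T = C_V dT / T from T0 to T1 (midpoint rule).
n_steps = 10_000
dT = (T1 - T0) / n_steps
dS_numeric = sum(C_V * dT / (T0 + (i + 0.5) * dT) for i in range(n_steps))
dS_exact = C_V * math.log(T1 / T0)

print(dS_numeric, dS_exact)   # both ~ 1.5 * ln 2 ~ 1.04
```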

I can see different subtleties in Landau's argument. First of all, it isn't entirely clear what is meant by "there are only seven additive constants of motion". To give an example, consider a single particle hamiltonian: $$H=\frac{\mathbf p^2}{2m}+m\omega ^2\frac{\mathbf q ^2}{2}.$$
For this hamiltonian there are several conserved quantities: $$e(\mathbf ...

The notion of temperature is all about how the equilibrium of an otherwise isolated system shifts when the system's internal energy changes. So you do not need to worry about whether this internal energy is kinetic, potential, whatever.
Actually the temperature is not quite the ensemble average kinetic energy. Your statement is true for an ideal gas and also ...

In some sense yes. The temperature is defined as an imaginary time in Matsubara Green's functions or some path integrals. Thus, a negative inverse imaginary temperature can be considered as a time. Here is a quotation from Alexander Altland, Ben Simons "Condensed Matter Field Theory":
"Thus, real time dynamics and quantum statistical mechanics can be ...

This phenomenon has been studied by Vella and Mahadevan and written up in the American Journal of Physics (http://scitation.aip.org/content/aapt/journal/ajp/73/9/10.1119/1.1898523). It's called the Cheerios effect.
If the cereal pieces clump together away from the edges of the bowl, they gravitate toward a slight concavity in the surface caused by water ...

Yes, neutrinos should obey Fermi-Dirac statistics and yes, the Pauli Exclusion Principle should operate for neutrinos.
But let's examine how dense the neutrino population has to be for this to be important.
The Fermi momentum is given by
$$ p_F = \left( \frac{3}{8\pi}\right)^{1/3} h n_{\nu}^{1/3} $$
where $n_{\nu}$ is the neutrino number density.
In order to be ...
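Plugging in numbers (the density of roughly 100 relic neutrinos per cm³ per species is a commonly quoted cosmological figure, used here only for scale):

```python
import math

H = 6.626e-34    # Planck's constant, J s
C = 3.0e8        # speed of light, m/s
EV = 1.602e-19   # J per eV

# Illustrative number density: ~100 relic neutrinos per cm^3 per species,
# a commonly quoted cosmological figure, used here only for scale.
n_nu = 1.0e8     # m^-3

p_F = (3.0 / (8.0 * math.pi)) ** (1.0 / 3.0) * H * n_nu ** (1.0 / 3.0)
E_F = p_F * C / EV   # relativistic estimate of the Fermi energy, in eV

print(f"p_F ~ {p_F:.2e} kg m/s, E_F ~ {E_F:.1e} eV")  # E_F ~ 3e-4 eV: tiny
```

The resulting Fermi energy is far below any laboratory energy scale, which is why degeneracy effects for relic neutrinos are so hard to notice.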

There are of course many books out there, quantum statistics is a really well-established field, so regardless of suggestions here you should really look further on your own as well and find one that suits you best. But here are a few that I've used in the past that you may find useful:
Statistical mechanics: A survival guide by Mike Glazer and Justin ...

Simply giving the value of $k_B T$ is generally more useful, because I can plug that into anything. Sure, I might need to know the ideal gas energy, and multiply by $3/2$. But maybe I need to put it into a partition function, and I just need $k_B T$. Maybe I'm worried about a harmonic oscillator and I just have the two degrees of freedom. The 3/2 is ...

Here's an elegant way to show that any linear combination of (anti)symmetric states is always (anti)symmetric. We use Dirac notation here for the states, and we assume, for simplicity, that we are dealing with a two-component system so that states of the system are linear combinations of products $|\psi\rangle = |\psi_1\rangle|\psi_2\rangle$.
First, we ...

Ultimate physical motivation
Strictly in the sense of physics, the entropy is less free than it might seem. It always has to provide a measure of the energy released from a system that is not captured by the macroscopic parameters, i.e. it has to be subject to the relation
$${\rm d}U = {\rm d}E_{macro} + T {\rm d} S$$
It has to carry all the forms of energy that cannot be ...

There are really two questions here, one about the definition of distribution functions and one about the derivation of the BBGKY hierarchy. I will address them in turn.
Definitions
Let's define, for convenience, $\mathbf{r}^n = \mathbf{r}_1,\dots,\mathbf{r}_n$ and $\mathbf{r}^{(N-n)} = \mathbf{r}_{n+1},\dots,\mathbf{r}_N$.
Next, let's denote the ...

Prahar is correct that generally we have an energy contribution of ${1 \over 2} kT$ per degree of freedom in a system - so that atoms in a gas of atoms (e.g. Helium) will have an average energy of ${3 \over 2} kT$.
Often people talk about thermal energy being '$kT$' because of the exponential expression in
$N_i = N_0 {g_i \over g_0} e^{-{E_i \over kT}}$
...
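For a feel of how fast that exponential suppresses occupation, a quick evaluation (with $g_i = g_0$ for simplicity):

```python
import math

# Relative occupation N_i/N_0 = (g_i/g_0) * exp(-E_i / kT), here with g_i = g_0,
# evaluated for a few values of E_i in units of kT.
for e_over_kT in (0.5, 1.0, 2.0, 5.0):
    print(e_over_kT, math.exp(-e_over_kT))
```

At $E_i = kT$ the occupation is already down to about 37% of the ground state, which is the sense in which $kT$ marks the "thermally accessible" energy scale.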