We can only approach absolute zero asymptotically because we can't suck heat out of a system. The only way we can get heat out is to place our system in contact with something cooler and let the heat flow from hot to cold as it usually does. Since there is nothing colder than absolute zero, we can never get all the heat to flow out of a system.
We can ...

Let's suppose you have a Rubik's cube that's made of a small number of atoms at a low temperature, so that you can make moves without any frictional dissipation at all, and let's suppose that the cube is initialised to a random one of its $\sim 2^{65}$ possible states. Now if you want to solve this cube you will have to measure its state. In principle you ...

The separation does not violate the 2nd law of thermodynamics, because the separated oil and water phases form a lower-energy state.
The water molecules strongly interact with each other, forming hydrogen bonds. The protons of water are shared between two oxygen atoms of two different water molecules, forming a constantly changing network of molecules. ...

The resolution to Maxwell's demon paradox is mostly understood to be through Landauer's principle, and it is one of the most compelling applications of information science to physics. Landauer's principle asserts that erasing information from a physical system will always require performing work, and in particular will require at least $$k_B T \ln(2)$$
of ...
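The Landauer bound quoted above is easy to evaluate numerically. Here is a minimal sketch, assuming room temperature (300 K) and the SI value of the Boltzmann constant:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K (exact SI value)

def landauer_limit(T):
    """Minimum heat dissipated per bit of information erased, in joules."""
    return k_B * T * math.log(2)

E = landauer_limit(300.0)  # assumed room temperature
print(f"{E:.3e} J per bit erased")  # about 2.871e-21 J
```

The number is tiny per bit, which is why the bound matters only for systems operating near the thermodynamic limit of computation.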

First, strictly speaking a neutron star is not a nucleus since it is bound together by gravity rather than the strong force.
Measuring a surface temperature for any star is deceptively simple. All that is needed is a spectrum, which gives the luminous flux (or similar quantity) as a function of photon wavelength. There will be a broad thermal peak somewhere ...
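The location of that thermal peak already pins down the temperature via Wien's displacement law, $\lambda_{\rm peak} T = b$. A minimal sketch (the 502 nm solar peak is an illustrative input):

```python
WIEN_B = 2.897771955e-3  # Wien displacement constant, m·K

def temperature_from_peak(lambda_peak_m):
    """Surface temperature implied by the wavelength of the thermal peak."""
    return WIEN_B / lambda_peak_m

# The Sun's spectrum peaks near 502 nm:
T_sun = temperature_from_peak(502e-9)
print(f"{T_sun:.0f} K")  # about 5772 K
```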

If the particles are not point-like, they will take up some volume. As the gas is compressed, the collision frequency will rise more quickly, which will make the pressure-volume curve change. The corrections in the Van der Waals model of a real gas account for the volume of the particles. Also if they have internal structure, that structure can have ...
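The Van der Waals correction mentioned above can be compared against the ideal-gas law directly. A sketch, using approximate literature values of the CO2 constants $a$ and $b$ (illustrative, not authoritative):

```python
R = 8.314462618  # gas constant, J/(mol·K)

def p_ideal(n, V, T):
    """Ideal gas: p V = n R T."""
    return n * R * T / V

def p_vdw(n, V, T, a, b):
    """Van der Waals: (p + a n^2/V^2)(V - n b) = n R T, solved for p."""
    return n * R * T / (V - n * b) - a * n**2 / V**2

# Approximate Van der Waals constants for CO2:
a_co2 = 0.3640    # Pa·m^6/mol^2  (attraction)
b_co2 = 4.267e-5  # m^3/mol       (excluded volume)

n, V, T = 1.0, 1e-3, 300.0  # 1 mol in 1 L at 300 K
print(p_ideal(n, V, T), p_vdw(n, V, T, a_co2, b_co2))
```

At this density the attractive term dominates, so the real-gas pressure comes out below the ideal-gas value; at much higher compression the excluded-volume term takes over and the curve bends the other way.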

DavePhD's answer explains the specifics. The separation decreases the enthalpy of the oil-water mixture. But there's one more step:
When the enthalpy of the dressing decreases by $\Delta H$, it causes the entropy of the dressing and its surrounding environment to increase by $\Delta H / T$.
The reason is: The decrease in enthalpy releases heat, which ...
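The bookkeeping in $\Delta S = \Delta H / T$ is one line of arithmetic. A sketch with hypothetical numbers (10 J released at 298 K):

```python
def entropy_increase(delta_H, T):
    """Entropy gained by the surroundings when |ΔH| of heat is released at temperature T (J/K)."""
    return delta_H / T

# Hypothetical: the dressing releases 10 J of heat at 298 K.
dS = entropy_increase(10.0, 298.0)
print(f"{dS:.4f} J/K")  # about 0.0336 J/K
```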

Here's an intentionally more conceptual answer: Entropy is the smoothness of the energy distribution over some given region of space. To make that more precise, you must define the region, the type of energy (or mass-energy) considered sufficiently fluid within that region to be relevant, and the Fourier spectrum and phases of those energy types over that ...

Frustrated total internal reflection is an optical phenomenon. It's such a close analogue to quantum tunneling that I sometimes even explain it to people as "quantum tunneling for photons". But you can calculate everything about it using classical Maxwell's equations.

It's intuitive if you are careful in remembering what is held constant.
$$\mathrm{d}U = T\mathrm{d}S - p\mathrm{d}V + \mu \mathrm{d}N$$
and
$$\mu = \left(\frac{\partial U}{\partial N}\right)_{S,V}$$
means that the entropy and the volume are kept constant. Remember that entropy is a measure of how many microstates you have available to realize the ...

There is quite a big controversy these days about the correct definition of the entropy in the microcanonical ensemble (the debate between the Gibbs and Boltzmann entropy), which is closely related to the question.
Everyone agrees that the correct definition of the density matrix is given by
$$\rho(E)=\frac{\delta(E-H)}{\omega(E)},$$
where $H$ is the ...

The reasoning in the question is correct. If you have a box with gas particles placed in one half of the box but otherwise uniformly random, and with random velocities, then it is overwhelmingly likely that its entropy will increase with time; but if you reverse the velocities, you will still have randomly distributed velocities and the same argument will apply. By time ...

Standard Monte Carlo samples the canonical (NVT) ensemble. So it maintains constant temperature but the potential energy is free to fluctuate - both up and down.
This will only seem odd if you incorrectly imagine the equilibrium state of a system to correspond to that with the minimum energy. The equilibrium state is actually determined by the minimum free ...
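The acceptance rule that lets the energy fluctuate both up and down while still sampling the canonical ensemble is the Metropolis criterion. A minimal sketch in reduced units, with an illustrative 1-D double-well energy (not any particular simulation code):

```python
import math
import random

random.seed(0)
k_B = 1.0  # reduced units

def energy(x):
    """Illustrative 1-D double-well potential with minima at x = ±1."""
    return (x**2 - 1.0)**2

def metropolis_step(x, T, step=0.5):
    """One Metropolis move: uphill moves are accepted with probability exp(-dE / k_B T)."""
    x_new = x + random.uniform(-step, step)
    dE = energy(x_new) - energy(x)
    if dE <= 0 or random.random() < math.exp(-dE / (k_B * T)):
        return x_new  # accepted: the energy may have gone UP
    return x          # rejected: stay put

x, T = 1.0, 0.5
energies = []
for _ in range(10000):
    x = metropolis_step(x, T)
    energies.append(energy(x))

# The sampled energy fluctuates; it is not pinned at the minimum E = 0.
print(min(energies), max(energies), sum(energies) / len(energies))
```

The mean sampled energy sits above the potential minimum, exactly as the free-energy argument in the text predicts.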

The entropy of a system is the amount of information needed to specify the exact physical state of a system given its incomplete macroscopic specification. So, if a system can be in $\Omega$ possible states with equal probability then the number of bits needed to specify in exactly which one of these $\Omega$ states the system really is in would be ...
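The counting described above can be sketched numerically: for $\Omega$ equally likely states the information needed is $\log_2 \Omega$ bits, and the corresponding thermodynamic entropy is $S = k_B \ln \Omega$. A minimal illustration:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def bits_needed(omega):
    """Bits needed to single out one of omega equally likely microstates."""
    return math.log2(omega)

def boltzmann_entropy(omega):
    """Thermodynamic entropy S = k_B ln(omega), in J/K."""
    return k_B * math.log(omega)

print(bits_needed(8))        # 3.0 bits for 8 states
print(boltzmann_entropy(8))  # the same count in thermodynamic units
```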

In terms of the temperature, the entropy can be defined as
$$
\Delta S=\int \frac{dQ}{T}\tag{1}
$$
which, as you note, is really a change of entropy and not the entropy itself. Thus, we can write (1) as
$$
S(x,T)-S(x,T_0)=\int\frac{dQ(x,T)}{T}\tag{2}
$$
But, we are free to set the zero-point of the entropy to anything we want (so as to make it convenient)$^1$, ...
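For a concrete instance of the integral in (1): heating $n$ moles of a monatomic ideal gas at constant volume gives $dQ = n C_V\, dT$, so the integral closes in one step. A sketch with illustrative numbers:

```python
import math

R = 8.314462618   # gas constant, J/(mol·K)
C_V = 1.5 * R     # monatomic ideal gas at constant volume

def delta_S(n, T0, T):
    """ΔS = ∫ dQ/T = ∫ n C_V dT/T = n C_V ln(T/T0), in J/K."""
    return n * C_V * math.log(T / T0)

# Heating 1 mol from 300 K to 600 K:
print(delta_S(1.0, 300.0, 600.0))  # about 8.64 J/K
```

Note that only the ratio $T/T_0$ enters, which is exactly the statement that (1) fixes entropy changes but not the zero-point.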

Unitarity of quantum mechanics prohibits information destruction. On the other hand, the second law of thermodynamics claims entropy to be increasing. If entropy is to be thought of as a measure of information content, how can these two principles be compatible?
I don't think there's anything inherently quantum-mechanical about this paradox. The same ...

The notion of temperature is all about how the equilibrium of an otherwise isolated system shifts when the system's internal energy changes. So you do not need to worry about whether this internal energy is kinetic, potential, whatever.
Actually the temperature is not quite the ensemble average kinetic energy. Your statement is true for an ideal gas and also ...

Here's an elegant way to show that any linear combination of (anti)symmetric states is always (anti)symmetric. We use Dirac notation here for the states, and we assume, for simplicity, that we are dealing with a two-component system so that states of the system are linear combinations of products $|\psi\rangle = |\psi_1\rangle|\psi_2\rangle$.
First, we ...

Ultimate physical motivation
Strictly in the sense of physics, the entropy is less free than it might seem. It always has to provide a measure of energy released from a system not graspable by macroscopic parameters. I.e. it has to be subject to the relation
$${\rm d}U = {\rm d}E_{macro} + T {\rm d} S$$
It has to carry all the forms of energy that cannot be ...

Here is a proof following Ojima, "Lorentz Invariance vs. Temperature in QFT", Letters in Mathematical Physics, Vol. 11, Issue 1 (1986), 73-80. The first two pages of the paper are available for free here, but the website wants money for more of the paper. (Click the orange "Look Inside" button if the paper doesn't open automatically.) Fortunately, the ...

A footnote on Wikipedia actually explains this.
The $h$ appearing in the partitioning of the phase space is not Planck's constant, it is simply the size of a phase space cell in which we do not distinguish single states anymore. It's initially arbitrary. Since Planck's constant provides a natural scale below which we should not apply classical thinking, it ...

There are really two questions here, one about the definition of distribution functions and one about the derivation of the BBGKY hierarchy. I will address them in turn.
Definitions
Let's define, for convenience, $\mathbf{r}^n = \mathbf{r}_1,\dots,\mathbf{r}_n$ and $\mathbf{r}^{(N-n)} = \mathbf{r}_{n+1},\dots,\mathbf{r}_N$.
Next, let's denote the ...

It is not possible to have a state with four indistinguishable particles such that $P_{12} \psi = -\psi$ and $P_{34} \psi =\psi$, for an algebraic reason. Namely, the exchange operators have to form a representation of the permutation group $S_4$. It is rather well known that there are exactly two one-dimensional representations of $S_n$: the trivial representation where ...

What is entropy, more than disorder?
Mathematically, entropy is just a measure of spread of a probability distribution: The lower the entropy, the more spiked the distribution.
In statistical mechanics, a state is generally only partially defined via some macroscopic constraints, and entropy is a measure of microscopic indeterminacy. Maximizing entropy ...
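The "spread versus spiked" picture above can be made quantitative with the Shannon entropy of a discrete distribution. A minimal sketch comparing an illustrative spiked distribution against the uniform one:

```python
import math

def shannon_entropy(p):
    """Shannon entropy in bits; lower entropy means a more spiked distribution."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

spiked  = [0.97, 0.01, 0.01, 0.01]   # almost all probability on one outcome
uniform = [0.25, 0.25, 0.25, 0.25]   # maximal indeterminacy over 4 outcomes

print(shannon_entropy(spiked))   # well under 1 bit
print(shannon_entropy(uniform))  # 2 bits, the maximum for 4 outcomes
```

Maximizing this quantity subject to the macroscopic constraints is exactly the maximum-entropy prescription the text describes.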

The definition of temperature from kinetic theory gives temperature as a function of the average energy of the massive particles, not necessarily their speed (this is how you can have photon gases with different temperatures). If the particles are point particles with no internal structure, that would imply that there is no upper bound on the temperature ...
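For point particles the kinetic-theory relation is $\langle E_k \rangle = \tfrac{3}{2} k_B T$ per particle, so temperature and mean energy are interconvertible. A minimal sketch:

```python
k_B = 1.380649e-23  # Boltzmann constant, J/K

def temperature_from_mean_ke(mean_ke):
    """Invert <KE> = (3/2) k_B T for point particles; mean_ke in joules."""
    return 2.0 * mean_ke / (3.0 * k_B)

# Round-trip check: mean translational KE of a particle at 300 K...
ke_room = 1.5 * k_B * 300.0
print(temperature_from_mean_ke(ke_room))  # ...recovers 300 K
```

Because the mean energy has no upper bound for structureless point particles, neither does the temperature it defines, which is the point the text is making.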

I believe it was Boltzmann who first made the connection between entropy and micro states. Chapter 12 of "Classical and Statistical Thermodynamics" by Ashley H. Carter discusses Boltzmann's arguments. To summarize from that book:
Entropy ($S$) corresponds to a particular configuration of an ensemble of particles called a macro state. A macro state can be ...

No. You wouldn't say that pair of beams has a temperature. Temperature is defined by the zeroth law of thermodynamics, which states that if $A$ is in thermal equilibrium with $B$ and $B$ is in thermal equilibrium with $C$ then $A$ is in thermal equilibrium with $C$ and $A$, $B$ and $C$ are said to have the same temperature.
Temperature is fundamentally a ...

Negative temperatures are only defined for systems with a limited number of energy states. Consider raising the temperature of such a system: as the temperature starts rising, particles begin to move into higher energy states, and as the temperature continues rising, the number of particles in the lower energy states and in the higher energy ...
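For a two-level system this becomes concrete through the Boltzmann ratio $n_{\rm high}/n_{\rm low} = e^{-\Delta E / k_B T}$: solving for $T$ gives a negative value as soon as the upper level is the more populated one. A sketch in reduced units with illustrative populations:

```python
import math

def temperature(n_low, n_high, delta_E=1.0, k_B=1.0):
    """Temperature of a two-level system from its level populations (reduced units).

    From n_high/n_low = exp(-delta_E / (k_B T)):
        T = -delta_E / (k_B * ln(n_high / n_low))
    """
    return -delta_E / (k_B * math.log(n_high / n_low))

print(temperature(0.8, 0.2))  # normal population:   T > 0
print(temperature(0.2, 0.8))  # inverted population: T < 0
```

Equal populations correspond to the $T \to \infty$ limit, which is why a negative temperature is "hotter" than any positive one.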