In 1874, Josef Loschmidt criticized his younger colleague Ludwig Boltzmann's 1866 attempt to derive from basic classical dynamics the increasing entropy required by the second law of thermodynamics.

Increasing entropy is the intimate connection between time and the second law of thermodynamics that Arthur Stanley Eddington later called the Arrow of Time. (The fundamental arrow of time is the expansion of the universe, which makes room for all the other arrows.) Although entropy has never been observed to decrease in an isolated system, all attempts to "prove" that it always increases have failed.

Loschmidt's criticism was based on the simple idea that the laws of classical dynamics are time reversible. Consequently, if we just turned time around, the time evolution of the system should lead to decreasing entropy. Of course we cannot turn time around, but a classical dynamical system would evolve in reverse if all the particles had their velocities exactly reversed. Apart from the practical impossibility of doing this, Loschmidt had shown that systems could exist for which the entropy should decrease instead of increasing. This is called Loschmidt's "Reversibility Objection" (Umkehreinwand) or "Loschmidt's paradox." We call it the problem of microscopic reversibility.

We can visualize the free expansion of a gas that occurs when we rapidly withdraw a piston. Because this is a movie, we can reverse the movie to show what Loschmidt imagined would happen. But Boltzmann thought that even if the particles could all have their velocities reversed, minute errors in the collisions would likely prevent a perfect return to the original state.
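Loschmidt's thought experiment is easy to simulate. The sketch below is a toy model only (non-interacting particles with reflecting walls, illustrative parameters): a gas expands from one edge of a box, every velocity is reversed, and the dynamics is run forward again. The particles retrace their paths back to the initial state, up to a residual error from finite floating-point precision.

```python
import random

def simulate(positions, velocities, steps, dt, box=1.0):
    """Advance free particles in 1D with elastic, reflecting walls."""
    xs, vs = list(positions), list(velocities)
    for _ in range(steps):
        for i in range(len(xs)):
            xs[i] += vs[i] * dt
            if xs[i] < 0.0:               # bounce off the left wall
                xs[i], vs[i] = -xs[i], -vs[i]
            elif xs[i] > box:             # bounce off the right wall
                xs[i], vs[i] = 2 * box - xs[i], -vs[i]
    return xs, vs

random.seed(1)
n = 50
# gas initially crowded into the left tenth of the box (piston just withdrawn)
x0 = [random.uniform(0.0, 0.1) for _ in range(n)]
v0 = [random.uniform(-1.0, 1.0) for _ in range(n)]

x1, v1 = simulate(x0, v0, steps=2000, dt=0.001)                 # free expansion
x2, v2 = simulate(x1, [-v for v in v1], steps=2000, dt=0.001)   # Loschmidt reversal

err = max(abs(a - b) for a, b in zip(x2, x0))
print(f"max return error: {err:.2e}")    # tiny: only floating-point round-off
```

Because these toy particles never collide with one another, the round-off errors do not amplify; in a real colliding gas, each collision would magnify them exponentially, which is exactly Boltzmann's point about minute errors.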

To demonstrate the randomness in each collision, which Boltzmann described as "molecular disorder" (molekular ungeordnet), we need a program that reverses the velocities of the gas particles and adds randomness into the collisions. (This is a work in progress.)

Information physics claims that microscopic reversibility is actually extremely unlikely and that the intrinsic path information in particles needed to reduce entropy is erased by matter-radiation interactions or by internal quantum transitions in the colliding atoms considered as a "quasi"-molecule.

Microscopic time reversibility is one of the foundational assumptions of both classical mechanics and quantum mechanics. It is mistakenly thought to be the basis for the "detailed balancing" of chemical reactions in thermodynamic equilibrium. In fact microscopic reversibility is an assumption that is only statistically valid in the same limits as any "quantum to classical transition." This is the limit when the number of particles is large enough that we can average over quantum effects. Quantum events also approach classical behavior in the limit of large quantum numbers, which Niels Bohr called the "correspondence principle."

It may seem presumptuous for an information philosopher to challenge such a fundamental principle of statistical mechanics and even quantum statistical physics as microscopic reversibility.

What "detailed balancing" means is that in thermodynamic equilibrium, the number of forward reactions is exactly balanced by the number of reverse reactions. And this is correct.
But microscopic reversibility, while still true when considering averages over time, should not be confused with the time reversibility of a specific individual collision between particles.

We will examine the collision of two atoms and show that if their velocities are reversed at some time after the collision, it is highly improbable that they will retrace their paths. This does not mean that, given enough particle collisions, there will not be statistically many collisions that are essentially the same as the "reverse collisions" needed for detailed balancing in chemical reactions, for transport processes with the Boltzmann equation, and for the Onsager reciprocal relations in non-equilibrium conditions.

The Origin of Irreversibility

Our careful quantum analysis shows that time reversibility fails even in the most ideal conditions (the simplest case of two particles in collision), provided internal quantum structure or the quantum-mechanical interaction with radiation is taken into account.

Albert Einstein was the first to see this, initially in his 1909 extension of his work on the photoelectric effect, but especially in his 1916-17 work on the emission and absorption of radiation. This was the work in which Einstein showed that quantum theory implies ontological chance, which he famously disliked ("God does not play dice!"). For Einstein, detailed balancing was not the result of microscopic reversibility; it was his starting assumption.

Einstein's work is sometimes cited as proof of detailed balancing and microscopic reversibility. (Wikipedia, for example.) In fact, Einstein used Boltzmann's assumption of detailed balancing, along with the "Boltzmann principle" that the probability of states with energy E is reduced by the exponential "Boltzmann factor," f(E) ∝ e^(−E/kT), to derive his transition probabilities for emission and absorption of radiation. Einstein also derived Planck's radiation law and Bohr's second "quantum postulate," E_m − E_n = hν. But Einstein distinctly denied any symmetry in the elementary processes of emission and absorption.
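Einstein's balance argument can be checked numerically. The sketch below uses arbitrary illustrative values for the coefficients A and B (they are not data for any real atom): it solves the detailed-balancing condition for the radiation density and confirms that the resulting Planck form makes the absorption rate equal to the Boltzmann-weighted emission rate.

```python
import math

# Hedged sketch of Einstein's 1916-17 argument: assume detailed balancing
# between absorption (lower state n) and emission (upper state m), plus the
# Boltzmann factor for level populations, and solve for the radiation density.
h = 6.62607015e-34   # Planck constant, J s
k = 1.380649e-23     # Boltzmann constant, J/K

def radiation_density(nu, T, A, B):
    """rho(nu) solving the balance N_n * B * rho = N_m * (A + B * rho)."""
    return (A / B) / (math.exp(h * nu / (k * T)) - 1.0)

# arbitrary illustrative values (A and B are not real atomic coefficients)
nu, T, A, B = 5.0e14, 6000.0, 1.0e8, 1.0e20
rho = radiation_density(nu, T, A, B)

ratio = math.exp(-h * nu / (k * T))   # Boltzmann factor: N_m / N_n
up = B * rho                          # absorption rate per lower-state atom
down = ratio * (A + B * rho)          # emission rate, weighted by N_m / N_n
print(up, down)                       # equal: the Planck form balances
```

Algebraically, with x = hν/kT, both rates reduce to A/(e^x − 1), which is why the Planck form is the unique radiation density consistent with detailed balancing and the Boltzmann factor.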

As early as 1909, he noted that the elementary process of emission is not "invertible." There are outgoing spherical waves of radiation, but incoming spherical waves are never seen.

While in the kinetic theory of molecules, for every process in which only a few elementary particles participate (e.g., molecular collisions), the inverse process also exists. But that is not the case for the elementary processes of radiation. According to our prevailing theory, an oscillating ion generates a spherical wave that propagates outwards. The inverse process does not exist as an elementary process. A converging spherical wave is mathematically possible, to be sure; but to approach its realization requires a vast number of emitting entities. The elementary process of emission is not invertible. In this, I believe, our oscillation theory does not hit the mark. Newton's emission theory of light seems to contain more truth with respect to this point than the oscillation theory since, first of all, the energy given to a light particle is not scattered over infinite space, but remains available for an elementary process of absorption...

In a deterministic universe, the path information needed to predict the future motions of all particles would be preserved. If information is a conserved quantity, the future and the past are all contained in the present. The information about future paths is precisely the same information that, if reversed, would predict microscopic reversibility of each and every collision.
The introduction of ontological probabilities and statistics would deny such determinism. If the motions of particles have a chance element, such determinism cannot exist. And this is exactly what Einstein did in his papers on the emission and absorption of radiation by matter. He found that quantum theory implies ontological chance. A "weakness in the theory," he called it.

What we might call Einstein's "radiation asymmetry" was introduced with these words,

When a molecule absorbs or emits the energy ε in the form of radiation during the transition between quantum theoretically possible states, then this elementary process can be viewed either as a completely or partially directed one in space, or also as a symmetrical (nondirected) one. It turns out that we arrive at a theory that is free of contradictions, only if we interpret those elementary processes as completely directed processes.

The elementary process of the emission and absorption of radiation is asymmetric, because the process is directed, as Einstein had explicitly noted first in 1909, and we think he had seen as early as 1905. The apparent isotropy of the emission of radiation is only what Einstein called "pseudo-isotropy" (Pseudoisotropie), a consequence of time averages over large numbers of events. Einstein often substituted time averages for space averages, or averages over the possible states of a system in statistical mechanics.

a quantum theory free from contradictions can only be obtained if the emission process, just as absorption, is assumed to be directional. In that case, for each elementary emission process Z_m → Z_n a momentum of magnitude (ε_m − ε_n)/c is transferred to the molecule. If the latter is isotropic, we shall have to assume that all directions of emission are equally probable.

If the molecule is not isotropic, we arrive at the same statement if the orientation changes with time in accordance with the laws of chance. Moreover, such an assumption will also have to be made about the statistical laws for absorption, (B) and (B'). Otherwise the constants B_mn and B_nm would have to depend on the direction, and this can be avoided by making the assumption of isotropy or pseudo-isotropy (using time averages).

Now the principle of microscopic reversibility is a fundamental assumption of statistical mechanics. It underlies the principle of "detailed balancing," which is critical to the understanding of chemical reactions. In thermodynamic equilibrium, the number of forward reactions is exactly balanced by the number of reverse reactions.
But microscopic reversibility, while true in the sense of averages over time, should not be confused with the reversibility of individual collisions between molecules.

The equations of classical dynamics are reversible in time. And the deterministic Schrödinger equation of motion in quantum mechanics is also time reversible. But the interactions of photons and material particles like electrons and atoms are distinctly not reversible!

An explanation of microscopic irreversibility in atomic and molecular collisions would provide the needed justification for Ludwig Boltzmann's assumption of "molecular disorder" and strengthen his H-Theorem. This is what we hope to do.

In quantum mechanics, microscopic time reversibility is assumed true by most scientists because the deterministic Schrödinger equation itself is time reversible. But the Schrödinger equation only describes the deterministic time evolution of the probabilities of various quantum events, which are themselves not deterministic and not reversible.

When an actual event occurs, the probabilities of multiple possible events collapse to the actual occurrence of one event. In quantum mechanics, this is the irreversible collapse of the wave function that John von Neumann called "Process 1."

Treating two atoms as a temporary molecule means we must use molecular, rather than atomic, wave functions. The quantum description of the molecule now transforms the six independent degrees of freedom into three for the molecule's center of mass and three more that describe vibrational and rotational quantum states.

The possibility of quantum transitions between closely spaced vibrational and rotational energy levels in the "quasi-molecule" introduces indeterminacy in the future paths of the separate atoms. The classical path information needed to ensure the deterministic dynamical behavior has been partially erased. The memory of the past needed to predict the "determined" future has been lost.
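The closeness of those levels is easy to illustrate. The sketch below uses typical diatomic constants (assumed order-of-magnitude values, not data for any particular colliding pair) to count how many rotational levels lie within one thermal energy of the ground state at room temperature.

```python
# A hedged, order-of-magnitude sketch: ro-vibrational levels of a diatomic
# "quasi-molecule", E(v, J) = hbar*omega*(v + 1/2) + B*J*(J + 1).
# The constants below are typical diatomic values, not data for any
# particular pair of colliding atoms.
HBAR_OMEGA = 0.2      # vibrational quantum, eV (assumed)
B_ROT = 2.5e-4        # rotational constant, eV (assumed)
KT_300 = 0.0259       # thermal energy kT at 300 K, eV

def energy(v, j):
    """Ro-vibrational energy in eV above the potential minimum."""
    return HBAR_OMEGA * (v + 0.5) + B_ROT * j * (j + 1)

# rotational levels within one thermal energy of the ground state
accessible = [j for j in range(200)
              if energy(0, j) - energy(0, 0) < KT_300]
spacing_01 = energy(0, 1) - energy(0, 0)     # lowest rotational gap

print(f"lowest rotational gap: {spacing_01 * 1000:.2f} meV")
print(f"rotational levels within kT at 300 K: {len(accessible)}")
```

With these assumed constants the lowest rotational gap is well under a milli-electron-volt, so thermal collision energies can populate many of these states, and which transition actually occurs is a matter of quantum chance.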

Even setting aside the practical impossibility of a perfect classical time reversal, in which we simply turn the two particles around, quantum physics would require two measurements to locate the two particles, followed by two state preparations to send them in the opposite directions. These could be made only within the precision of Heisenberg's uncertainty principle, and so could not perfectly produce microscopic reversibility, which is thus only a classical idealization, like the idea of determinism.

Heisenberg indeterminacy puts calculable limits on the accuracy with which perfect reversed paths can be achieved.
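One can put rough numbers on this. The sketch below assumes an argon-like atom at room temperature, re-prepared with a position uncertainty of about one atomic diameter, and estimates how far the reversed atom misses its original path after one mean free path. All values are illustrative order-of-magnitude assumptions.

```python
import math

# Order-of-magnitude sketch of the Heisenberg limit on a "perfect" reversal.
HBAR = 1.054571817e-34   # reduced Planck constant, J s
K_B = 1.380649e-23       # Boltzmann constant, J/K

m = 6.6e-26       # argon-like atomic mass, kg (assumed)
T = 300.0         # temperature, K
dx = 1.0e-10      # position uncertainty of the re-prepared atom, m (assumed)
mfp = 1.0e-7      # rough mean free path at ambient pressure, m (assumed)

v = math.sqrt(3 * K_B * T / m)       # thermal speed
p = m * v                            # momentum
dp = HBAR / (2 * dx)                 # minimum momentum uncertainty
dtheta = dp / p                      # angular spread of the reversed path
miss = dtheta * mfp                  # transverse miss after one free flight

print(f"angular spread: {dtheta:.1e} rad")
print(f"miss distance after one mean free path: {miss:.1e} m")
```

Under these assumptions the reversed atom misses its original path by tens of atomic diameters after a single mean free path, so a re-prepared "reverse collision" cannot reproduce the original one.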

Let us assume this impossible task can be completed, and it sends the two particles into the reverse collision paths. But on the return path, there is only a finite probability that a "sum over histories" calculation will produce the same (or exactly reversed) quantum transitions between vibrational and rotational states that occurred in the first collision.

Thus a quantum description of a two-particle collision establishes the microscopic irreversibility that Boltzmann sometimes described as his assumption of "molecular disorder." In his second (1877) derivation of the H-theorem, Boltzmann used a statistical approach and the molecular disorder assumption to get away from the time-reversibility assumptions of classical dynamics.

We must develop a deep insight into Einstein's asymmetry between light and matter, one that was appreciated as early as the 1880s by Max Planck's great mentor Gustav Kirchhoff, but was not understood in quantum mechanical terms until Einstein's understanding of nonlocality and the relation between waves and particles in 1909.

It is still ignored in quantum statistical mechanics by those who mistakenly think that the time reversible Schrödinger equation means microscopic interactions are reversible.

Maxwell and Boltzmann had shown that collisions between material particles, analyzed statistically, cause the distribution of positions and velocities to approach their equilibrium Maxwell-Boltzmann distribution.
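This statistical approach can be illustrated with a deliberately simple toy model (an assumption of this sketch, not Boltzmann's actual collision integral): particles exchange energy in random pairwise "collisions" that conserve each pair's total energy. Whatever the initial distribution, the energies relax to the exponential Boltzmann form.

```python
import random

# Toy model of relaxation to the Boltzmann distribution: random pairwise
# energy exchanges that conserve each pair's total energy. This stands in
# for real molecular collisions; it is not the Boltzmann equation itself.
random.seed(42)
N = 20000
energies = [1.0] * N          # extreme non-equilibrium: all energies equal

for _ in range(20 * N):       # many random binary "collisions"
    i, j = random.randrange(N), random.randrange(N)
    if i == j:
        continue
    total = energies[i] + energies[j]
    split = random.random()   # random repartition; the pair total is conserved
    energies[i], energies[j] = split * total, (1 - split) * total

mean_e = sum(energies) / N                             # conserved: stays 1.0
frac_below_mean = sum(e < mean_e for e in energies) / N
print(f"mean energy: {mean_e:.3f}")
print(f"fraction below mean: {frac_below_mean:.3f}")   # exponential: about 1 - 1/e
```

For an exponential (Boltzmann) energy distribution the fraction of particles below the mean is 1 − 1/e ≈ 0.632, and the simulated gas reaches that value after a few dozen collisions per particle, while the total energy stays fixed.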

A bit later, Kirchhoff and Planck knew that an extreme non-equilibrium distribution of radiation, for example a monochromatic radiation field, will remain out of equilibrium indefinitely. But if that radiation interacts with even the tiniest amount of matter (a speck of carbon black was their example), all the wavelengths of the blackbody spectrum, in accordance with Kirchhoff's law, soon appear.

So we can say that the approach to equilibrium of a radiation field has the same origin of irreversibility as that of matter.

Radiation without matter cannot equilibrate. Photons do not interact, except at the extremely high energies where they can convert to matter and anti-matter.

Our new insight is that matter without radiation also cannot equilibrate in a way that escapes the reversibility and recurrence objections, despite what is taught in every textbook and review article on statistical mechanics to this day.

It is thus the irreversible interaction of the two, light and matter, photons and electrons, that lies behind the increase of entropy in the universe. The second law of thermodynamics could not explain the increase of entropy without the microscopic irreversibility that we have shown to be the case.

Microscopic irreversibility not only explains the second law, it validates Boltzmann's brilliant assumption of "molecular disorder" to justify his statistical arguments.

Zermelo's paradox, his "Recurrence Objection" (Wiederkehreinwand), was a later criticism of Ludwig Boltzmann's attempt to derive the increasing entropy required by the second law of thermodynamics. It also involves time. Assuming infinite available time, a finite universe with fixed matter, energy, and information will at some point return arbitrarily closely to any given earlier state.
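The recurrence argument itself can be made concrete: any invertible deterministic dynamics on a finite state space is a permutation of its states, so every state lies on a cycle and must eventually recur. A minimal sketch, with a random permutation standing in for the dynamics:

```python
import random

# Sketch of Zermelo's (Poincare's) recurrence argument: an invertible
# deterministic dynamics on a FINITE state space is a permutation, so
# iterating it from any state must eventually return to that state.
random.seed(7)
n_states = 1000
step = list(range(n_states))
random.shuffle(step)            # a random invertible dynamics (permutation)

state0 = 0
state, t = step[state0], 1
while state != state0:          # iterate until the initial state recurs
    state = step[state]
    t += 1

print(f"recurrence time for state 0: {t} steps")
```

The recurrence time can be astronomically long for realistic systems, but for a finite, fixed state space it is always finite, which is the force of Zermelo's objection.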

We now know that even a finite part of the universe cannot return to exactly the same state, because the surrounding universe will have aged and be in a different information state. This is the information philosophy solution to the problem of eternal recurrence, as seen by Arthur Stanley Eddington and H. Dieter Zeh.