Information physics also probes deeply into the second law of thermodynamics to establish the irreversible increase of entropy on a quantum mechanical basis, something that could not be shown by classical statistical mechanics or even quantum statistical physics.

Although "Information physics" is a new "interpretation" of quantum mechanics, it is not an attempt to alter the standard quantum mechanics, for example, extending it to theories such as "hidden variables" to restore determinism or adding terms to the Schrödinger equation to force a collapse. Information physics investigates the quantum mechanical and thermodynamic implications of cosmic information structures, especially those that were created before the existence of human observers. It shows that no "conscious observers" are required as with the Copenhagen Interpretation or the work of John von Neumann or Eugene Wigner.

Information physics proposes to show that everything created since the origin of the universe over thirteen billion years ago has involved just two fundamental physical processes that combine to form the core of all creative processes. These two steps occur whenever even a single bit of new information is created and comes into the universe.

Step 1: A quantum process - the "collapse" of the wave function.

The formation of even a single bit of information that did not previously exist requires the equivalent of a "measurement." This "measurement" does not involve a "measurer," an experimenter, or an observer. It happens when the probabilistic wave function that describes the possible outcomes of a measurement "collapses" and a particle of matter or energy is actually found somewhere.

Step 2: A thermodynamic process - local reduction, but cosmic increase, in the entropy.

The second law of thermodynamics requires that the overall cosmic entropy always increases. When new information is created locally in step 1, some energy (with positive entropy greater than the negative entropy of the new information) must be transferred away from the location of the new bits; otherwise the new information will be destroyed when local thermodynamic equilibrium is restored. This can only happen in a locality through which flows of low-entropy matter and energy are passing, keeping it far from equilibrium.

This two-step core creative process underlies the formation of microscopic objects like atoms and molecules, as well as macroscopic objects like galaxies, stars, and planets.

With the emergence of teleonomic (purposive) information in self-replicating systems, the same core process underlies all biological creation. But now some random changes in information structures are rejected by natural selection, while others reproduce successfully.

Finally, with the emergence of self-aware organisms and the creation of extra-biological information stored in the environment, the same information-generating core process underlies communication, consciousness, free will, and creativity.

The two physical processes in the creative process, quantum physics and thermodynamics, are somewhat daunting subjects for philosophers, and even for many scientists.

Quantum Mechanics

In classical mechanics, the material universe is thought to be made up of tiny particles whose motions are completely determined by forces that act between the particles, forces such as gravitation, electrical attractions and repulsions, etc.

The equations that describe those motions, Newton's laws of motion, were for many centuries thought to be perfect and sufficient to predict the future of any mechanical system. They provided support for many philosophical ideas about determinism.

In classical electrodynamics, electromagnetic radiation (light, radio) was known to have wave properties such as interference. When the crest of one wave meets the trough of another, the two waves cancel one another.

In quantum mechanics, radiation is found to have some particle-like behavior. Energy comes in discrete physically localized packages. Max Planck in 1900 made the famous assumption that the energy was proportional to the frequency of radiation ν.

E = hν
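Planck's relation is easy to evaluate numerically. A minimal sketch, assuming the illustrative case of green light at 532 nm (a value chosen for the example):

```python
# Photon energy E = h * nu, evaluated for an assumed wavelength of 532 nm.
h = 6.626e-34    # Planck's constant, J*s
c = 2.998e8      # speed of light, m/s

wavelength = 532e-9      # green light, m (illustrative choice)
nu = c / wavelength      # frequency, Hz
E = h * nu               # energy of a single photon, J

print(f"nu = {nu:.3e} Hz, E = {E:.3e} J")   # a few times 10^-19 J
```

A single visible-light photon thus carries only a few tenths of an attojoule, which is one reason the granularity of light escaped notice for so long.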

For Planck, this assumption was just a heuristic mathematical device that allowed him to apply Ludwig Boltzmann's work on the statistical mechanics and kinetic theory of gases. Boltzmann had shown in the 1870's that the increase in entropy (the second law) could be explained if gases were made up of enormous numbers of particles.

Planck applied Boltzmann's statistics of many particles to radiation and derived the distribution of radiation at different frequencies (or wavelengths) just as James Clerk Maxwell and Boltzmann had derived the distribution of velocities (or energies) of the gas particles.

Note the mathematical similarity of Planck's radiation distribution law (photons) and the Maxwell-Boltzmann velocity distribution (molecules). Both curves have a power law increase on one side to a maximum and an exponential decrease on the other side of the maximum. The molecular velocity curves cross one another because the total number of molecules is the same. With increasing temperature T, the number of photons increases at all wavelengths.
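The similarity of the two curves can be seen numerically. A sketch, assuming a solar-like 5800 K for the radiation and nitrogen molecules at an assumed 300 K for the gas (both values chosen purely for illustration):

```python
import math

h, c, k = 6.626e-34, 2.998e8, 1.381e-23   # SI units

def planck(lam, T):
    """Planck spectral radiance B(lam, T): power-law rise, exponential tail."""
    return (2*h*c**2 / lam**5) / (math.exp(h*c / (lam*k*T)) - 1)

def maxwell_boltzmann(v, T, m=4.65e-26):  # m: mass of an N2 molecule, kg
    """Maxwell-Boltzmann speed distribution: v^2 rise, exponential tail."""
    return 4*math.pi * (m/(2*math.pi*k*T))**1.5 * v**2 * math.exp(-m*v**2/(2*k*T))

# Find each curve's maximum by brute force on a grid.
lams = [i * 1e-8 for i in range(1, 500)]                  # 10 nm .. 5 um
peak_lam = max(lams, key=lambda l: planck(l, 5800))
vs = [float(i) for i in range(1, 3000)]                   # 1 .. 3000 m/s
peak_v = max(vs, key=lambda v: maxwell_boltzmann(v, 300))

print(f"Planck peak (5800 K): {peak_lam*1e9:.0f} nm")       # Wien's law: ~500 nm
print(f"Most probable N2 speed (300 K): {peak_v:.0f} m/s")  # sqrt(2kT/m): ~422 m/s
```

Both functions rise as a power of the variable and are cut off by an exponential factor, which is the mathematical similarity noted above.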

But Planck did not actually believe that radiation came in discrete particles, at least until a dozen years later. In the meantime, Albert Einstein's 1905 paper on the photoelectric effect hypothesized that light comes in discrete particles, subsequently called "photons," analogous to electrons.

Planck was not happy about the idea of light particles, because his use of Boltzmann's statistics implied that chance was real. Boltzmann himself had qualms about the reality of chance. Although Einstein also did not like the idea of chancy statistics, he did believe that energy came in packages of discrete "quanta." It was Einstein, not Planck, who quantized mechanics and electrodynamics. Nevertheless, it was for the introduction of the quantum of action h that Planck was awarded the Nobel prize in 1918.

Louis de Broglie argued that if photons, with their known wavelike properties, could be described as particles, then electrons as particles might show wavelike properties, with a wavelength λ inversely proportional to their momentum p = mev (electron mass times velocity).

λ = h/p
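The relation λ = h/p gives electrons at laboratory speeds a wavelength comparable to atomic spacings, which is why crystal lattices could diffract them. A sketch, with the electron speed (~5.9 × 10^6 m/s, roughly that of a 100-volt electron) chosen for illustration:

```python
# de Broglie wavelength lambda = h / p for an electron at an assumed speed.
h = 6.626e-34        # Planck's constant, J*s
m_e = 9.109e-31      # electron mass, kg

v = 5.9e6            # assumed electron speed, m/s (~100 eV kinetic energy)
p = m_e * v          # momentum, kg*m/s
lam = h / p          # de Broglie wavelength, m

print(f"lambda = {lam*1e9:.3f} nm")   # about 0.12 nm, an atomic length scale
```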

Experiments confirmed de Broglie's assumption and led Erwin Schrödinger to derive a "wave equation" to describe the motion of de Broglie's waves. Schrödinger's equation replaces the classical Newton equations of motion.

Note that Schrödinger's equation describes the motion of only the wave aspect, not the particle aspect, and as such it implies interference. Note also that it is as fully deterministic an equation of motion as Newton's equations.

Schrödinger attempted to interpret his "wave function" for the electron as a density of electrical charge, but such a density would be positive everywhere and unable to interfere with itself.

Max Born shocked the world of physics by suggesting that the absolute value of the wave function ψ squared (|ψ|²) could be interpreted as the probability of finding the electron in its various position and momentum states - if a measurement is made. This allows the probability amplitude ψ to interfere with itself, producing highly non-intuitive phenomena such as the two-slit experiment.
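Born's rule can be illustrated with a minimal two-slit sketch: add the complex amplitudes from the two paths, then take the absolute square. The wavelength and geometry below are arbitrary illustrative values:

```python
import cmath, math

# Two-slit sketch: add the complex amplitudes from each slit, then square
# (Born rule). Wavelength and geometry are illustrative values only.
lam = 500e-9    # wavelength, m
d = 2e-6        # slit separation, m
L = 1.0         # slit-to-screen distance, m
k = 2 * math.pi / lam

def intensity(x):
    """|psi1 + psi2|^2 at screen position x."""
    r1 = math.hypot(L, x - d/2)      # path length from slit 1
    r2 = math.hypot(L, x + d/2)      # path length from slit 2
    psi = cmath.exp(1j*k*r1) + cmath.exp(1j*k*r2)
    return abs(psi)**2

print(intensity(0.0))            # central maximum: amplitudes add, |1+1|^2 = 4
print(intensity(lam*L/(2*d)))    # first minimum: paths differ by half a wave, ~0
```

Note that it is the amplitudes, not the probabilities, that are added; squaring after the addition is what produces the interference fringes.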

Despite the probability amplitude going through both slits and interfering with itself, experimenters never find parts of electrons. They are always found whole.

In 1932 John von Neumann explained that two fundamentally different processes are going on in quantum mechanics.

Process 1. A non-causal process, in which the measured electron winds up randomly in one of the possible physical states (eigenstates) of the measuring apparatus plus electron.

The probability for each eigenstate is given by the absolute square of the coefficients cn of the expansion of the original system state (wave function ψ) in an infinite set of wave functions φ that represent the eigenfunctions of the measuring apparatus plus electron.

cn = < φn | ψ >

This is as close as we get to a description of the motion of the particle aspect of a quantum system. According to von Neumann, the particle simply shows up somewhere as a result of a measurement. Information physics says it shows up whenever a new stable information structure is created.
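A toy version of this expansion, in a two-dimensional state space with an assumed orthonormal eigenbasis, shows how the coefficients cn yield Born-rule probabilities and how repeated "measurements" distribute the random outcomes:

```python
import math, random

# Toy two-state expansion: psi written in an assumed orthonormal eigenbasis
# {phi_n}; the Born rule gives outcome probabilities |c_n|^2.
phi = [(1.0, 0.0), (0.0, 1.0)]             # eigenfunctions of apparatus + electron
psi = (math.sqrt(0.8), math.sqrt(0.2))     # normalized system state

c = [sum(b_i * p_i for b_i, p_i in zip(b, psi)) for b in phi]  # c_n = <phi_n|psi>
probs = [cn**2 for cn in c]                # Born probabilities
print(probs)                               # [0.8, 0.2] up to rounding

# Repeated "measurements": each trial collapses randomly to one eigenstate.
random.seed(0)
counts = [0, 0]
for _ in range(10_000):
    counts[random.choices([0, 1], weights=probs)[0]] += 1
print(counts)                              # roughly 8000 / 2000
```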

Process 2. A causal process, in which the electron wave function ψ evolves deterministically according to Schrödinger's equation of motion for the wavelike aspect. This evolution describes the motion of the probability amplitude wave ψ between measurements.

(ih/2π) ∂ψ/∂t = Hψ

Von Neumann claimed there is another major difference between these two processes. Process 1 is thermodynamically irreversible. Process 2 is reversible. This confirms the fundamental connection between quantum mechanics and thermodynamics that is explainable by information physics.
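The difference is easy to exhibit in a toy two-level model (the evolution operator and angle are arbitrary illustrations, with ħ set to 1): unitary Schrödinger evolution can be run backwards exactly, while a projection ("collapse") cannot:

```python
import math

# Toy 2-level system: u(theta) plays the role of a unitary "Schrodinger"
# evolution operator (illustrative choice; hbar = 1).
def u(theta):
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s], [s, c]]

def apply(m, v):
    return [m[0][0]*v[0] + m[0][1]*v[1],
            m[1][0]*v[0] + m[1][1]*v[1]]

psi0 = [1.0, 0.0]
psi1 = apply(u(0.7), psi0)      # Process 2: deterministic evolution
psi2 = apply(u(-0.7), psi1)     # time-reversed evolution recovers psi0
print(psi2)                     # [1.0, 0.0] up to rounding: reversible

# Process 1: collapsing onto the likelier eigenstate discards the phase
# information, so the reversed evolution no longer recovers psi0.
collapsed = [1.0, 0.0] if psi1[0]**2 > 0.5 else [0.0, 1.0]
back = apply(u(-0.7), collapsed)
print(back)                     # not [1.0, 0.0]: irreversible
```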

Physicists calculate the deterministic evolution of the Schrödinger wave function in time as systems interact or collide. At some point, they make the ad hoc assumption that the wave function "collapses." This produces a set of probabilities of finding the resulting combined system in its various eigenstates.

Although the collapse appears to be a random and ad hoc addition to the deterministic formalism of the Schrödinger equation, it is very important to note that the experimental accuracy of quantum mechanical predictions is unparalleled in physics, providing the ultimate justification for this theoretical kluge.

Moreover, without wave functions collapsing, no new information can come into the universe. Nothing unpredictable would ever emerge. Determinism is "information-preserving." All the information we have today would have to have already existed in the original fireball at the origin of the universe.

The "Problem" of Measurement

Quantum measurement (the irreducibly random process of wave function collapse) is not a part of the mathematical formalism of wave function time evolution (the Schrödinger equation of motion is a perfectly deterministic process). The hypothesized collapse is an ad hoc heuristic description and method of calculation that predicts the probabilities of what will happen when an observer makes a measurement.

In many standard discussions of quantum mechanics, and most popular treatments, it is said that we need the consciousness of a physicist to collapse the wave function. Eugene Wigner and John Wheeler sometimes describe the observer as making up the "mind of the universe." John Bell sardonically asked whether the observer needs a Ph.D.

Von Neumann contributed a lot to this confusion by claiming that the location of a "cut" (Schnitt) between the microscopic system and macroscopic measurement system could be anywhere - including inside an observer's brain. Information physics will locate the cut (outside the brain).

Measurement requires the interaction of something macroscopic, assumed to be large and adequately determined. In physics experiments, this is the observing apparatus. But in general, measurement does not require a conscious observer. It does require information creation or there will be nothing to observe.

Some scientists (Werner Heisenberg, John von Neumann, Eugene Wigner and John Bell, for example) have argued that in the absence of a conscious observer, or some "cut" between the microscopic and macroscopic world, the evolution of the quantum system ψ and the macroscopic measuring apparatus A would be described deterministically by Schrödinger's equation of motion for the wave function
| ψ + A > with the Hamiltonian H energy operator,

(ih/2π) ∂/∂t | ψ + A > = H | ψ + A >.

Our quantum mechanical analysis of the measurement apparatus in the above case allows us to locate the "cut" or "Schnitt" between the microscopic and macroscopic world at those components of the "adequately classical and deterministic" apparatus that put the apparatus in an irreversible stable state providing new information to the observer.

John Bell drew a diagram to show the various possible locations for what he called the "shifty split." Information physics shows us that the correct location for the boundary is the first of Bell's possibilities.

Thermodynamics

The second law of thermodynamics says that the entropy (or disorder) of a closed physical system increases until it reaches a maximum, the state of thermodynamic equilibrium. It requires that the entropy of the universe is now and has always been increasing. (The first law is that energy is conserved.)

This established fact of increasing entropy has led many scientists and philosophers to assume that the universe we have is running down. They think that means the universe began in a very high state of information, since the second law requires that any organization or order is susceptible to decay. The information that remains today, in their view, has always been here. This fits nicely with the idea of a deterministic universe. There is nothing new under the sun. Physical determinism is "information-preserving."

But the universe is not a closed system. It is in a dynamic state of expansion that is moving away from thermodynamic equilibrium faster than entropic processes can keep up. The maximum possible entropy is increasing much faster than the actual increase in entropy. The difference between the maximum possible entropy and the actual entropy is potential information.

Creation of information structures means that in parts of the universe the local entropy is actually going down. Reduction of entropy locally is always accompanied by radiation of entropy away from the local structures to distant parts of the universe, into the night sky for example. Since the total entropy in the universe always increases, the amount of entropy radiated away always exceeds (often by many times) the local reduction in entropy, which mathematically equals the increase in information.

"Ergodic" Processes

We will describe processes that create information structures, reducing the entropy locally, as "ergodic."

This is a new use for a term from statistical mechanics that describes a hypothetical property of classical mechanical gases. See the Ergodic Hypothesis.

Ergodic processes (in our new sense of the word) are those that appear to resist the second law of thermodynamics because of a local increase in information or "negative entropy" (Erwin Schrödinger's term). But any local decrease in entropy is more than compensated for by increases elsewhere, satisfying the second law. Normal entropy-increasing processes we will call "entropic".

Encoding new information requires the equivalent of a quantum measurement - each new bit of information produces a local decrease in entropy but requires that at least one bit's worth (generally much, much more) of entropy be radiated or conducted away.

Without violating the inviolable second law of thermodynamics overall, ergodic processes reduce the entropy locally, producing those pockets of cosmos and negative entropy (order and information-rich structures) that are the principal objects in the universe and in life on earth.

Entropy and Classical Mechanics

Ludwig Boltzmann attempted in the 1870's to prove Rudolf Clausius' second law of thermodynamics, namely that the entropy of a closed system always increases to a maximum and then remains in thermal equilibrium. Clausius predicted that the universe would end with a "heat death" because of the second law.

Boltzmann formulated a mathematical quantity H for a system of n ideal gas particles, showing that it had the property dH/dt ≤ 0, that H always decreased with time. He identified his H as the opposite of Rudolf Clausius' entropy S.

In 1850 Clausius had formulated the second law of thermodynamics. In 1857 he showed that for a typical gas like air at standard temperatures and pressures, the gas particles spend most of their time traveling in straight lines between collisions with the wall of a containing vessel or with other gas particles. He defined the "mean free path" of a particle between collisions. Clausius and essentially all physicists since have assumed that gas particles can be treated as structureless "billiard balls" undergoing "elastic" collisions. Elastic means no motion energy is lost to internal friction.

Shortly after Clausius first defined the entropy mathematically and named it in 1865, James Clerk Maxwell determined the distribution of velocities of gas particles (Clausius for simplicity had assumed that all particles moved at the average speed, given by 1/2 mv² = 3/2 kT).
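Clausius' simplification corresponds to the root-mean-square speed: solving 1/2 mv² = 3/2 kT for v gives v = √(3kT/m). A quick check for nitrogen at an assumed room temperature of 300 K:

```python
import math

# Solving 1/2 m v^2 = 3/2 k T for v gives the rms speed sqrt(3kT/m).
k = 1.381e-23     # Boltzmann's constant, J/K
m_n2 = 4.65e-26   # mass of one N2 molecule, kg (illustrative choice of gas)
T = 300.0         # assumed room temperature, K

v_rms = math.sqrt(3 * k * T / m_n2)
print(f"v_rms = {v_rms:.0f} m/s")   # roughly 500 m/s for air
```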

Maxwell's derivation was very simple. He assumed the velocities in the x, y, and z directions were independent.

Boltzmann improved on Maxwell's statistical derivation by equating the number of particles entering a given range of velocities and positions to the number leaving the same volume in 6n-dimensional phase space. This is a necessary condition for the gas to be in equilibrium. Boltzmann then used Newtonian physics to get the same result as Maxwell, which is thus called the Maxwell-Boltzmann distribution.

Boltzmann's first derivation of his H-theorem (1872) was based on the same classical mechanical analysis he had used to derive Maxwell's distribution function. It was an analytical mathematical consequence of Newton's laws of motion applied to the particles of a gas. But it ran into immediate objections. The first objection is the hypothetical and counterfactual idea of time reversibility. If time were reversed, the entropy would simply decrease. Since the fundamental Newtonian equations of motion are time reversible, this appears to be a paradox. How could the irreversible increase of the macroscopic entropy result from microscopic physical laws that are time reversible?

Lord Kelvin (William Thomson) was the first to point out the time asymmetry in macroscopic processes, but the criticism of Boltzmann's H-theorem is associated with his lifelong friend Joseph Loschmidt. Boltzmann immediately agreed with Loschmidt that the possibility of decreasing entropy could not be ruled out if the classical motion paths were reversed.

Boltzmann then reformulated his H-theorem (1877). He analyzed a gas into "microstates" of the individual gas particle positions and velocities. For any "macrostate" consistent with certain macroscopic variables like volume, pressure, and temperature, there could be many microstates corresponding to different locations and speeds for the individual particles.

Any individual microstate of the system was intrinsically as probable as any other specific microstate, he said. But the number of microstates consistent with the disorderly or uniform distribution in the equilibrium case of maximum entropy simply overwhelms the number of microstates consistent with an orderly initial distribution.
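The overwhelming ratio of microstate counts is easy to check with binomial coefficients. A sketch for 100 distinguishable particles distributed between the two halves of a box (the particle number is an arbitrary illustration):

```python
from math import comb

# N distinguishable particles in the two halves of a box: the macrostate
# "n on the left" contains comb(N, n) microstates. N = 100 is illustrative.
N = 100
ordered = comb(N, N)        # all particles on the left: a single microstate
uniform = comb(N, N // 2)   # even 50/50 split: the most probable macrostate

print(ordered)              # 1
print(f"{uniform:.3e}")     # ~1.009e+29: overwhelmingly many more microstates
```

Even for only 100 particles the uniform macrostate outnumbers the ordered one by a factor of about 10^29; for a mole of gas the disproportion is beyond astronomical.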

About twenty years later, Boltzmann's revised argument that entropy statistically increased ran into another criticism, this time not so counterfactual. This is the recurrence objection. Given enough time, any system could return to its starting state, which implies that the entropy must at some point decrease. These reversibility and recurrence objections are still prominent in the physics literature.

The recurrence idea has a long intellectual history. Ancient Babylonian astronomers thought the known planets would, given enough time, return to any given position and thus begin again what they called a "great cycle," estimated by some at 36,000 years. Their belief in an astrological determinism suggested that all events in the world would also recur. Friedrich Nietzsche made this idea famous in the nineteenth century, at the same time as Boltzmann's hypothesis was being debated, as the "eternal return" in his Also Sprach Zarathustra.

The recurrence objection was first noted in the early 1890's by French mathematician and physicist Henri Poincaré. In his work on the three-body problem he proved that the configuration of the bodies returns arbitrarily close to its initial conditions after calculable times. Even for a handful of planets, the recurrence time is longer than the age of the universe, if the positions are specified precisely enough. Poincaré then proposed that the presumed "heat death" of the universe predicted by the second law of thermodynamics could be avoided by "a little patience." Another mathematician, Ernst Zermelo, a young colleague of Max Planck in Berlin, is more famous for this recurrence paradox.

Boltzmann accepted the recurrence criticism. He calculated the extremely small probability that entropy would decrease noticeably, even for a gas with a very small number of particles (1000). He showed the time associated with such an event was of order 10^10^10 years. But the objections in principle to his work continued, especially from those who thought the atomic hypothesis was wrong.

It is very important to understand that both Maxwell's original derivation of the velocities distribution and Boltzmann's H-theorem showing an entropy increase are only statistical or probabilistic arguments. Boltzmann's work was done twenty years before atoms were established as real and fifty years before the theory of quantum mechanics established that at the microscopic level all interactions of matter and energy are fundamentally and irreducibly statistical and probabilistic.

Entropy and Quantum Mechanics

A quantum mechanical analysis of the microscopic collisions of gas particles (these are usually molecules - or atoms in a noble gas) can provide revised analyses for the two problems of reversibility and recurrence. Note this requires more than quantum statistical mechanics. It needs the quantum kinetic theory of collisions in gases.

Boltzmann assumed that collisions would result in random distributions of velocities and positions so that all the possible configurations would be realized in proportion to their number. He called this "molecular chaos." But if the path of a system of n particles in 6n-dimensional phase space should be closed and repeat itself after a short and finite time during which the system occupies only a small fraction of the possible states, Boltzmann's assumptions would be wrong.

What is needed is for collisions to completely randomize the directions of particles after collisions, and this is just what the quantum theory of collisions can provide. Randomization of directions is the norm in some quantum phenomena, for example the absorption and re-emission of photons by atoms as well as Raman scattering of photons.

In the deterministic evolution of the Schrödinger equation, just as in the classical path evolution of the Hamiltonian equations of motion, the time can be reversed and all the coherent information in the wave function will describe a particle that goes back exactly the way it came before the collision.

But if when two particles collide the internal structure of one or both of the particles is changed, and particularly if the two particles form a temporary larger molecule (even a quasi-molecule in an unbound state), then the separating atoms or molecules lose the coherent wave functions that would be needed to allow time reversal back along the original path.

During the collision, one particle can transfer energy from one of its internal quantum states to the other particle. At room temperature, this will typically be a transition between rotational states that are populated. Another possibility is an exchange of energy with the background thermal radiation, which at room temperatures peaks at the frequencies of molecular rotational energy level differences.

Such a quantum event can be analyzed by assuming a short-lived quasi-molecule is formed (the energy levels for such an unbound system form a continuum), so that almost any photon can cause a change of rotational state of the quasi-molecule.

A short time later, the quasi-molecule dissociates into the two original particles but in different energy states. We can describe the overall process as a quasi-measurement, because there is temporary information present about the new structure. This information is lost as the particles separate in random directions (consistent with conservation of energy, momentum, and angular momentum).

The decoherence associated with this quasi-measurement means that if the post-collision wave functions were to be time reversed, the reverse collision would be very unlikely to send the particles back along their incoming trajectories.

Boltzmann's assumption of random occupancy of possible configurations is no longer necessary. Randomness in the form of "molecular chaos" is assured by quantum mechanics.

The result is a statistical picture that shows that entropy would normally increase even if time could be reversed.

This does not rule out the kind of departures from equilibrium that occur in small groups of particles, as in Brownian motion, whose fluctuations Boltzmann anticipated long before Einstein's explanation of Brown's observations. These fluctuations can be described as forming short-lived information structures, brief and localized regions of negative entropy, that get destroyed in subsequent interactions.

Nor does it change the remote possibility of a recurrence of any particular initial microstate of the system. But it does prove that Poincaré was wrong about such a recurrence being periodic. Periodicity depends on the dynamical paths of particles being classical, deterministic, and thus time reversible. Since quantum mechanical paths are fundamentally indeterministic, recurrences are simply statistically improbable departures from equilibrium, like the fluctuations that cause Brownian motion.

Entropy is Lost Information

Entropy increase can be easily understood as the loss of information as a system moves from an initially ordered state to a final disordered state. Although the physical dimensions of thermodynamic entropy (joules/K) are not the same as those of (dimensionless) mathematical information, apart from units they share the same famous formula.

S = − k ∑ pi ln pi
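The dimensionless (Shannon) form of the formula can be checked directly. A minimal sketch in bits, anticipating the 64-square example below:

```python
import math

def shannon_bits(probs):
    """Missing information H = -sum p_i log2(p_i), in bits.
    The thermodynamic form multiplies by k and uses natural logarithms."""
    return 0.0 - sum(p * math.log2(p) for p in probs if p > 0)

print(shannon_bits([1/64] * 64))   # uniform over 64 squares: 6.0 bits missing
print(shannon_bits([1.0]))         # fully localized: 0.0 bits missing
```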

To see this very simply, let's consider the well-known example of a bottle of perfume in the corner of a room. We can represent the room as a grid of 64 squares, filled with air molecules moving randomly at room temperature. In the lower left corner sits the bottle, whose perfume molecules will be released when we open it.

What is the quantity of information we have about the perfume molecules? We know their location in the lower left square, a bit less than 1/64th of the container. The quantity of information is determined by the minimum number of yes/no questions it takes to locate them. The best questions are those that split the locations evenly (a binary tree).

For example:

Are they in the upper half of the container? No.

Are they in the left half of the container? Yes.

Are they in the upper half of the lower left quadrant? No.

Are they in the left half of the lower left quadrant? Yes.

Are they in the upper half of the lower left octant? No.

Are they in the left half of the lower left octant? Yes.

Answers to these six optimized questions give us six bits of information for each molecule, locating it to 1/64th of the container. This is the amount of information that will be lost for each molecule if it is allowed to escape and diffuse fully into the room. The thermodynamic entropy increase is Boltzmann's constant k multiplied by ln 2 for each bit of lost information.
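The arithmetic of the example, including the conversion from bits of lost information to thermodynamic entropy at k ln 2 per bit, can be sketched as:

```python
import math

# Locating one molecule among 64 squares takes log2(64) = 6 yes/no questions;
# losing that information raises the entropy by k*ln(2) per bit.
k = 1.381e-23              # Boltzmann's constant, J/K

squares = 64
bits = math.log2(squares)             # minimum number of yes/no questions: 6.0
delta_s = k * math.log(2) * bits      # entropy increase per molecule, J/K

print(bits, f"{delta_s:.3e} J/K")     # about 5.7e-23 J/K per molecule
```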

If the room had no air, the perfume would rapidly reach an equilibrium state, since the molecular velocity at room temperature is about 400 meters/second. Collisions with air molecules prevent the perfume from dissipating quickly. This lets us see the approach to equilibrium. When the perfume has diffused to one-sixteenth of the room, the entropy will have risen by the equivalent of 2 bits for each molecule; at one-quarter of the room, by 4 bits, and so on.