The Feynman Project

Feynman’s Ph.D., 1942, at Princeton

"The problem was, of course, to make a quantum theory of classical action-at-a-distance electrodynamics, and the form I preferred to express it in was the principle of least action, involving particles only, no field, and in which the interaction occurred at two different times ... because I wished to get rid of the infinite degrees of freedom of the field."

Feynman

Toy model: two electrons, 1 and 2, each locally interact with a quantum field oscillator 3, but do not locally interact with each other. What is required to eliminate the field oscillator 3 in favor of a delayed action at a distance between electrons 1 and 2? Feynman’s answer: one needs to choose a unique oscillator solution x3 which gives exactly ½ of the sum of the advanced (i.e., from the future) and retarded (i.e., from the past) delayed classical actions at a distance between the toy electrons 1 and 2. The Feynman equations in standard notation are in these gifs. In ordinary words these equations mean that the effective interaction for electrons 1 and 2 has the important contribution

Integral from past to present of oscillator pump force from 1 and 2 multiplied by influence functional.

The influence functional in the present consists of two parts. The retarded part from the past to the present, plus the advanced part from the future back to the present.

Each part of the influence functional has an integrand which is the product of two factors. The first factor is sin(2πf·(time difference))/(2πmf), where m is the mass of the field oscillator 3 of frequency f Hz. The second factor is the pump force again. Therefore, the net interaction is a kind of wavelet sine transform of trans-temporal auto-correlations and cross-correlations of the pump forces from the electrons with themselves and with each other, both from past to present and from future to present. In other words, what happens now in the present to the pair of electrons depends both on their past and future co-evolution in a globally self-consistent loop of retarded and advanced causation.
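As a toy numerical sketch of this structure (everything here, the units, the oscillator parameters, and the Gaussian pump force, is an illustrative assumption, not Feynman's actual model), one can tabulate the odd sine kernel and contract it against a pump-force history over the past:

```python
import numpy as np

# Illustrative parameters (assumptions): oscillator mass m, frequency f in Hz.
m, f = 1.0, 0.5
ts = np.linspace(-10.0, 0.0, 2001)   # past ... present (t = 0)
dt = ts[1] - ts[0]

def pump_force(t):
    # hypothetical pump force on oscillator 3 from electrons 1 and 2
    return np.exp(-(t + 5.0) ** 2)

def kernel(delta_t):
    # the first factor: sin(2*pi*f*(time difference)) / (2*pi*m*f)
    return np.sin(2 * np.pi * f * delta_t) / (2 * np.pi * m * f)

F = pump_force(ts)
K = kernel(ts[:, None] - ts[None, :])

# Retarded ordering (s <= t): a nonzero trans-temporal auto-correlation.
retarded = float(F @ (K * np.tril(np.ones_like(K))) @ F) * dt * dt

# Without any time ordering the double integral vanishes: the kernel is odd.
symmetric = float(F @ K @ F) * dt * dt

print(retarded, symmetric)
```

The vanishing of the unordered double sum is just the antisymmetry of the sine kernel; the contribution survives only because the integration is time-ordered from past to present.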

Feynman's Space-Time Picture of Non-Relativistic Quantum Mechanics in which the Born probability density is from the combined influence of advanced paths from the future meeting retarded paths from the past at a single event (x,t). Cramer's transactional interpretation is different from Feynman's idea. Cramer uses the same paths from the past with information running both directions forward and backward in time from (x,t).

Reference: Equation 6.9, p. 133 of Jagdish Mehra’s book, The Beat of a Different Drum, on the life and work of Richard Feynman, Oxford, 1994. There are many typographical errors in the equations in this book, especially the part on Wheeler-Feynman theory. Similarly, John Gribbin’s otherwise interesting pop book, Schrodinger’s Kittens, seems to have a serious conceptual error in its description of Cramer’s “handshake” quantum version of the Wheeler-Feynman classical theory when generalized to massive particles like electrons rather than massless photons. Gribbin also mis-describes the meaning of quantum nonlocality in orthodox quantum theory by using the word “aware” when he says that quantum-connected electrons are aware of what the other is doing in a faster-than-light way. No, they are not aware, because that would violate Eberhard’s theorem. What can be said is that their simultaneous behavior in the preferred global frame of the Hubble flow of the expanding universe is synchronized, forming a meaningful pattern of Jungian-type “synchronicity”. But observers at each electron have no local way of knowing this until much later, when there is enough time for a light signal to connect the two observers. However, in post-quantum mechanics with a new hypothetical property of the universe called “back-action”, quantum randomness is defeated by a controllable intelligent order and the electrons can then be “aware” in the sense that Gribbin wrote it. However, back-action is only a conjecture at this time that would explain both ordinary consciousness and paranormal spooky action at a distance in a unified way.

Juicy quotes of Feynman given exclusively to Jagdish Mehra on the topic of immediate interest. I met Jagdish at Salam’s Institute in Trieste and at Prigogine’s Institute in Brussels 23 years ago.

“When you accelerate an electron it radiates energy and you have to do extra work to account for that energy. The extra force against which this work is done is called the force of radiation resistance. The origin of this force (following the work of the Dutch physicist, Hendrik Antoon Lorentz) was identified in those days as the action of the electron upon itself. The first term of this action, of the electron on itself, gave a kind of inertia (which was relativistically not quite satisfactory). But the inertia-like term was infinite for the point charge. Yet the next term in the sequence gave an energy loss rate which for a point charge agrees exactly with the rate that you get by calculating how much energy is radiated.”

The next sentence attributed to Feynman by Mehra seems to be missing the word “not” that he missed correcting in the galley proofs. Or maybe I am wrong? What do you think?

Homework Problem 1. What is wrong with the following sentence on p. 92 of Mehra’s book?

“So, the force of radiation resistance, which is absolutely necessary for the conservation of energy, would disappear if I said that a charge could act on itself.”

Feynman continued with his account of John Archibald Wheeler’s response to his idea:

‘First,’ he said, ‘let us suppose that the return action by the charges in the absorber reaches the source by advanced waves as well as by the ordinary retarded waves of reflected light, so that the law of interaction acts backward in time, as well as forward in time.’ ... “I was enough of a physicist at that time not to say, ‘Oh, no, how can that be?’ For today all physicists know by studying Einstein and Bohr that sometimes an idea which looks completely paradoxical at first, if analyzed to completion in all detail and in experimental situations, may, in fact, not be paradoxical. So it did not bother me any more than it bothered Professor Wheeler to use advanced waves for the back reaction -- a solution of Maxwell’s equations which previously had not been physically used.”

Note, the use of the word “back-reaction” here. I use the word “back-action” in a seemingly different context. But is there a deep connection between the two? In the Wheeler-Feynman classical electrodynamical theory that Feynman is describing, we have two electrons exchanging light waves. The “back-reaction” is from the future, but it is an advanced light wave from the absorber electron in the future back on the source electron in the past. The way I mean “back-action” is, on the surface, different. I use Bohm’s ontology. Consider only one electron. This electron has an actual position in space at a definite time that is a “hidden-variable” not found in Bohr’s Copenhagen interpretation of the meaning of quantum theory. According to Bohm, the electron really is a pointlike particle which has a quantum pilot wave attached to it. This pilot wave is not to be confused with a light wave in the way Wheeler and Feynman mean. Light waves move at the speed of light. Quantum pilot waves of slower than light electrons counter-intuitively move faster than light. Furthermore, when there are two electrons that have interacted with each other, they share a common pilot wave in a six-dimensional configuration space that is more than three-dimensional ordinary space that we see and feel with our senses. Now, the pilot-wave does exert a new kind of quantum force on the electron which is qualitatively different from the classical electric and magnetic forces on the electron. Indeed, the new quantum force explains why the point electron can behave like a wave under proper conditions. But, the electron cannot emit or absorb its own quantum pilot waves, nor can it modify them in any direct way, because to do so would impose a controllable coherent order on what is believed to be an irreducible uncontrollable quantum randomness on fundamental natural processes. So this looks very different from the Wheeler-Feynman model. 
However, let us suppose that orthodox quantum mechanics is only an approximation to a deeper “post-quantum mechanics” in which the electron can directly modify the shape of its own attached pilot wave by emitting new parts and/or absorbing old parts of it. This is what I mean by “post-quantum back-action”. In summary, the Wheeler-Feynman “back-reaction” is an advanced light wave from the future absorber electron back on the emitter electron at the past moment it first emitted a retarded light wave to that absorber electron. This kind of back-reaction involves two different particles at the very minimum. In contrast, post-quantum back-action is the reaction of the particle to the Bohm quantum force of the particle’s own pilot wave on it. We cannot imagine that one electron emits a quantum pilot wave to another electron which then absorbs it. That is entirely the wrong idea. What we can imagine is that both electrons modify their common pilot wave in six-dimensional “configuration space”. But such a modification violates orthodox quantum mechanics. Note, however, that the quantum pilot-wave of a given electron is indirectly modified only by the action of other electrons and other particles. This is what happens when both the boundary conditions on the pilot-wave are changed and when the interaction Hamiltonian is changed. Therefore, from this point of view, there is a deep similarity between orthodox quantum mechanics and Feynman’s idea that an electron does not interact directly with itself, but only interacts with other electrons, or quarks etc., because the electron cannot modify its own pilot wave; only other electrons etc. can do so.

Feynman continued: “Wheeler used advanced waves to get the reaction back at the right time...”. Wheeler then considered the phase shift of the retarded waves going through a medium of absorber electrons forward in time. Wheeler, according to Feynman, made the ad hoc postulate that the advanced waves do not suffer a similar phase shift as they propagate backward in time, so that “there will be a gradual shifting in phase between the [advanced] return and the original [retarded] signal so that we would only have to figure that the contributions act as if they come from only... the first wave [Fresnel] zone ...” Wheeler told Feynman to figure out how much advanced and retarded waves are needed to get the right answer for radiation resistance. Feynman told Mehra, “I found that you get the right answer if you use half-advanced and half-retarded [potentials] as the field generated by each charge. That is, one has to use the solution of Maxwell’s equations which is symmetrical in time and the reason why we got no advanced effects at a point close to the source in spite of the fact that the source was producing an advanced field is” that advanced waves from the future absorbers exactly cancel the advanced wave from the source on a test charge close to the source. Feynman and Wheeler concluded that “we could account for the radiation resistance as direct action of the charges of the absorber acting back by advanced waves on the source.”

Note that the standard classical cosmological model of our universe expanding from the big bang does not obey the above simple Wheeler-Feynman total absorber future boundary condition for an open universe. The mass needed to close the universe is still missing. This has led John Cramer to postulate an additional total reflector past boundary condition at the big bang singularity itself to cancel the advanced waves that do not get absorbed in the far future of an open universe. I much prefer a universe in which advanced waves deliver precognitive messages to us as we bootstrap our evolution toward higher intelligence.

The Wheeler-Feynman model solved the classical electron self-energy problem because electrons did not act directly on themselves. They also eliminated the infinite number of classical degrees of freedom, i.e., “modes”, of the electromagnetic force field, obtaining a direct advanced-and-retarded delayed Lagrangian between a finite number of source electrons. Their idea was to eliminate the infinite zero-point energy of the electromagnetic quantum vacuum fluctuations. The experimentally observed Casimir electromagnetic zero-point force would then have to be a purely finite boundary effect, but it was not easy to make a quantum theory of this. Feynman said

“... the classical theory I was starting out with was not in Hamiltonian form ... because the action being delayed could be represented beautifully by a minimum principle ... it involved only a Lagrangian ... there was no field; it was a direct particle-particle interaction. The only coordinates in the system were of the particles, and there was not going to be an infinite number of degrees of freedom.
... an accelerated charge in otherwise charge-free space does not radiate energy, ... in general, the fields which act on a given particle only arise from other particles, ... these fields are represented by one-half the retarded plus one-half the advanced Lienard-Wiechert solutions of Maxwell’s equations. In a universe in which all light is eventually absorbed, the absorbing material scatters back to an accelerating charge a field, part of which is found to be independent of the properties of the material. This part is equal to one-half the retarded minus one-half the advanced field generated by the charge. It produces radiative damping ... and combines with the field of the source to give retarded effects alone.”

* Note, then, that in a universe with advanced waves, messages from the future would be detectable as modulations in the radiative damping. This might happen in the microtubules inside our nerve cells, which act like electromagnetic wave guides filled with “ordered water” according to Hameroff. Sir Fred Hoyle, a Fellow of The Royal Society and a retired astronomy professor at Cambridge University, in his book, The Intelligent Universe (Holt, Rinehart & Winston, 1986), speculates on such advanced waves triggering conscious brain events that cause us to make a decision to do one thing rather than another in spite of our prior intentions.

Consider a toy universe with only two charged particles, 1 and 2, in it. The field on particle 1 at time t is (R2 + A2)/2 in the Wheeler-Feynman theory. Note that A2 is emitted by 2 at the future delayed time t+ = t + |r1(t) - r2(t+)|/c and R2 is emitted at the past delayed time t- = t - |r1(t) - r2(t-)|/c. Both A2 and R2 arrive at 1 at time t. The r's denote the instantaneous positions of the particles at the indicated times.
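These delayed times are defined only implicitly, since t+ and t- appear on both sides of their own defining equations. As a sketch (the trajectories and units below are illustrative assumptions, with c = 1, and particle 1 is evaluated at the arrival time t), one can solve the retarded and advanced conditions by fixed-point iteration:

```python
import numpy as np

C = 1.0  # speed of light in natural units (assumption for this sketch)

def r1(t):
    # hypothetical trajectory of particle 1: at rest at the origin
    return np.array([0.0, 0.0, 0.0])

def r2(t):
    # hypothetical trajectory of particle 2: slow uniform motion along x
    return np.array([10.0 + 0.1 * t, 0.0, 0.0])

def delayed_time(t, sign, tol=1e-12, max_iter=200):
    """Solve t_pm = t +/- |r1(t) - r2(t_pm)|/C by fixed-point iteration."""
    t_pm = t
    for _ in range(max_iter):
        t_next = t + sign * np.linalg.norm(r1(t) - r2(t_pm)) / C
        if abs(t_next - t_pm) < tol:
            return t_next
        t_pm = t_next
    raise RuntimeError("no convergence")

t = 0.0
t_ret = delayed_time(t, -1)  # past delayed time t-
t_adv = delayed_time(t, +1)  # future delayed time t+
print(t_ret, t_adv)
```

The iteration converges because particle 2 moves slower than light, so each update shrinks the error by the ratio v/c.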

Write the mathematical identity

(R2 + A2)/2 = R2 + (1/2)R1 - (1/2)A1 - (R1 + R2)/2 + (A1 + A2)/2

R1 and A1 are the retarded and advanced SELF-fields of particle 1 on itself at time t. Define

I = R2

II = (1/2)R1 - (1/2)A1

III = - (R1 + R2)/2 + (A1 + A2)/2

The Wheeler-Feynman boundary condition is that III = 0.
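Since I, II, and III are defined by pure algebra, the identity I + II + III = (R2 + A2)/2 can be checked mechanically; here is a minimal sketch with arbitrary numerical field values (the numbers mean nothing physically):

```python
import random

# R1, A1: retarded/advanced self-fields of particle 1; R2, A2: fields from 2.
# Arbitrary numbers suffice, since the identity is purely algebraic.
random.seed(0)
R1, A1, R2, A2 = (random.uniform(-1, 1) for _ in range(4))

I   = R2
II  = 0.5 * R1 - 0.5 * A1
III = -(R1 + R2) / 2 + (A1 + A2) / 2

assert abs((R2 + A2) / 2 - (I + II + III)) < 1e-12
print("identity holds")
```

Note that imposing the Wheeler-Feynman boundary condition III = 0 leaves the pure retarded field of particle 2 plus the Dirac radiation-damping combination of the self-fields.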

II is shown by Dirac to give the radiation damping force on 1
which nonrelativistically is

(2/3)(e^2/c^3)d^3r1/dt^3

which involves the “jerk”, i.e., the third time derivative. This implies a classical nonlocality in time, with classically unstable runaway solutions in which the electron spontaneously accelerates, sucking infinite energy out of the vacuum. One way out of this is to violate causality, such that the electron precognitively accelerates before an external signal arrives. This is not a pretty picture by the usual standards.
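The runaway can be seen in a few lines: with the external force switched off, the nonrelativistic equation of motion m·a = F_ext + m·τ·(da/dt), with τ = 2e²/3mc³, admits the self-accelerating solution a(t) = a0·e^(t/τ). A sketch in arbitrary units (setting τ = 1 is an assumption of the sketch):

```python
import math

TAU = 1.0  # radiation-reaction time constant 2e^2/(3 m c^3), set to 1 here

# With zero external force, m*a = m*TAU*da/dt has the runaway a(t) = a0*exp(t/TAU).
a0 = 1e-6          # tiny seed acceleration
a, dt, t = a0, 1e-4, 0.0
while t < 5.0:
    a += (a / TAU) * dt   # Euler step for da/dt = a/TAU
    t += dt

exact = a0 * math.exp(5.0 / TAU)
print(a, exact)   # the numerical solution tracks the exponential runaway
```

Any nonzero seed acceleration blows up exponentially on the timescale τ, which for the electron is of order 10^-23 s; this is the classical instability referred to above.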

Why does an electron in constant acceleration not radiate, so that Einstein's equivalence principle is not violated by classical electrodynamics? Remember that the classical force of radiation resistance comes from the future in the Wheeler-Feynman theory.

Note 1. Feynman's method for computing quantum statistical averages also contains the all-important commutation rules of the Lie algebra that determine the several Heisenberg uncertainty relations among incompatible observables. However, it does not resolve the traditional ambiguities in the ordering of noncommuting operators acting on the states of Hilbert space. p. 138 Mehra

Unanswered questions in Feynman's dissertation that seem to be vital to post-quantum physics in the 21st Century.

One such problem was the nonexistence of the wave function for action-at-a-distance theories. Mehra p.139

Feynman told Mehra about the fundamental nonlocality in time in addition to the well known EPR nonlocality in space. Although spacetime is unified in relativity, there is a second kind of "process time" of our consciousness which has a direction or "arrow" not found in classical relativity. Thus:

It is not unreasonable that it should be impossible to find a quantity like a wave function, which has the property of describing the state of a system at one moment, and from which the state at other moments may be derived. In more complicated mechanical systems ... the state of the system at a particular time is not enough to determine in a simple manner the way that the system will change in time. It is also necessary to know the behavior of the system at other times: information which a wavefunction is not designed to furnish. An interesting, and at present unsolved, question is whether there exists a quantity analogous to a wave function for these more general systems. ibid

Therefore, we should not expect a simple wave function to represent the quantum mind attached to the classical brain as is done in Stapp's theory for example, where the quantum mind collapses to a classical brain state and a subjective "felt" psychological experience. To what extent is Penrose's idea, that human understanding surpasses any algorithm, tied into Feynman's insight? Aharonov and his students are investigating these new generalized multiple-time states which can act like "quantum time machines" accessing information back to a time before the machine was actually constructed. That is, there are "weak measurements" whose results are beyond the eigenvalue spectrum of the usual "strong" Von Neumann measurements. One can think of non-statistical results on individual systems, like the living brain, that are "protected".

Feynman found non-real complex expectation values in his action-at-a-distance quantum theory. This means a creative breakdown of the conservation of quantum probability current in classical configuration space. The total probability at any one moment is not conserved in living conscious matter because new possibilities come into being and old ones disappear. That is, the size of the basis in Hilbert space is not a constant of the motion. This is what negative quantum probability is telling us. Thus, Feynman wrote:

An arbitrary action functional S produces results which do not conserve probability; for example, the energy values come out complex. I do not know what this means, nor was I able to find that class of functionals which would be guaranteed to give real values for the energies. Letter to Kelber, 1949

Look at the classical Lienard-Wiechert potentials for a moving charge. Feynman says:

But Coulomb's law is wrong. The discoveries of the 19th century showed that influences cannot travel faster than a certain fundamental speed c, which we now call the speed of light.

This classical idea neglects quantum nonlocality. Feynman in Volume 1 of his Cal Tech Lectures derives the electric field of a moving charge, i.e., eq. 28.3, consisting of three terms. The first term is the delayed static Coulomb force. But the second term is more interesting, as it tends to diminish the delay in the first term. This second term is the time derivative of the first term multiplied by the delay time. So the first two terms look like a Taylor series. This introduces a theme I will come back to, because in other places, of a quantum nature, Feynman refers to the static Coulomb field from virtual photons as "instantaneous", so there is this ambiguity between "delay" and "instantaneous" in his different writings. For example, compare the above quote of Feynman, that the retarded static Coulomb force action is delayed, confined to the forward light cone, with his remark about the above toy model where two electrons, or "atoms", 1 and 2 interact with oscillator 3:

To what extent can the motion of the oscillator be disregarded and atoms be considered as interacting directly? ... this problem has been solved in a special case by Fermi, who has shown that the oscillators of the electromagnetic field which represent longitudinal waves could be eliminated from the Hamiltonian, provided an additional term be added representing instantaneous Coulomb interactions between particles ...

And again in "Space-Time Approach to Quantum Electrodynamics" I find

"The usual elimination of longitudinal and scalar virtual photons (leading to an instantaneous Coulomb potential) can of course be performed here too ... the fact that 1/K^2 does not contain k4 means that k4 can be integrated first, resulting in an instantaneous interaction, and the d^3K/K^2 is just the momentum representation of the Coulomb potential, 1/r."

So which is it? Delayed or instantaneous? What happened to the delayed Coulomb potential in the classical Lienard-Wiechert formula? Do the quantum fluctuations tunnel through the classically sharp light cone barrier in the near field? What about the second term in Feynman's classical formula, eq. 28.3 in Vol. I of his 1963 Cal Tech Lectures? Does that term also diminish the delay? How did Feynman derive his classical formula, which no other textbook seems to have? Also, most textbooks say that accelerating charges radiate, but the real story that Feynman tells in his newly published Lectures on Gravitation is that it is the time-derivative of the acceleration, not the acceleration itself, that is important. Only in cyclic motions do we get the illusion that it is the acceleration that is doing the trick. The latter would contradict Einstein's equivalence principle. So we know that a charge with constant acceleration cannot radiate!

Feynman continued:

... the system with the oscillator is not equivalent to the system without the oscillator for all possible motions of the oscillator, but only for those for which some property (i.e., the initial and final position) of the oscillator is fixed...

Note that both the past and the future influence the present in the above remark. The same archetypal idea has resurfaced in Aharonov et al.'s recent research.

Jack Sarfatti on John Bell

Version 0.1

“.. despite numerous solutions of the [quantum measurement] problem ‘for all practical purposes’, a problem of principle remains. It is that of locating precisely the boundary between what must be described by wavy quantum states on the one hand, and Bohr’s ‘classical terms’ on the other. The elimination of this shifty boundary has for me always been the main attraction of the ‘pilot-wave’ picture.” p. viii Speakable and unspeakable in quantum mechanics, Cambridge, 1987

Bohm’s theory shows why Bohr was right that “the result of a ‘measurement’ does not in general reveal some preexisting property of the ‘system’, but is a product of both ‘system’ and ‘apparatus’...” This is because the Bohm quantum force of the pilot-wave on its attached particle, whose actual position is the “hidden variable”, is nonlocal and context-dependent. “Nonlocal” means action-at-a-distance which can effectively occur over “faster-than-light”, i.e., “spacelike”, separations between events in Einstein’s flat spacetime. “Context-dependent” means that the force depends on the form of whichever pilot-wave in quantum Hilbert space beyond spacetime is “active” in Bohm’s sense.

“While the usual predictions are obtained for experimental tests of special relativity, it is lamented that a preferred frame of reference is involved behind the phenomena.”

This is because flat spacetime is still not dynamical in special relativity the way it is in general relativity where there are preferred frames within the given solution to the Einstein curved spacetime field equations. For example, the preferred frame in the standard classical cosmological model of our expanding universe is the Hubble flow in which the cosmic blackbody radiation is isotropic to one part in a hundred thousand.

“Any study of the pilot-wave theory, when more than one particle is considered leads quickly to the question of action at a distance, or ‘nonlocality’, and the Einstein-Podolsky-Rosen correlations.”

Even the one-particle problem has nonlocality, because the boundary conditions on that particle’s pilot-wave are a phenomenological “lumped parameter” description of all the particles that make up the walls at which the pilot-wave is forced to vanish.

Bell says he has a “negative” attitude toward the several versions of the “many-worlds” interpretations of the meaning of quantum theory.

Bell says, e.g. p. 146 that no local theory can reproduce all the statistical predictions of orthodox quantum mechanics. Note orthodox quantum mechanics has no direct “back-action” of the hidden-variable actual position of the particle on its guiding pilot-wave. Back-action causes a distortion away from the orthodox statistical predictions, but it does not restore locality. On the contrary, it permits the control of nonlocality which is uncontrollable when the back-action is effectively zero.

Bell (e.g., p. 155) shows that Bohr did not successfully refute Einstein. Yet many science journalists, like Martin Gardner, piously quote a famous passage by Bohr that Bell shows is vacuous. I mean Bohr’s mention of “no question of a mechanical disturbance ... an influence on the very conditions which define the possible types of predictions regarding the future behavior of the system ... with the finite and uncontrollable interaction between the objects and the measuring instruments...”. Bell responds that he has “very little idea” of what Bohr means by “mechanical disturbance” and “an influence on the very conditions”, the cliche oft-quoted in the New York Times and Scientific American, for example, as if it explained the problem. Bell concludes (p. 156), “Is Bohr just rejecting the premise - ‘no action at a distance’ - rather than refuting the argument?”

Appendix on Schwinger

Version 0.1

Dec 1, 1996

The Sarfatti Papers

Investigations of, and Meditations on, Schwinger’s Ideas

Quotes from Schwinger are numbered. See his Dover volume on QED.

1. “The development of quantum mechanics in the years 1925 and 1926 had produced rules for the description of systems of microscopic particles, which involved promoting the fundamental dynamical variables of a corresponding classical system into operators with specified commutators.”

For example, p = linear momentum of a point particle, x is its position in 3D classical mechanical space. This can be generalized to N point particles in 3N-dimensional classical configuration space. The classical Poisson bracket {x, p} becomes the commutator [x, p] = ih, which is an operator equation on the Hilbert space of wavy quantum states. These states are generally entangled, enforcing instantaneous synchronized behaviors among the N particles, which may be widely separated from each other in 3D classical mechanical space. In Bohm’s ontology, an observer stuck on only one of the particles will not be aware of these distant connections in real time at the moment they are influencing her behavior at a distance if there is no direct back-action of the actual path of the N-particle system on its guiding pilot wave in Hilbert space. Precognitive remote viewing and trance-channeling, if they are facts, require back-action in all living organizations of matter and radiation.
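The canonical commutator can be exhibited concretely. A sketch (assumptions: ħ = 1, and a harmonic-oscillator number basis truncated to N dimensions, so the canonical value can only hold away from the truncation corner, since a finite-dimensional commutator must have zero trace):

```python
import numpy as np

N = 8  # truncated Hilbert-space dimension (sketch; hbar = 1)
a = np.diag(np.sqrt(np.arange(1, N)), k=1)   # annihilation operator
x = (a + a.T) / np.sqrt(2)                   # position operator
p = 1j * (a.T - a) / np.sqrt(2)              # momentum operator

comm = x @ p - p @ x                         # should be i * identity
# The canonical value i appears everywhere except the last diagonal entry,
# where finite matrices must deviate (trace of a commutator is zero).
print(np.diag(comm))
```

The deviation in the corner is the finite-dimensional shadow of the fact that the true canonical commutation relation needs an infinite-dimensional Hilbert space.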

2. “By this means, a system, described initially in classical particle language, was equivalent energetically to a denumerably infinite number of harmonic oscillators.”

This is any classical system, not only the electromagnetic field. Even a particle in a potential can be described in terms of second-quantized creation a*k and destruction ak operators for the “quanta” of these generalized oscillators -- one oscillator for every mode labeled by quantum numbers k. The potential scatters these oscillating quanta. For example, the classical Hamiltonian

The vacuum |0>k for the k-th mode is easily annihilated back to the NOTHING it sprang from - at least formally. Can it also happen actually? I mean ontologically? Does this mathematical operation faithfully map a physical process, as it should if our theory is complete? For example, in the Casimir force experiment the flat, parallel, closely spaced conducting metal plates eliminate all field oscillators whose waves cannot fit between the plates and thereby obey the classical boundary condition that the tangential component of the electric field must vanish in the zero frequency limit. What about skin penetration depth for AC fields? The boundary condition may be more subtle than simply requiring that the wavefunction of the photons vanish at the plates. That is a “large thought”. So this is a matter of boundary conditions. We can eliminate and restore modes in finite regions of spacetime, but we can find no operator in this simple second-quantized algorithm for the reverse creation of the vacuum from NOTHING. Is this an “arrow of time”?
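The formal annihilation of the mode vacuum, and the zero-point energy that survives it, can be made concrete in a truncated number basis (a sketch with ħ = ω = 1; the truncation dimension is an arbitrary choice):

```python
import numpy as np

N = 6  # truncation dimension (sketch; hbar = omega = 1)
a = np.diag(np.sqrt(np.arange(1, N)), k=1)   # destruction operator a_k
adag = a.T                                   # creation operator a*_k
vac = np.zeros(N); vac[0] = 1.0              # the mode vacuum |0>

# a|0> = 0: the vacuum is formally annihilated "back to nothing".
print(a @ vac)

# H = a*a + 1/2 has eigenvalues n + 1/2: the zero-point 1/2 never vanishes.
H = adag @ a + 0.5 * np.eye(N)
energies = np.sort(np.linalg.eigvalsh(H))
print(energies)
```

Note the asymmetry the text points at: the algebra supplies an operator that kills the vacuum, but no operator in this algorithm creates the vacuum out of nothing.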

Why only a countable, i.e., “denumerable”, infinity of oscillators if the world is a spacetime continuum? It is not a continuum at short distances and high energies. Obviously, the classical spacetime breaks down at the Planck distance of 10^-33 cm, which is the shortest possible distance. The highest possible frequency is about 2x10^43 Hz, which corresponds to a single photon with an energy of about 10^-5 gm, or 10^16 ergs, or 10^19 GeV. Classical relativity breaks down here. This is the unknown region of quantum gravity. Roger Penrose believes that these ultra-strong energy “metric fluctuations”, disrupting the classical fabric of spacetime, dribble down into our low energy biological brain and cause our conscious experiences! Here is another large idea.
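The Planck-scale figures quoted above can be rechecked directly from ħ, G, and c; this sketch just does the arithmetic in CGS units (constants rounded to four digits), and the frequency comes out near 2x10^43 Hz:

```python
import math

# CGS constants (rounded): used only to check the order-of-magnitude claims.
hbar = 1.055e-27   # erg s
G    = 6.674e-8    # cm^3 g^-1 s^-2
c    = 2.998e10    # cm/s

l_planck = math.sqrt(hbar * G / c**3)      # Planck length, ~1.6e-33 cm
f_planck = c / l_planck                    # Planck frequency, ~1.9e43 Hz
m_planck = math.sqrt(hbar * c / G)         # Planck mass, ~2.2e-5 g
E_planck_erg = m_planck * c**2             # ~2e16 erg
E_planck_gev = E_planck_erg / 1.602e-3     # 1 GeV = 1.602e-3 erg -> ~1.2e19 GeV

print(l_planck, f_planck, m_planck, E_planck_erg, E_planck_gev)
```
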

So the mathematical quantization of classical particles creates the wavelike quantum behavior of matter seen in numerous experiments on electron, proton, neutron, atomic, and molecular beams and in the solid state.

John Gribbin, in his pop book, Schrodinger’s Kittens, cites an actual experiment which casts doubt on Bohr’s strict idea of complementarity, because both particle and wave features are observed simultaneously. However, it is not clear that there is an actual violation of the Heisenberg uncertainty principle in that experiment. Note that the original Einstein-Podolsky-Rosen paper of 1935 showed that faster-than-light quantum action at a distance, in a unique actual objective universe where wave functions describe individuals, is required in order that the Heisenberg uncertainty principle NOT be violated for the distant twin particle on which no direct measurements are made. The argument goes like this. Suppose I decide to measure position on particle 1. From the structure of the common coherent quantum wave that both particles share, I immediately know the position of the far-away twin particle 2. But suppose, instead, I had decided to measure the complementary momentum of particle 1. I would then know the momentum of the far-away particle 2. But, if there is no faster-than-light quantum action-at-a-distance, there is no way my choice of what to measure at particle 1 can influence the actual properties of far-away particle 2. Therefore, particle 2 (and also, by logic, even particle 1) must violate the Heisenberg principle by having exact values of momentum and position simultaneously! Note that Bohm’s hidden-variable/guiding pilot-wave metatheory of the meaning of quantum mechanics does have hidden-variable particles that do violate Heisenberg’s principle on the unmeasured actual individual quantum event level. They also violate special relativity on that level. But when you add the measuring interaction, the statistical predictions agree with Heisenberg’s principle. Special relativity is also obeyed statistically in the classical limit. Bohm’s theory, however, is highly nonlocal.
John Bell abstracted the essential truth that any theory that agrees with the statistical predictions of quantum mechanics must be nonlocal at the individual quantum event level. These individual quantum events are spread out in 4D spacetime though they are localized in the 3n-dimensional classical configuration space for the system point of a complex many-particle system like the living human brain, for example. The resulting coordinated synchronized behavior of widely separated pieces of a coherent quantum complex system cannot be completely explained as classical signal transfer encoding and decoding. This is the “organic” feature where the whole is more than the classical sum of its parts assumed by naive reductionism. However, Eberhard’s theorem shows that, for quantum mechanics, nonlocality cannot be used as a direct communication channel able to locally decode faster-than-light messages before they are correlated with the help of classical signals limited to the speed of light or below. My new addition of back-action takes us to a new post-quantum mechanics in which quantum nonlocality is a communication channel and the statistical predictions of quantum mechanics are distorted in a controllable way. Our free will depends on this violation of quantum mechanics.
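Bell’s constraint and its quantum violation can be made concrete with the standard CHSH combination of singlet-state correlations, E(a, b) = -cos(a - b): any local hidden-variable theory obeys |S| ≤ 2, while quantum mechanics reaches 2√2. A minimal numeric sketch (textbook angle choices, not from the text above):

```python
import math

def E(a, b):
    """Quantum correlation of spin measurements along angles a and b
    for the singlet (EPR) pair: E(a, b) = -cos(a - b)."""
    return -math.cos(a - b)

# Standard CHSH angle choices (radians)
a, a2 = 0.0, math.pi / 2
b, b2 = math.pi / 4, 3 * math.pi / 4

S = abs(E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2))
print(f"CHSH value S = {S:.4f}")  # 2*sqrt(2) = 2.8284
print(f"local-realist bound = 2, Tsirelson bound = {2 * math.sqrt(2):.4f}")
```

Any subset of the correlations can be reproduced locally; it is the combination S exceeding 2 that forces nonlocality at the individual-event level.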

4. “It was also known that electromagnetic radiation contained in an enclosure, when considered as a classical dynamical system, was equivalent to a denumerably infinite number of harmonic oscillators.”

This was the classical “ultra-violet” catastrophe that upset Max Planck at the close of the 19th Century. Maxwell’s electromagnetic field equations wrongly predicted an infinite amount of energy radiated at high frequencies for “blackbody” radiation in thermodynamic equilibrium with the material walls of the enclosure.
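The catastrophe is easy to see numerically: the classical Rayleigh-Jeans spectral energy density u(f) = 8πf²kT/c³ grows without bound with frequency, while Planck’s law u(f) = (8πhf³/c³)/(e^{hf/kT} − 1) cuts it off exponentially. A small illustrative sketch (CGS values; T = 300 K is an arbitrary choice for the example):

```python
import math

h = 6.626e-27   # erg s, Planck's constant
k = 1.381e-16   # erg/K, Boltzmann's constant
c = 2.998e10    # cm/s, speed of light
T = 300.0       # K, arbitrary wall temperature for the example

def rayleigh_jeans(f):
    """Classical spectral energy density: diverges as f**2."""
    return 8 * math.pi * f**2 * k * T / c**3

def planck(f):
    """Planck's law: agrees with classical at low f, vanishes at high f."""
    x = h * f / (k * T)
    return (8 * math.pi * h * f**3 / c**3) / math.expm1(x)

for f in (1e12, 1e13, 1e14, 1e15):
    print(f"{f:.0e} Hz  RJ={rayleigh_jeans(f):.3e}  Planck={planck(f):.3e}")
```

At 10^12 Hz (hf ≪ kT) the two laws nearly agree; by 10^15 Hz the classical law is astronomically too large while Planck’s law has already shut off.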

5. “With the application of the quantization process to these fictitious oscillators the classical radiation field assumed characteristics describable in the complementary classical particle language ... The quantization procedure could be transferred from the variables of the fictitious oscillators to the components of the field in three-dimensional space, based upon the classical analogy between a field specified in small spatial cells and equivalent particle systems.”

For example, the detection of light in individual quantum events always happens in “grainy” tiny regions of spacetime. Einstein’s 1905 explanation of the photoelectric effect, where light ejects electrons from a metal surface, showed the necessity for the “photon” idea E = hf used by Planck in 1900 to explain why only a finite amount of energy was found in the high-frequency spectrum of blackbody radiation.
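Einstein’s photoelectric equation, KE_max = hf − W, is a one-line calculation. A sketch for green light, using an assumed cesium work function of about 2.1 eV (an illustrative value, not from the text):

```python
h = 6.626e-27    # erg s, Planck's constant
eV = 1.602e-12   # erg per electron-volt
f_green = 6.0e14 # Hz, roughly green light

E_photon = h * f_green / eV   # photon energy E = hf, in eV
W_cesium = 2.1                # eV, assumed work function of cesium (illustrative)
KE_max = E_photon - W_cesium  # Einstein's photoelectric equation

print(f"photon energy  ~ {E_photon:.2f} eV")   # ~2.48 eV
print(f"max electron KE ~ {KE_max:.2f} eV")
```

The key point is that KE_max depends on frequency, not intensity: below f = W/h no electrons are ejected no matter how bright the light.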

Our expanding universe has ubiquitous blackbody radiation left over from the big bang that created real time. The temperature of this radiation tells us the cosmic time interval since time was created. The distribution of red and blue shifts of this omnipresent radiation tells us the direction and speed in which we are moving relative to the state of absolute rest defined as the Hubble flow of the expanding universe. Einstein’s classical general theory of relativity applied to cosmology restores a preferred global coordinate system for the entire universe in which to unambiguously define an objective state of absolute rest. Einstein’s special theory of Lorentz transformations for time dilation, the equivalence of mass to energy, and the velocity addition law so that no ordinary “real” classical matter goes locally faster-than-light, still apply “locally” in regions of spacetime that are small compared to the radii of curvature or warping.

Bohm’s hidden-variable/pilot-wave theory uses this comoving Hubble frame as the frame in which the quantum potential acts instantaneously.

The Alcubierre faster-than-light warp drive has slower-than-light local timelike free-floating, i.e., weightless, geodesic motion with no time-dilation for the Star Ship. The faster-than-light effective speed is due to contraction of spacetime in the bow and expansion of spacetime in the stern arranged by the proper placement of exotic matter with negative energy density relative to the normal vacuum. Similar techniques apply to the construction of Star Gates for instant interstellar and intergalactic travel, including paradox-free time-travel to the past along closed timelike worldlines using traversable wormholes. Time-travel to the past to a time before the wormhole came into being is impossible. However, the Aharonov “quantum time machine” using “weak measurements” on the “protected quantum states” of individual quantum systems, e.g., like the nonlocally connected web of tubulin control electron quantum computing switches in our nerve cells, can access information from a time before the time machine was constructed. This might explain experiences of reincarnation, past and future lives.

6. “When it was attempted to quantize the complete electromagnetic field, rather than the radiation field that remains after the Coulomb interaction is separated, difficulties were encountered that stem from the gauge ambiguity of the potentials that appear in the Lagrangian formulation of the Maxwell equations. The only real dynamical degrees of freedom are those of the radiation part of the field.”

Why is that? Gauge invariance does impose a relation between the longitudinal and timelike polarization parts of the 4-vector potential Au. These form the non-radiating near-field which we certainly can measure. Gauge invariance has two quantum forms, i.e., global and local. The local part has the classical analog that the electromagnetic field Lorentz-group tensor Fuv is invariant when one adds any gradient of a scalar function to the 4-vector potential Au. Intuitively, one can imagine a phase clock with only one hand at each event in spacetime. Global gauge invariance means that the action of the source is invariant under a rigid phase shift, i.e., a fixed movement of the hand of the clock, all over spacetime. At this stage, there is no Fuv field. Local gauge invariance is a form of the locality principle which asserts the much stronger idea that the phase clocks can be set independently without changing the action. One has to add an additional compensating gauge force field Fuv of spin 1, and then the action of the extended source-force system is invariant. Gauge invariance is associated with conservation of total electric charge. It also forces zero rest mass on the photon. The Meissner effect in superconductivity breaks this kind of gauge invariance and gives the photon inside the superconductor an effective rest mass with a large longitudinal polarization. Remember, Feynman and Wheeler wanted to get rid of gauge fields altogether and have only advanced and retarded delays in the action consisting only of sources with no intermediating forces!
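The classical statement above can be written in one line: adding the gradient of any scalar function χ to the 4-vector potential leaves Fuv unchanged, because partial derivatives commute:

```latex
A'_\mu = A_\mu + \partial_\mu \chi
\quad\Longrightarrow\quad
F'_{\mu\nu} = \partial_\mu A'_\nu - \partial_\nu A'_\mu
            = F_{\mu\nu} + \partial_\mu \partial_\nu \chi - \partial_\nu \partial_\mu \chi
            = F_{\mu\nu} .
```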

7. “Yet one can employ additional degrees of freedom which are suppressed finally by imposing a consistent restriction on the admissible states of the system.”

Now things start to get ugly. This is like Bohm having to use a preferred frame of absolute rest in which his quantum force is instantaneous, even though the statistical averages over an ensemble of a large number of identically prepared quantum systems obey special relativity, in which there is no such preferred frame of absolute rest. Note that Einstein’s classical field equations for the curved spacetime metric geometry of mutually tilted light cones obey both Lorentz invariance (i.e., for non-accelerating inertial frames) and general coordinate invariance (i.e., for accelerating local inertial frames), but the solutions can break or hide this symmetry. The same spontaneous broken symmetry idea happens in ferromagnetism, ferroelectricity, antiferromagnetism, superconductivity, and other second-order phase transitions in materials, and in the Higgs mechanism for the quantum vacuum states that give inertia to the quarks and leptons as well as to the weak bosons of the parity non-conserving radioactive part of the electro-weak force. The weak and strong forces correspond to higher-dimensional hyperspheres attached to each point in spacetime as “fibers”. The weak force has a three-real-dimensional hypersphere surface; the strong force has an eight-real-dimensional hypersphere surface. In comparison, the electromagnetic force of Maxwell has the simplest, a one-real-dimensional hypersphere. The number of independent phases is the dimension of the surfaces of the hyperspheres. The quanta of the weak and strong force fields carry the same weak and strong charges that their sources carry. In the case of the strong force, the effective charges get weaker with higher energy. In the case of the electro-weak force, the charges get stronger with higher energy. There is an energy where the electromagnetic and the weak force have the same strength. There is another, still higher, energy where all of the forces have the same strength or size of “charge”.
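The opposite running of the charges can be sketched with the standard one-loop renormalization-group formula 1/α(Q) = 1/α(μ) − (b/2π) ln(Q/μ), where the sign of b decides whether the coupling grows with energy (QED) or shrinks (QCD, asymptotic freedom). The coefficients and reference values below are illustrative ballpark numbers, not precision inputs:

```python
import math

def run_coupling(alpha_mu, b, mu, Q):
    """One-loop running: 1/alpha(Q) = 1/alpha(mu) - (b/2pi) ln(Q/mu).
    b > 0: coupling grows with energy (QED-like);
    b < 0: coupling shrinks with energy (QCD asymptotic freedom)."""
    inv = 1.0 / alpha_mu - (b / (2 * math.pi)) * math.log(Q / mu)
    return 1.0 / inv

# Illustrative one-loop coefficients (sketch values)
b_qed = 4.0 / 3.0              # single charged-lepton loop
b_qcd = -(11.0 - 2.0 * 5 / 3)  # = -23/3 with 5 quark flavors: negative

mu = 91.2              # GeV, Z-mass reference scale
alpha_em = 1 / 128.0   # approximate electromagnetic coupling at the Z
alpha_s = 0.118        # approximate strong coupling at the Z

for Q in (91.2, 1000.0, 1e4):
    print(f"Q={Q:>8.1f} GeV  "
          f"alpha_em={run_coupling(alpha_em, b_qed, mu, Q):.5f}  "
          f"alpha_s={run_coupling(alpha_s, b_qcd, mu, Q):.4f}")
```

Running the loop shows the electromagnetic charge creeping up while the strong charge falls, which is why the strengths can cross at some very high energy.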
As the universe expands it goes through this sequence of symmetry breakings in the reverse order where the initially unified force splits into forces of different strengths. The new forces are replaced by curvatures in the fiber space beyond spacetime just as the gravitational force of Newton is replaced by the curvature inside spacetime in Einstein’s general relativity.

8. “To make more evident the relativistic invariance of the scheme, other equivalent forms were given to the theory by introducing different time coordinates for each of a fixed number of charged particles coupled to the electromagnetic field.”

So, for example, a two-particle entangled system has an eight-dimensional 4n configuration space with two independent local times, one on each particle world line, rather than the nonrelativistic (3n+1)-dimensional space. Bohm’s theory stays with the latter.

9. “The synthesis of the complementary classical particle and field languages in the concept of the quantized field, as exemplified in the treatment of the electromagnetic field, was found to be of general applicability to systems formed by arbitrary numbers of identical particles, although the rules of field quantization derived by analogy from those of particle mechanics were too restrictive, yielding only systems obeying the Bose-Einstein statistics. The replacement of commutators by anti-commutators was necessary to describe particles, like the electron, that obey the Fermi-Dirac statistics. In the latter situation there is no realizable physical limit for which the system behaves as a classical field.”

Schwinger, in the last sentence, is saying that coherent superpositions of different numbers of fermions like

(1 + c1a*k + c2a*k a*k’ + c3a*k a*k’ a*k” + .....)|0>,

where k is not equal to k’, etc., are not possible because of superselection rules. For electrons, there is the superselection rule on electric charge. One wonders if this is really true.
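The anticommutation rules behind such fermionic states can be checked directly by representing two fermionic modes as matrices (the Jordan-Wigner construction); in particular (a*k)^2 = 0, the algebraic root of the Pauli exclusion principle. A self-contained sketch in pure Python:

```python
# Pauli exclusion from anticommutation: represent two fermionic modes
# with the Jordan-Wigner construction and check (a*)^2 = 0,
# {a*_1, a*_2} = 0, and {a_1, a*_1} = 1, using exact integer matrices.

def mat_mul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def kron(A, B):
    n, m = len(A), len(B)
    return [[A[i // m][j // m] * B[i % m][j % m]
             for j in range(n * m)] for i in range(n * m)]

def add(A, B):
    return [[A[i][j] + B[i][j] for j in range(len(A[0]))] for i in range(len(A))]

I2 = [[1, 0], [0, 1]]
Z  = [[1, 0], [0, -1]]   # Jordan-Wigner string factor
cr = [[0, 0], [1, 0]]    # single-mode creation operator a*
an = [[0, 1], [0, 0]]    # single-mode annihilation operator a

# Two-mode operators: a*_1 = a* (x) I,  a*_2 = Z (x) a*
c1_dag, c1 = kron(cr, I2), kron(an, I2)
c2_dag = kron(Z, cr)

zero4 = [[0] * 4 for _ in range(4)]
id4 = kron(I2, I2)

assert mat_mul(c1_dag, c1_dag) == zero4  # (a*_1)^2 = 0: Pauli exclusion
assert add(mat_mul(c1_dag, c2_dag), mat_mul(c2_dag, c1_dag)) == zero4
assert add(mat_mul(c1, c1_dag), mat_mul(c1_dag, c1)) == id4
print("fermionic anticommutation relations verified")
```

The Z factor is what keeps creation operators on *different* modes anticommuting rather than commuting, which is exactly the structure Schwinger says has no classical-field limit.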

10. “The coupling of a [point] electron with the electromagnetic field implied an infinite energy displacement, and, indeed, an infinite shift of all spectral lines emitted by an atomic system; in the reaction of the electromagnetic field stimulated by the presence of the electron, arbitrarily short wave lengths play a disproportionate and divergent role.”

So, early quantum electrodynamics was back to the original ultraviolet catastrophe in blackbody radiation that caused Planck to write E = hf for the discrete energy transfer between matter and radiation in mutual thermodynamic equilibrium. The solution now was not as simple, and to this day, according to Feynman, has not really been solved. Feynman calls renormalization, which he co-invented, a “scandal” and a “shell game”. The most natural solution is to use the quantum gravity cut-off at the Planck scale. That is, spacetime is not really a continuum, but consists of tiny cells of volume 10^-99 cc with quanta of time of 10^-44 seconds. We expect Lorentz symmetry to break there anyway.
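The Planck-scale numbers quoted above follow from ħ, G, and c alone; a quick check in CGS units:

```python
import math

hbar = 1.055e-27  # erg s, reduced Planck constant
G    = 6.674e-8   # cm^3 g^-1 s^-2, Newton's constant
c    = 2.998e10   # cm/s, speed of light

l_planck = math.sqrt(hbar * G / c**3)   # Planck length, cm
t_planck = l_planck / c                 # Planck time, s
v_planck = l_planck**3                  # Planck volume, cc

print(f"Planck length ~ {l_planck:.2e} cm")   # ~1.6e-33 cm
print(f"Planck time   ~ {t_planck:.2e} s")    # ~5.4e-44 s
print(f"Planck volume ~ {v_planck:.2e} cc")   # ~4e-99 cc
```

These reproduce the cell volume of ~10^-99 cc and time quantum of ~10^-44 s cited in the commentary.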

11. “The phenomenon of electron-positron pair creation, which finds a natural place in the relativistic electron field theory, contributes to this situation in virtue of the fluctuating densities of charge and current that occur even in the vacuum state as the matter-field counterpart of the fluctuations in the electric and magnetic field strengths. In computing the energy of a single electron relative to that of the vacuum state, it is of significance that the presence of the electron tends to suppress the charge-current fluctuations induced by the fluctuating electromagnetic field. The resulting electron energy, while still divergent in its dependence upon the contributions of arbitrarily short wavelengths, exhibits only a logarithmic infinity; the combination of quantum and relativistic effects has destroyed all correspondence with the classical theory and its strongly structure-dependent electromagnetic mass. The existence of current fluctuations in the vacuum has other implications, since the introduction of an electromagnetic field induces currents that tend to modify the initial field; the ‘vacuum’ acts as a polarizable medium. New non-linear electromagnetic phenomena appear, such as the scattering of one light beam by another, or by an electromagnetic field. But in the calculation of the current induced by weak fields, there occurred terms that depended divergently upon the contributions of high energy [virtual] electron-positron pairs.”

By the uncertainty principle, a virtual electron-positron pair of rest energy 2mc^2 can spontaneously fluctuate out of and back into the vacuum in a time shorter than h/2mc^2 ≈ (6 × 10^-27 erg-sec)/(2 × 10^-27 gm × 10^21 cm^2/sec^2), which is approximately 10^-21 sec, corresponding to a frequency of a billion trillion vibrations per second. The fundamental unit of quantum gravity time is shorter by a factor of about 10^23! So the classical spacetime continuum is still a very good model for quantum electrodynamics.
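The order-of-magnitude arithmetic can be checked directly (CGS values; the Planck time is taken as ≈ 5.4 × 10^-44 s):

```python
# Uncertainty-principle lifetime of a virtual electron-positron pair.
h   = 6.626e-27   # erg s, Planck's constant
m_e = 9.109e-28   # g, electron mass
c   = 2.998e10    # cm/s, speed of light

dt = h / (2 * m_e * c**2)   # pair lifetime, ~4e-21 s
t_planck = 5.4e-44          # s, Planck time (quantum gravity unit)

print(f"pair lifetime ~ {dt:.1e} s")
print(f"Planck time / pair lifetime ~ {t_planck / dt:.0e}")   # ~1e-23
```

The ratio of roughly 10^-23 is the factor cited in the commentary: the Planck time is that much shorter than the virtual pair's lifetime, so QED never probes the granularity.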

13. “... the contribution to the induced charge density that is proportional to the inducing density, with a logarithmically diverging coefficient, would result in an effective reduction of all densities by a constant factor which is not observable separately under ordinary circumstances.”

14. “In contrast with the divergences at infinitely high energies, another kind of divergent situation was encountered in calculating the total probability that a photon be emitted in a collision of a charged particle. ... it is certain that ‘zero’ frequency quanta be emitted ... the correct quantum description of a freely moving charged particle includes an electromagnetic field that accompanies the particle, as in the classical picture, ... the quantum treatment of the radiation process was inconsistent in its identification of the mass of the electron, when decoupled from the electromagnetic field, with the experimentally observed mass ... the electromagnetic coupling ... [generates] a field that accompanies the charge, and which reacts on it to produce an electromagnetic mass.”

15. “... delicate measurements disclosed that the electron possessed an intrinsic magnetic moment slightly greater than that predicted by the relativistic quantum field theory of a single particle, while another prediction of the latter theory concerning the degeneracy of states in the excited levels of hydrogen was contradicted by observing a separation of the states [i.e., Lamb shift].”

16. “The parameters of mass and charge associated with the electron in the formalism of electrodynamics are not the quantities measured under ordinary conditions. A free electron is accompanied by an electromagnetic field which effectively alters the inertia of the system, and an electromagnetic field is accompanied by a current of electron-positron pairs which effectively alters the strength of the field and of all charges. Hence a process of renormalization must be carried out, in which the initial parameters are eliminated in favor of those with immediate physical significance.”
Simple subtraction does not work relativistically because:

17. “the difference of two individually divergent terms is generally ambiguous. It was necessary to subject the conventional Hamiltonian electrodynamics to a transformation designed to introduce a proper description of single electron and photon states, so that the interactions among these particles would be characterized from the beginning by experimental parameters. ... to the first significant order of approximation in the electromagnetic coupling, the electron acquired new electrodynamic properties, which were completely finite. These included an energy displacement in an external magnetic field corresponding to an additional spin magnetic moment, and a displacement of energy levels in a Coulomb field. Both predictions were in good accord with experiment ...”
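The “additional spin magnetic moment” is Schwinger’s famous one-loop result a_e = α/2π, a one-line calculation:

```python
import math

alpha = 1 / 137.036          # fine-structure constant
a_e = alpha / (2 * math.pi)  # Schwinger's one-loop anomalous magnetic moment

print(f"a_e = (g-2)/2 ~ {a_e:.7f}")  # ~0.0011614
```

The measured value is ≈ 0.0011597; the small remaining gap is closed by higher-order terms in the perturbation series.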

18. “However, the Coulomb calculation disclosed a serious flaw; the additional spin interaction that appeared in an electrostatic field was not that from the relativistic transformation ... of the supplementary spin magnetic moment ...”

Renormalization must be consistent with both special relativity and gauge invariance. The divergent terms introduce ambiguities so that use of a particular choice of gauge could violate special relativity (i.e., violate “covariant features”) in doing perturbation theory.

19. “... the various divergent integrals could be rendered convergent while maintaining their ... covariant features ... by substituting, for the mass of the particle, a suitably weighted spectrum of masses, where all auxiliary masses eventually tend to infinity.”

Instead of infinity, use the Planck mass of 10^19 GeV. These are the superstring states. Quantum electrodynamics is the ultra-low energy limit. Was the spin-statistics connection violated in this technique for the auxiliary masses? This could be the exotic matter needed to make traversable wormholes and warp drives. On the contrary, Schwinger, at the time, did not think of these as real particles. He thought in terms of:

20. “an invariant proper time parameter. Divergences appear only when one integrates over this parameter, and gauge invariant, Lorentz invariant results are automatically guaranteed merely by reserving this integration to the end of the calculation.”

Schwinger mentions the Wheeler-Feynman alternative of eliminating the electromagnetic field as an independent dynamical structure and using only the delayed action-at-a-distance between the electrons. He says the two approaches are:

21. “equivalent ... the formal integration of the differential equations of one method [i.e., Schwinger’s] supplying the starting point of the other [i.e., Feynman’s].”

Neglecting bound states, Schwinger claims that the renormalized terms in the perturbation series for scattering are each finite. But one could not prove convergence of the series! In fact, Schwinger says it is an “asymptotic expansion”. This is still not a pretty picture, but it was pretty enough, however, for Feynman, Schwinger and Tomonaga to share the Nobel Prize because of the incredibly accurate agreement with experiment of the sum of only the first few terms in the series.
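An asymptotic expansion behaves like a toy series with terms ~ n! αⁿ: the terms shrink at first, so truncating after a few terms is extremely accurate, but beyond n ~ 1/α they grow without bound and the sum diverges. A sketch in logarithms (a toy model, not the actual QED series):

```python
import math

alpha = 1 / 137.0  # small coupling, QED-sized

def log_term(n):
    """log of term_n ~ n! * alpha**n in a toy divergent asymptotic series;
    working in logs avoids float overflow/underflow at large n."""
    return math.lgamma(n + 1) + n * math.log(alpha)

# Terms shrink until n ~ 1/alpha ~ 137, then grow without bound:
print(log_term(1) > log_term(2) > log_term(50))  # shrinking at first
print(log_term(300) > log_term(137))             # growing again later
```

This is why summing "only the first few terms" can agree spectacularly with experiment even though the full series does not converge.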

The action is special relativistically invariant. It is better than the Hamiltonian formulation, which is not manifestly invariant. For infinitesimal time intervals and simple systems, the classical action is identical to the phase of the transformation amplitude for the time evolution of the system. This was proved by Feynman in his Ph.D. dissertation, finishing what Dirac had started but left dangling. The action principle itself creates Bose-Einstein commutation rules and Fermi-Dirac anti-commutation rules. The former leads to robust Bose-Einstein condensates, one of which may be the physical substrate of the mind; the latter leads to the Pauli exclusion principle which makes matter stable.

22. “Furthermore, the connection between the statistics and spin of the particles is inferred from invariance requirements.”

This connection says that force particles of integer spin 0, 1, 2 ... are bosons able to condense into a superfluid in which a large number of quanta are in the same single-particle state under the right conditions. An example of such a Bose-Einstein state would be a coherent superposition of (a*k)^n|0> for the same k and different n, where n could go as high as 10^23. The Bose-Einstein condensate has a phase defined by the superposition of different numbers of particles in the single-particle state a*k|0>.
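The standard way to write such a phase-carrying superposition is the Glauber coherent state, a superposition over all particle numbers n in the single mode k (a textbook formula, not from the original text):

```latex
|\alpha\rangle \;=\; e^{-|\alpha|^{2}/2}\,
\sum_{n=0}^{\infty} \frac{\alpha^{n}}{n!}\,\bigl(a^{*}_{k}\bigr)^{n}\,|0\rangle ,
\qquad
a_{k}\,|\alpha\rangle = \alpha\,|\alpha\rangle ,
\qquad
\langle n \rangle = |\alpha|^{2} .
```

The phase of the complex number α is the condensate phase, and the mean occupation ⟨n⟩ = |α|² can indeed be of order 10^23.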

The spin-statistics connection also says that elementary source particles have half-odd-integer spin ½, and maybe 3/2, etc., and obey the Pauli exclusion principle of no more than one particle in the same single-particle state.

These propagators can be visualized in spacetime and applied to both bound states and scattering problems.

However, Schwinger says

24. “the observational basis of quantum electrodynamics is self-contradictory. The fundamental dynamical variables of the electron-positron field, for example, have meaning only as symbols of the localized creation and annihilation of charged particles, to which are ascribed a definite mass without reference to the electromagnetic field. Accordingly it should be possible, in principle, to confirm these properties by measurements, which, if they are to be uninfluenced by the coupling of the particles must be done instantaneously ... they can never be disengaged to give those properties immediate physical significance. ... a convergent theory cannot be formulated consistently within the framework of present spacetime concepts.”