An analysis of the Wolfenstein parametrization for the Kobayashi-Maskawa matrix shows that it has a serious flaw: it depends on three independent parameters instead of four, as it should. Because this approximation is currently used in phenomenological analyses of the quark sector, the reliability of almost all phenomenological results is called into question. One example is the latest PDG fit from \cite{CA}, p. 150. The parametrization cannot be fixed, since even when it is brought to an exact form it has the same flaw, and its use leads to many inconsistencies.

The Dita paper claims that the Wolfenstein parameterization is defective because its apparent four real degrees of freedom are redundant; instead there are only three. Such a defect would prevent the parameterization from exploring “almost all” of the space of possible 3×3 unitary matrices. Instead of the whole 4-dimensional real manifold of 3×3 unitary matrices (up to multiplication of rows and columns by complex phases), one would obtain only a 3-dimensional submanifold.
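To make the claim concrete, here is a minimal numerical sketch of the standard Wolfenstein form to O(λ³), showing its four real inputs (λ, A, ρ, η); the particular values below are merely roughly PDG-sized, chosen for illustration. Since the parameterization is an approximation, the product V V† departs from the identity at O(λ⁴):

```python
import numpy as np

def wolfenstein(lam, A, rho, eta):
    """CKM matrix in the Wolfenstein approximation, kept to O(lambda^3)."""
    return np.array([
        [1 - lam**2/2,                lam,           A*lam**3*(rho - 1j*eta)],
        [-lam,                        1 - lam**2/2,  A*lam**2],
        [A*lam**3*(1 - rho - 1j*eta), -A*lam**2,     1],
    ])

# Four independent real parameters: lambda, A, rho, eta (illustrative values).
V = wolfenstein(0.225, 0.82, 0.14, 0.35)

# Unitarity holds only approximately; the deviation is of order lambda^4.
dev = np.abs(V @ V.conj().T - np.eye(3)).max()
print(dev)
```

The point at issue is whether these four inputs really move the matrix through a four-dimensional family, or whether one combination is redundant.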

In particular, the paper claims that it is impossible to use the Wolfenstein parameterization to obtain a unitary 3×3 matrix with the magnitudes of all amplitudes the same (and equal to sqrt(1/3)). This is the “democratic unitary 3×3 matrix”, a subject Marni Sheppeard and I have explored at length. It took me a few minutes to verify that it is possible to set these parameters (lambda, A, rho, and eta) to obtain a unitary matrix with all magnitudes equal.
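For reference, one concrete example of such a democratic matrix (not the specific parameter values alluded to above, which are not reproduced here) is the 3×3 discrete Fourier matrix: every entry has magnitude 1/√3 and the matrix is exactly unitary. A quick numerical check:

```python
import numpy as np

# The 3x3 discrete Fourier matrix: a "democratic" unitary matrix whose
# entries all have magnitude 1/sqrt(3).
w = np.exp(2j * np.pi / 3)
F = np.array([[w**(j*k) for k in range(3)] for j in range(3)]) / np.sqrt(3)

print(np.allclose(F @ F.conj().T, np.eye(3)))   # True: exactly unitary
print(np.allclose(np.abs(F), 1/np.sqrt(3)))     # True: all magnitudes equal
```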

I’m kind of surprised by this, given that the paper proposes a new theory of gravity. I was expecting to have that portion excised.

And to help make a week more perfect, my paper for Foundations of Physics, titled Spin Path Integrals and Generations, got a good review along with a nasty one (and much good advice from both), and the editor has asked for me to revise the manuscript and resubmit. So I suppose this paper will also eventually be published. I’m a little over half finished with the rewrite. This paper is, if anything, even more radical than the gravity paper.

I’m releasing two papers that relate Heisenberg’s uncertainty principle, spin-1/2, the generations of elementary fermions, their masses and mixing matrices, and their weak quantum numbers. I haven’t blogged anything about these because I’ve been so busy writing, but I should give a quick introduction to them.

Heisenberg’s uncertainty principle states that certain pairs of physical observables (i.e. things that physicists can measure) cannot both be known exactly. The usual example is position and momentum. If you measure position accurately, then, by the uncertainty principle, the momentum will go all to Hell. That means that if you measure the position again, you’re likely to get a totally different result. Spin (or angular momentum), on the other hand, acts completely differently. If you measure the spin of a particle twice along the same axis, you’re guaranteed that the second measurement will be the same as the first. It takes some time to learn quantum mechanics, and by the time you know enough of it to question why spin and position act so differently, you’ve become accustomed to these differences and it doesn’t bother you very much.
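The repeatability of spin measurements is just the projection postulate at work, and is easy to verify numerically. A minimal sketch (the state 0.6|up⟩ + 0.8|down⟩ is an arbitrary illustrative choice):

```python
import numpy as np

# Measuring spin along z projects the state onto an eigenvector of sigma_z,
# so an immediate second measurement repeats the first with probability 1.
up = np.array([1.0, 0.0], dtype=complex)     # +1 eigenvector of sigma_z

psi = np.array([0.6, 0.8], dtype=complex)    # arbitrary normalized state
p_up_first = abs(up.conj() @ psi)**2         # Born rule: probability 0.36
post = up                                    # state collapses to |up>
p_up_second = abs(up.conj() @ post)**2       # probability 1: guaranteed repeat
print(p_up_first, p_up_second)
```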

If you want to figure out where an electron goes between two consecutive measurements the modern method is to use Feynman’s path integrals. The idea is to consider all possible paths the particle could take to get from point A to point B. The amplitude for the particle is obtained by computing amplitudes for each of those paths and adding them up. The mathematical details are difficult and are typically the subject of first year graduate classes in physics. Spin, on the other hand, couldn’t be simpler. Spin-1/2 amounts to the simplest possible case for a quantum system that exhibits something like angular momentum.
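The sum-over-paths idea can be illustrated with a toy model that sidesteps the hard continuum mathematics: a particle hopping on a 1D lattice, with an arbitrary illustrative amplitude per hop. The total amplitude to go from site 0 to site x is the sum of the per-path amplitudes over every path that ends at x:

```python
import itertools
from math import comb
import numpy as np

# Toy "sum over paths": each step hops +1 or -1 with amplitude a per hop.
N, x = 6, 2
a = np.exp(1j * 0.3)      # illustrative single-step amplitude

# Enumerate all 2^N paths; keep those ending at x, and add their amplitudes.
total = sum(a**N for steps in itertools.product([-1, 1], repeat=N)
            if sum(steps) == x)

# Here every contributing path has the same amplitude a^N, so the sum is
# (number of paths) * a^N = C(N, (N+x)/2) * a^N.
print(np.isclose(total, comb(N, (N + x)//2) * a**N))   # True
```

In a real path integral the amplitude of each path depends on its action, so the paths interfere rather than simply count, but the bookkeeping is the same: enumerate paths, assign each an amplitude, and add.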

Previously, Louise had explained the anomaly in a manner that I was too obtuse to understand, for example:

Views of the Cosmic Microwave Background may also indicate a spherical Universe. By measuring distances between acoustic peaks, scientists hope to complete a triangle and determine curvature. When a changing speed of light is accounted for, the angles do not add up to 180 degrees and the triangle is not flat. Most telling, the scale of density fluctuations is nearly zero for angles greater than 60 degrees. Like a ship disappearing over Earth’s horizon, the lack of large-angle fluctuations is smoking-gun evidence that the Universe is curved. Both lines of CMB data indicate that the curvature has radius R = ct.

Lubos Motl brings to our attention a paper by Ted Jacobson and Aron C. Wall on black hole thermodynamics and Lorentz invariance, arXiv:0804.2720, and claims that theories that violate Lorentz invariance are ruled out because they will also violate the second law of thermodynamics, the law that requires that entropy never decreases. Lubos concludes, “At any rate, this is another example showing that the ‘anything goes’ approach does not apply to quantum gravity and if someone rapes some basic principles such as the Lorentz symmetry or any other law that is implied by string theory, she will likely end up not only with an uninteresting, ugly, and unmotivated theory but with an inconsistent theory.” I disagree with this.

First, the abstract of the article:

Recent developments point to a breakdown in the generalized second law of thermodynamics for theories with Lorentz symmetry violation. It appears possible to construct a perpetual motion machine of the second kind in such theories, using a black hole to catalyze the conversion of heat to work. Here we describe the arguments leading to that conclusion. We suggest the implication that Lorentz symmetry should be viewed as an emergent property of the macroscopic world, required by the second law of black hole thermodynamics.

From the abstract, we see that Lubos has put the cart before the horse. Rather than proving that Lorentz symmetry has to be exact “all the way down”, the authors instead say that Lorentz symmetry does not have to be present at the foundations of elementary particles because it will automatically emerge macroscopically as a result of requiring that the second law of thermodynamics apply to black holes. And I agree wholeheartedly with this.

The equations of physics are derived from general theories. The odd situation of the moment is that while the equations are quite well supported by experiment, the theories behind them are not. One would logically conclude that the theories are as well supported as the equations, but this is not the case. The equations, or laws, themselves are very clear; their support by experiment is undeniable; it is in the interpretation of the equations that one finds difficulty.

This post arises from my reading a physics blog recently which mentioned that it seemed that in the 19th century, new physics ideas appeared in the form of “laws” while in the 20th century they were called “theories.” I think that the difference is not just a matter of terminology, but instead that theories and laws are not at all the same sort of thing.

When an experimenter takes data, the data can be arranged in various ways. If we are able to describe the data by fitting an equation to them, then I will call that a “law.” For example, Maxwell’s equations are laws. Given measurements for electric field, magnetic field, charge, velocity, etc., one can compute various things. This is more than curve fitting, but it is quite a bit less than theorizing. Theories are more general than laws. One theory can be used to define any number of laws. For example, the theory of quantum mechanics can be used to derive many different equations.
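A law in this sense, an equation fitted to measurements, can be sketched in a few lines. Here is a hypothetical Ohm's-law-style example with synthetic data (the "true" resistance of 5.0 and the noise level are invented for illustration):

```python
import numpy as np

# A "law": an equation fitted to measured data.
# Synthetic Ohm's-law data, V = I*R with a little measurement noise.
rng = np.random.default_rng(0)
current = np.linspace(0.1, 1.0, 20)          # amperes
voltage = 5.0 * current + rng.normal(0, 0.01, 20)   # volts

# Fit a straight line; the slope is the empirical "law's" resistance.
R_fit, intercept = np.polyfit(current, voltage, 1)
print(R_fit)   # close to the "true" resistance of 5.0 ohms
```

The fit summarizes the data; a theory would be what explains why the relation is linear in the first place.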

One finds a fairly diverse collection of characters hanging around the Crossroads Mall Chess Club (which I sometimes inaccurately refer to as the “Overlake Mall Chess Club”). Mostly it’s men who love chess, or who are retired or otherwise have too much time on their hands. In my case, it’s a love of watching others play chess. And one meets people there and one gets to know them. And they find out about one’s other hobbies, in my case physics, and they talk about their own.

In the case of Forrest LeDuc, his other hobby is divination. His regular employment is in the gold fields of northern Idaho. Divination has undoubtedly been a central part of mining since before man knew how to smelt metals. I suppose that Neanderthals used divination to find flints, as well as game, other tribes, etc. Divination (or dowsing) is not taught in mining engineering, but the students, at least when I was a student 30 years ago, were exposed to divination by the miners when they worked summers in the mines. Despite centuries of suppression by the combined forces of the church and science, divining or dowsing is still in use. See the recent Mother Earth News article for a description.