A different way of looking at the Lord's world

Didja Know 12-6-2016

Many physicists spend their days trying to prove Albert Einstein’s theories correct. One pair of theoretical physicists is hoping to test whether the father of modern physics just may have been wrong about the speed of light.

In his theory of special relativity, Einstein left a lot of wiggle room for the bending of space and time. But his calculations, and most subsequent breakthroughs in modern physics, rely on the notion that the speed of light has always been a constant 186,000 miles per second.
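As a quick aside of my own (not from the article): the speed of light is defined in SI units as exactly 299,792,458 meters per second, and the 186,000 miles-per-second figure is just that value rounded down. A two-line sanity check of the conversion:

```python
# Convert the defined speed of light from meters per second to miles per second.
c_m_per_s = 299_792_458        # exact SI definition of c
meters_per_mile = 1_609.344    # international mile in meters

c_mi_per_s = c_m_per_s / meters_per_mile
print(f"{c_mi_per_s:,.0f} miles per second")  # about 186,282 mi/s
```

So the article's round number of 186,000 undershoots by roughly 282 miles per second.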

But, what if it wasn’t always that way? In a paper published in the November issue of the journal Physical Review D, physicists from the Imperial College London and Canada’s Perimeter Institute argue that the speed of light could have been much faster in the immediate aftermath of the Big Bang. The theory, which could change the very foundation of modern physics, is expected to be tested empirically for the first time. (could change, get it, could, not sure but if I’m right it could)

“The idea that the speed of light could be variable was radical when first proposed, but with a numerical prediction, it becomes something physicists can actually test,” lead author João Magueijo, a theoretical physicist at Imperial College London, said in a statement. “If true, it would mean that the laws of nature were not always the same as they are today.” (just a numerical prediction you see, not proof, but if my equations match on both sides of the “=” sign, then I am predicting that it is true – not proving it, just predicting)

The theory of variable speed of light (VSL) was proposed by Dr. Magueijo two decades ago as an alternative to the more popular “inflation theory” – both offer possible solutions to the same fundamental problem.

Most cosmological theories state that the early universe was inconsistent in density – lumpy, if you will – as it expanded after the Big Bang. The modern universe, by comparison, is thought to be relatively homogeneous. For that to be possible, light particles would have to spread out to the edge of the universe and “even out” the energy lumps. But if the speed of light was always constant, it would never have been able to catch up with the expanding universe. (the universe is thought to be homogeneous; we don’t know for sure, and we have created theories with lots of imaginative ‘fudge factors’ built into them that prove whatever we are happy with)

Inflation theory, which suggests that the universe expanded rapidly before slowing down, provides one potential answer to the dilemma. The early universe could have evened out just before expanding, physicists say, if special conditions were present at the time. (those are the special conditions, aka ‘fudge factors,’ that make both sides of the equation match)

In 2003, Lori Valigra reported for The Christian Science Monitor:

But inflation, proposed by MIT physicist Alan Guth in the late 1970s, was never widely adopted by the British theoretical physics community. And Magueijo claims that as an answer to various “cosmological problems … inflation had won by default.” This propelled him to think about another solution. (not widely adopted by the British. Do you think the Brits ever agree with anyone?)

VSL offers a different inconstant: the speed of light. According to Magueijo and colleagues, the speed of light could have been much faster in the early moments of cosmological time. Fast enough, they say, to reach the distant reaches of the universe before slowing to the current rate. (what would have caused it to go faster, what are the possible consequences of something going faster than the speed of light [as we know it now], what would have caused it to slow down, and when would that have happened? Gee, no answers yet?)

Now, researchers hope to prove that theory by studying the cosmic microwave background (CMB). Physicists have long used this radiative “afterglow” to glean new insights about the early universe. And since cosmic structures leave imprints on the CMB as they fluctuate in density, scientists may someday be able to produce a “spectral index” of the universe. (we are still learning what the “afterglow” means and trying to understand it, and we are a long way from it. After 20 years and $3 billion, the James Webb space telescope is soon to be launched and will make the Hubble look like its second cousin)

If VSL theory is correct – if the speed of light really was faster after the Big Bang – the spectral index should come in at exactly 0.96478. That’s not too far off from current estimates, Magueijo says. (so if his theory is correct, then the spectral index should be just about the same as it is now estimated – his predicted 0.96478. Not too terribly different, but what do I know?)
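To see how “not too far off” the prediction is, here is a small check of my own (not from the article). The paper quotes a prediction of 0.96478 with an uncertainty of 0.00064; as an assumed reference point I use the Planck 2015 estimate of the spectral index, roughly 0.9655 ± 0.0062:

```python
# Compare the VSL prediction for the spectral index n_s against a measured value.
# The Planck 2015 figure below is an assumption used for illustration.
pred, pred_err = 0.96478, 0.00064   # prediction quoted in the paper's abstract
meas, meas_err = 0.9655, 0.0062     # assumed Planck 2015 estimate

diff = abs(pred - meas)
combined_err = (pred_err**2 + meas_err**2) ** 0.5
sigma = diff / combined_err
print(f"prediction differs from the estimate by {sigma:.2f} standard deviations")
```

By this rough reckoning the prediction sits well inside the current error bars, which is exactly why a sharper future measurement could confirm or kill the theory.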

“The theory, which we first proposed in the late-1990s, has now reached a maturity point – it has produced a testable prediction,” Magueijo said. “If observations in the near future do find this number to be accurate, it could lead to a modification of Einstein’s theory of gravity.”

Physicists have proposed a new experiment (https://arxiv.org/pdf/1603.03312v2.pdf) to test their theory that Einstein was wrong about the speed of light being a constant, the foundation on which much of modern physics is based. (So how do you propose to test this? You write the following paper, which you can read in pdf form)

We explore the space of scalar-tensor theories containing two non-conformal metrics, and find a discontinuity pointing to a “critical” cosmological solution. Due to the different maximal speeds of propagation for matter and gravity, the cosmological fluctuations start off inside the horizon even without inflation, and will more naturally have a thermal origin (since there is never vacuum domination). The critical model makes an unambiguous, non-tuned prediction for the spectral index of the scalar fluctuations: nS = 0.96478(64). Considering also that no gravitational waves are produced, we have unveiled the most predictive model on offer. The model has a simple geometrical interpretation as a probe 3-brane embedded in an EAdS2 × E3 geometry.

(That was the abstract; the details follow, and you can click on the link above to read it yourself)

(I will be the first to admit I am not an expert in calculus or physics. However, I do have more than a working understanding of both subjects. I had the delightful experience of working with Dr. John Strand, PhD in astrophysics, on a global positioning system for an oil well company. He worked on the Apollo missions – in fact, he was the one who developed the theory of slingshotting the capsule around the moon to gain enough speed to return to Earth, saving fuel and providing more oxygen for the astronauts. Without his theory, they would have just gone into never-never land [whoops, space]. Check out John’s book, “Pathways to the Planets: Memoirs of an Astrophysicist” by John R. Strand. It is only available as an e-book. It is a remarkable read.

It took me about 8 hours to work through the entire math, about 3 of them to look up the values of the common variables. Using the ‘common constants’ I ended up with Magueijo’s value. Using his values for the variables, I got his value also. This is somewhat disconcerting; I should have gotten a different value using the ‘common constants’.

On top of this, the entire article is his theory; it gives no indication of an actual, provable test to be performed. It can’t be done yet. He can use the values that the James Webb telescope will provide, once it has been launched, calibrated, tested, and allowed to gather information – a number of years from now.

It seems to be a problem with scientists these days: announcing things before they have proof of them. To me it puts a dent in scientific research, for why should I investigate something if, by the time I have the facts, somebody else has already taken credit for it – albeit a little too soon, guessing instead of proving. LEM)