Do nuclear decay rates depend on our distance from the sun?

We think that the decay rates of elements are constant regardless of the ambient conditions (except in a few special cases where beta decay can be influenced by powerful electric fields).

So that makes it hard to explain the curious periodic variations in the decay rates of silicon-32 and radium-226 observed by groups at the Brookhaven National Laboratory in the US and at the Physikalisch-Technische Bundesanstalt in Germany in the 1980s.

Today, the story gets even more puzzling. Jere Jenkins and pals at Purdue University in Indiana have re-analysed the raw data from these experiments and say that the modulations are synchronised with each other and with Earth’s distance from the sun. (Both groups, in acts of selfless dedication, measured the decay rates of silicon-32 and radium-226 over a period of many years.)

In other words, there appears to be an annual variation in the decay rates of these elements.

Jenkins and co put forward two theories to explain why this might be happening.

First, they say a theory developed by John Barrow at the University of Cambridge in the UK and Douglas Shaw at the University of London suggests that the sun produces a field that changes the value of the fine structure constant on Earth as the planet’s distance from the sun varies during each orbit. Such an effect would certainly cause the kind of annual variation in decay rates that Jenkins and co highlight.

Another idea is that the effect is caused by some kind of interaction with the neutrino flux from the sun’s interior, which could be tested by carrying out the measurements close to a nuclear reactor (which would generate its own powerful neutrino flux).

It turns out that the notion that nuclear decay rates are constant has been under attack for some time. In 2006, Jenkins says, the decay rate of manganese-54 in their lab decreased dramatically during a solar flare on 13 December.

And numerous groups disagree over the decay rates of elements such as titanium-44, silicon-32 and cesium-137. Perhaps they took their data at different times of the year.

Keep ’em peeled because we could hear more about this. Interesting stuff.

I don’t see a similar problem with the German data, but it certainly seems the Brookhaven measurements are the result of systematic or instrumental error, which leaves the German measurement uncorroborated.

Alsee, unfortunately the data prove you wrong. Mt. St. Helens created a 25-foot-thick layer of sediment in a few hours that had thousands of layers in it. These layers are there for all to see, as they were later exposed by a river that cut quickly through the deposits to form a canyon. These layers exist as facts for all to observe and interpret. It is not the facts we disagree on, it is the interpretation of the facts on which we disagree. We all observe layers in rocks. Some interpret them as very old, some as very young. Our religious agenda drives our interpretation.

You write with a religious fervor that springs from your religious beliefs. Yes, we all have religious beliefs. Please realize that you too have them.

Rebel dreams, I understand that the effect of gravity on a body can be integrated over time to calculate an eventual position of a body given its initial conditions. The force of gravity is of course assumed constant. Likewise, the effect of the whatever-it-is force is integrated over the course of time to produce a long term decay rate. We assume that the force of whatever-it-is is constant over time just like gravity. Why do we make this assumption? We don’t even know what the whatever-it-is force is. We have no reason to assume it’s constant, we just want to. And if it’s not, then it could have been stronger (or weaker) in the past, which then gives us a basis “to use this result to cast doubt on radioisotopic dating”.

Because the data are statistical, this looks like a pretty good inverse-square distance relation, exactly what one would expect from variations in radiation flux from the Sun. Radiation from the sun could be any of neutrinos, gamma rays, or other types of radiation. This is a problem that has been around for a long time, “random decay”. Nuclear decay being a “random event” without cause is not a valid assumption from a Baconian point of view. I think a few more cycles should be used. Further, the smoothing from the overlay of several cycles would improve the statistics.
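For what it’s worth, the size of an inverse-square effect is easy to sketch: Earth’s orbital eccentricity (about 0.0167) means the distance to the Sun swings roughly ±1.7% over the year, so any flux falling off as 1/r² swings by roughly twice that. A back-of-envelope check in a few lines (the perihelion and aphelion distances are textbook values, not from the experiments discussed here):

```python
# Earth's orbital distance at perihelion and aphelion, in AU
# (eccentricity ~0.0167; textbook values, used only for scale).
perihelion = 0.9833
aphelion = 1.0167

# Flux from the Sun falls off as 1/r^2, so the peak-to-peak
# fractional flux variation over a year is:
flux_peri = 1.0 / perihelion**2
flux_aphe = 1.0 / aphelion**2
variation = (flux_peri - flux_aphe) / ((flux_peri + flux_aphe) / 2)

print(f"peak-to-peak flux variation: {variation:.1%}")  # about 6.7%
```

So a solar-flux mechanism has several percent of annual flux modulation to work with, while the decay-rate wobble reported is well under one percent.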

Peter Kirk noted “If the distance from the source to the detector increases by 0.1% in summer relative to winter, the detection rate drops by 0.2%, at least for some geometries.” Were wooden supports used in either experiment? Wood moves substantially with changes in humidity: 0.1% along the grain, 2-5% across the grain, and about 1% for plywood. This might account for the phase difference too – the seasonal temperature minimum is delayed with respect to solstice or perihelion.
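Peter Kirk’s number follows directly from inverse-square geometry: for a point source and small detector, the count rate scales as 1/d², so a small fractional distance change produces roughly twice that fractional rate change (in the opposite direction). A minimal sketch, assuming that simple point-source geometry:

```python
# For a point source and a small detector, count rate ~ 1/d^2,
# so a fractional distance change x gives a rate change of ~ -2x.
def rate_change(distance_change_fraction):
    """Fractional change in count rate for a small fractional
    change in source-detector distance (inverse-square geometry)."""
    return (1.0 + distance_change_fraction) ** -2 - 1.0

# A 0.1% increase in distance (e.g. seasonal wood movement):
print(f"{rate_change(0.001):+.2%}")  # about -0.20%
```

Which is exactly the 0.1% → 0.2% relationship quoted, and is comfortably in the range that humidity-driven movement of supports could produce.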

So… can anyone supply a reference that states what type of measurement device is used, whether they have some other long-lived sources that they count with it, and how they measure the background and subtract it… and, more importantly, what the data look like 10 years earlier… this is a very small sample from which to make any statement whatsoever about variation in a 1600-year half-life… especially that the decay rate is correlated with distance from the sun…
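The scale of the problem is easy to see: with a 1600-year half-life (the textbook value for radium-226), the activity itself only falls by a few hundredths of a percent per year, so the claimed annual wobble has to be separated from a secular decline of comparable size. A two-line estimate:

```python
import math

half_life_years = 1600.0  # radium-226, textbook value
lam = math.log(2) / half_life_years  # decay constant, per year

# Fractional drop in activity over one year:
annual_drop = 1.0 - math.exp(-lam)
print(f"annual decline in activity: {annual_drop:.3%}")  # about 0.043%
```

Detrending a ~0.04%/year decline cleanly enough to resolve a sub-percent seasonal modulation is exactly the kind of thing that makes instrument drift a live concern.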


Marshall’s suggestions for finding evidence in existing systems, like space-probe RTGs and supernovae, to verify this are excellent. Other radioactive decay experiments have claimed a similar effect might exist, for example the Troitsk neutrino experiment (tritium decay). I reviewed all the tritium decay data I could find and have not been able to see any effect. Jenkins and Fischbach’s observation is amazing. I look forward to seeing how this turns out.

I’m not sure I understand the claims about time dilation. It’s a fair bet that the sample they were measuring and the clock they were using to compute the decay rates were operating in the same frame of reference — that is, not moving with respect to each other, and not at significantly different distances from the Earth or the Sun. From the standpoint of relativity, there should be no time dilation.
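And even if one worried about the lab’s motion relative to the Sun, the numbers kill the idea: the fractional time dilation for orbital speeds goes as v²/2c², and the seasonal difference between perihelion and aphelion speeds is around 3 parts in 10¹⁰. A quick sketch, using textbook values for Earth’s orbital speeds (not figures from the experiments):

```python
# Special-relativistic time dilation for v << c: dt/t ~ v^2 / (2 c^2).
c = 299_792.458       # speed of light, km/s
v_perihelion = 30.29  # Earth's orbital speed at perihelion, km/s
v_aphelion = 29.29    # Earth's orbital speed at aphelion, km/s

def dilation(v):
    """Fractional time dilation for speed v (low-velocity approximation)."""
    return v**2 / (2 * c**2)

seasonal = dilation(v_perihelion) - dilation(v_aphelion)
print(f"seasonal dilation difference: {seasonal:.1e}")  # ~3.3e-10
```

That is six or seven orders of magnitude smaller than the reported decay-rate modulation, so time dilation is a non-starter either way.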

I’m leaning toward seasonal temperature variation affecting the instruments too, especially with that time lag. Have any measurements been done in Australia?

This sounds like it could be the Zero Point Field of QM at work. From the work already being done, it seems that ZPF may have a lot of surprises up its sleeve, that turn our usual understanding of space and time somewhat inside out. You don’t need to swallow the YEC package to know about perfectly valid anomalies that defy the prevailing scientific picture.

Now if the Sun was different in the past, the annual isotope decay rates might have been different overall…


Over the last 60 years or so, hundreds of thousands of 14C dates have been collected on a wide range of materials from a wide range of environments. The science is pretty well understood. The dating has been cross-validated against other methods, including cosmogenic isotope dating and optically stimulated luminescence. There are issues with all geological dating methods, including 14C, but these are well known inside the trade and do not invalidate the method. One just needs to exercise an appropriate level of caution. Note that it is well known that changes in the cosmic ray flux have a big impact on the rate of production of 14C in the atmosphere.
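The core age computation behind 14C dating is simple enough to sketch in a few lines (using the conventional 5730-year half-life, and ignoring the calibration against the varying atmospheric 14C/12C ratio that real dating labs apply):

```python
import math

HALF_LIFE_C14 = 5730.0  # years, conventional value
LAMBDA = math.log(2) / HALF_LIFE_C14  # decay constant, per year

def c14_age(remaining_fraction):
    """Uncalibrated age in years from the fraction of original 14C
    remaining, assuming a constant decay rate."""
    return -math.log(remaining_fraction) / LAMBDA

print(f"{c14_age(0.5):.0f}")   # 5730  (one half-life)
print(f"{c14_age(0.25):.0f}")  # 11460 (two half-lives)
```

The cosmic-ray point matters precisely because it affects the *production* side (how much 14C a sample started with), which is what the calibration curves correct for; it doesn’t touch the decay constant in the exponent.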

Measurement effects, such as those induced by temperature or extraneous radiation, certainly sound like the first line of enquiry.

If the anomaly is real, which seems unlikely, it’d be worth studying the phase carefully: the correlation is really with Earth’s position in its orbit. That can appear (subject to phase) well correlated with distance from the Sun (and hence with the flux of anything radiated by the Sun); but it is equally well correlated (subject to phase) with speed relative to:
* the cosmic microwave background
* the flux of anything (else) emanating from outside the solar system, e.g. gamma radiation from the galactic core (or the Perseus cluster),
* the presumed galactic dark matter distribution in our neighbourhood,
and doubtless many other fun things.

On the time dilation point: while they might have used a local clock to measure time, it’s eminently possible they used the network time protocol, thereby effectively using a global network of clocks not in their lab. Not that this materially changes the “in the same inertial frame as their lab” constraint: the deviation from “sameness” of inertial frames involved is far tinier than their experimental result.

As for the more fun wacky speculations, the other one to play with is the rather wide error bar on Newton’s constant, G. I’ve seen some reports claiming a correlation between the measured value of G and the latitude of the laboratory doing the measuring! Perhaps someone should check it for seasonal variation, too…

Impact on radiocarbon (and kindred) dating might be more than the 0.3% observed here if the mechanism is related to some effect radiated by the Sun (neutrino, gamma, yadda): the Sun’s production of these things has (more or less certainly) varied over the course of its lifetime, so the historical variations may be larger than the recently observed 0.3%.
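Even so, the impact on dates is modest: if the decay constant had been uniformly larger by some small fraction, the inferred age would be smaller by roughly the same fraction. A quick sanity check (a sketch for a sustained, uniform rate change; a time-varying rate would need the full integral):

```python
def age_shift(inferred_age, rate_change_fraction):
    """Shift in true age, in years, if the decay constant had been
    uniformly larger by rate_change_fraction than assumed.
    True age = inferred age / (1 + fraction)."""
    return inferred_age / (1.0 + rate_change_fraction) - inferred_age

# A sustained 0.3% enhancement of the decay rate shifts a
# 10,000-year radiocarbon age by only about:
print(f"{age_shift(10_000, 0.003):.0f} years")  # about -30 years
```

So a 0.3%-level effect moves a ten-millennium date by decades, not millennia, which is the quantitative version of the point below about YEC-scale revisions being out of reach.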

Similar may be said of at least some of the other conjectured mechanisms; over the last few giga-years, the solar system has done a good few laps of the galaxy – during which, speed relative to the cosmic microwave background varies by a respectable fraction, for example; and ambient dark matter density fluctuations may be significant.

Not that I can see such variations impacting estimates of the age of our home planet by even so much as a factor of 2 – let alone the factor of order a million that the YECists would need (and, besides, if the Earth is only a few kyr old, the Sun *hasn’t* had long enough for major variations in rates of radiation, nor has it made more than a tiny fraction of an orbit of the galaxy; so the YECists would have to accept an old Earth to get this effect to provide any support for even a little doubt about carbon dating).

I likewise doubt you can get enough impact on the decay rates of nuclear waste to be of material use – for which you’d need a factor significantly above 2 – at least not without running a major source of neutrinos or gammas (to drive the enhanced decay), which would seem likely to also produce its own share of nuclear waste, of which you’d then need to dispose …

Other than the really weird fantasy world your brain seems to exist in, you seem like a really nice boy. Why don’t you go off and play with those other nice boys and leave us to our medieval witchcraft experiments?

To me, the relationship of nuclear decay rates and the distance from the sun is evidence of the local effects of space/time inflation. We are “blind” to this inflation because we are tied to the earth and bound by its space/time inflation. It is only when we leave the earth that we can see that space/time inflation is not constant but varies relative to speed and gravity. Now we have proof that we are also tied to the Solar System’s space/time inflation, and I expect that over a great deal of time we may discover that we are also tied to the space/time inflation of our galaxy… if we should live long enough for the experiment to run its course.