New Doubts About Half-Life Dating


The most widely used tool to measure the age of the Earth is radioactive decay. The great scientist Ernest Rutherford was the first to define the concept of “half-life,” that is, the time it takes for one half of the atoms in a given quantity of a radioactive element (such as plutonium) to decay into another element (such as uranium), or for an unstable isotope (such as carbon-14) to decay into an isotope of a different element (in this case, nitrogen-14).
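Half-life can be expressed as a simple exponential law. The short Python sketch below illustrates the standard textbook model; the carbon-14 half-life of roughly 5,730 years is a commonly cited value, not a figure from this article.

```python
def fraction_remaining(t_years, half_life_years):
    """Fraction of a radioactive sample left after t_years,
    assuming the decay rate is constant (the standard model)."""
    return 0.5 ** (t_years / half_life_years)

C14_HALF_LIFE = 5730  # years, a commonly cited value for carbon-14

print(fraction_remaining(5730, C14_HALF_LIFE))   # one half-life  -> 0.5
print(fraction_remaining(11460, C14_HALF_LIFE))  # two half-lives -> 0.25
```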

Moreover, Rutherford and all scientists since him have declared that the radioactive decay of a given element or isotope occurs “at a specific, universal, immutable rate” (Castelvecchi 2008: 21). Based on this assumption, scientists use the decay rate of certain substances to date the age of rock formations, fossils, and the Earth itself.
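The dating procedure described above can be sketched in a few lines. This is an illustrative simplification of the textbook method, assuming a constant decay rate and no daughter product present at the start; the function name is mine, not from any source.

```python
import math

def radiometric_age(parent, daughter, half_life_years):
    """Age implied by a measured parent/daughter ratio, under the
    textbook assumptions of a constant decay rate and no initial
    daughter product in the sample."""
    return half_life_years * math.log2(1 + daughter / parent)

# Equal amounts of parent and daughter imply exactly one half-life:
print(radiometric_age(1.0, 1.0, 5730))  # -> 5730.0
# Three parts daughter to one part parent imply two half-lives:
print(radiometric_age(1.0, 3.0, 5730))  # -> 11460.0
```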

However, this assumption has recently come under doubt. The November 22, 2008, issue of the journal Science News reported that, “when researchers suggested in August [2008] that the sun causes variations in the decay rates of isotopes of silicon, chlorine, radium and manganese, the physics community reacted with curiosity, but mostly with skepticism” (Ibid.).

Despite this skepticism, there is evidence that the effect is real. For example, a team at Purdue University in Indiana was monitoring a lump of manganese-54 in a radiation-detector box to measure the isotope’s half-life. At 9:37 PM on December 12, 2006, the instruments recorded a sudden dip in radioactivity. At that same moment, satellites on the other side of the Earth (the daylight side) detected X-rays coming from the sun, signaling the beginning of a solar flare (Ibid.).

This was not the only evidence for such a change in the radioactive decay rate. As far back as the 1980s, a study of silicon-32 at the Brookhaven National Laboratory in New York State, and another study of radium-226 at the PTB, a scientific institute in Germany, made similar findings. Both studies were long-term, and, according to Science News, “both had seen seasonal variations of a few tenths of a percent in the decay rates of the respective isotopes” (Ibid.). The journal went on to point out:

A change of less than a percent may not sound like a lot. But if the change is real, rather than an anomaly in the detector, it would challenge the entire concept of half-life and even force physicists to rewrite their nuclear physics textbooks (Ibid.).

Because the decay rates in the two studies from the 1980s were altered by the seasons, physicists suspect that the sun was affecting the rates of decay, “possibly through some physical mechanism that had never before been observed” (Ibid.). The Brookhaven study, for example, which lasted from 1982 to 1986, showed that samples of silicon-32 and chlorine-36 “had rates of decay that varied with the seasons, by about 0.3 percent” (Ibid. 22). Science News went on to report:

The samples were kept at constant temperature and humidity, so the changing seasons should have had no effect on the experiment. The team tried all the fixes it could to get rid of the fluctuations, but, in the end, decided to publish the results (Ibid.).

The results were ignored by the scientific community. “People just sort of forgot about it, I guess,” commented David Alburger, the Brookhaven scientist who had conducted the experiment (Ibid.). Alburger was unaware that, at the same time, the German scientists at the PTB had found the same thing, with “yearly oscillations in a decay rate, in a 15-year experiment with radium-226” (Ibid.). Again, the finding made no splash in the scientific community.

Such small fluctuations in the rate of radioactive decay may not seem like much, but, as Science News noted, they are great enough to force physicists to rethink the entire concept of half-life and the accuracy with which it measures ancient ages. Moreover, if solar activity was greater in the past, before humanity began measuring it, then the changes in radioactive decay might actually have been greater than those measured by the scientists at Brookhaven, PTB, and Purdue.
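To put the reported figure in perspective, the arithmetic can be sketched directly. Since a computed age scales inversely with the assumed decay constant, a decay rate that is faster by some fraction shrinks the inferred age by roughly the same fraction. The example below is my own illustration of that relationship, not a calculation from the article.

```python
def age_with_rate_shift(nominal_age, rate_change_fraction):
    """Age implied by the same measured isotope ratio if the true
    decay rate were higher by rate_change_fraction: the inferred age
    scales inversely with the decay constant."""
    return nominal_age / (1 + rate_change_fraction)

# A 0.3% faster decay rate shifts a nominal 1,000,000-year age
# by about 3,000 years (a proportional, not absolute, effect):
shift = 1_000_000 - age_with_rate_shift(1_000_000, 0.003)
print(round(shift))  # -> 2991
```

Note that a steady 0.3% offset changes every age by the same small proportion; only a much larger rate change in the past, as the next paragraph suggests, would alter computed ages dramatically.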

Editorial note: The Institute for Creation Research published detailed scientific evidence to show that these dating methods have several flaws, and produced evidence to show there was billion-fold accelerated decay in the past, most likely occurring at the time of the Flood. ABR hosted a RATE Conference (Radioisotopes and the Age of the Earth) with over 700 attendees in the Fall of 2006. For more on this important research, visit: http://store.icr.org/prodinfo.asp?number=BRATE1


The RATE team of eight research scientists at the Institute for Creation Research reported major problems with radioactive dating techniques in 2005 in its final report by Vardiman et al., 2005, "Radioisotopes and the Age of the Earth", Institute for Creation Research, Dallas, Texas, www.icr.org. Three major pieces of evidence were offered that radioactive decay was accelerated during the past, probably during the Genesis Flood. The evidence was (1) helium diffusion from zircons in granite gives an alternative age of the Earth of 6,000 ± 2,000 years, (2) polonium radiohalos in biotite surrounding zircons in granite indicate accelerated nuclear decay on the order of millions of times faster than today, and (3) carbon-14 in coal and diamonds indicates that the Earth is less than 50,000 years old and that the Flood probably occurred less than about 5,000 years ago, if the effects of the Genesis Flood are considered. A popular version of the report is entitled "Thousands not Billions" by Don DeYoung, Master Books, Green Forest, Arkansas, also available at www.icr.org.

I have for some time now, off and on, searched for Libby's papers on the slowing of the rate of decline at the end of each half-life period. Very few people talk about this automatic slowdown that they say takes place every 5,500 years (approx.).

I just do not buy into it and would like to see the thinking behind this part of the theory.