The age of the earth is a central issue in creation-evolution discussions, because a young earth would not permit enough time for evolution to occur, and an old earth would contradict a literal reading of the Bible account of creation. The belief in an old earth is based on conventional dates for geological periods, which are in the hundreds of millions of years range, and are obtained by isotopic dating methods. Standard isotopic (radiometric) dating techniques typically yield such dates on fossil-bearing strata. There are, however, numerous disagreements between dates produced by different isotopic dating methods, and there are many cases where the dates obtained are very different from the expected ones. Furthermore, geologists are aware of a number of factors that can cause radiometric dating methods to give bad dates, and these factors are sometimes difficult to recognize. This already casts some doubt on isotopic dating methods. Creationists have given evidence that the geological column is much younger than hundreds of millions of years, but until now they have not had a quantitative method of measuring the age of the fossils or the geologic column. Nor have they had a uniform explanation for why isotopic dating methods give such old dates. This has put creationists at a disadvantage in discussions of dating issues, and also has been an obstacle in the widespread acceptance of a young earth.

Now there are evidences that explain why isotopic dating methods yield such old dates on fossil-bearing strata. These evidences also provide a quantitative measure of how old the fossils really are. These evidences show that the geological column on earth, at least from the Cambrian period onwards, was laid down in a few thousand years rather than the hundreds of millions of years assumed by conventional geology. This gives strong support to the creationary viewpoint, and provides methods of dating that are more in harmony with the Biblical creation account. These evidences also explain the old ages given by conventional methods as the result of accelerated decay. It now appears that radioactive decay was much faster in the past. This explains why isotopic dating methods typically give dates in the hundreds of millions or even billions of years on samples that are really only a few thousand years old on a young earth. Faster decay could also be the cause of the Flood, because accelerated decay would have caused the generation of a huge amount of heat, wreaking havoc with the earth’s crust. These evidences do not directly establish the age of the earth or the universe, but suggest that the earth is young.

In fact, a number of evidences are fitting together so well that one has to ask how much evidence is needed for a paradigm shift. How much evidence suffices for the scientific establishment to accept the fact that the geological column was laid down very rapidly, in thousands rather than millions of years? Or is it the case that no amount of evidence will convince them? I think that the new evidences are so convincing that the scientific establishment would have a hard time refuting them in a debate. But whatever the reaction of the scientists, the evidence is now compelling enough to convince many educated people of the error of the current assumption of hundreds of millions of years for the geological column.

In the past, many creationists have attempted to explain old isotopic (radiometric) dates by assuming that the system was disturbed. Isotopic dates are often computed by measuring the amount of a parent substance X and the amount of a daughter substance Y into which X decays. If one assumes that at some time T in the past, no Y was present, and no X or Y entered or left the system in the meantime, then, by measuring the amount of X and Y present and knowing the speed at which X decays into Y, one can compute the age of the system, that is, the time elapsed since time T. The more Y and the less X there is, the older the sample. This method typically gives ages in the hundreds of millions of years. Creationists often argue that the computed age is too old because Y may have been present initially, or X or Y may have entered or left the system since it was formed. However, geologists have developed sophisticated methods to account for such possibilities. Furthermore, it seems unusual that so many different isotopic methods would give old dates if these dates resulted only from disturbances in the system. Disturbances could just as well make the dates too young as too old. Now creationists are beginning to think that a large amount of radioactive decay occurred in a short time, because the rate of decay was much faster in the past.
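The age computation described above can be sketched in a few lines of Python (an illustration only; real laboratory procedures involve many corrections):

```python
import math

def decay_age(parent, daughter, half_life_years):
    """Age since time T, assuming no daughter was present initially
    and no parent or daughter entered or left the system.  Each decayed
    parent atom yields one daughter atom, so
    daughter/parent = exp(lambda * t) - 1."""
    decay_constant = math.log(2) / half_life_years
    return math.log(1 + daughter / parent) / decay_constant

# With equal amounts of parent and daughter, exactly one half life has
# elapsed, e.g. for uranium 238 (half life 4.47 billion years):
age = decay_age(parent=1.0, daughter=1.0, half_life_years=4.47e9)
```

Note how the more daughter Y and the less parent X there is, the larger the computed age, exactly as stated above.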

There are two main processes by which radioactive decay occurs, alpha decay and beta decay. In alpha decay, an alpha particle is emitted from a nucleus. An alpha particle consists of two protons and two neutrons. This is the nucleus of a helium atom, and when an alpha particle is emitted, it soon acquires electrons and becomes a helium atom. Thus helium is produced by alpha decay. The other main method of decay is beta decay, in which an electron or a positron is emitted from the nucleus and a neutron becomes a proton, or vice versa. Another way that this can happen is if an electron is captured by the nucleus. If rates of decay were faster in the past, then it is reasonable to assume that alpha decay and beta decay would be sped up by different amounts, because they are such different processes.

The first evidence for accelerated decay in the past has to do with the dating of zircons. Zircons are crystals containing the element zirconium, together with other elements, and are often used for jewelry. Zircons are well suited to isotopic dating because their crystal structure incorporates uranium and thorium but excludes lead. Since uranium and thorium decay into lead, one can assume that the lead in a zircon results from decay, and thus compute the age of the zircon. Although this assumption has its limitations, the idea is basically sound. Zircons on earth give dates up to about 4 billion years.

Uranium and thorium decay into lead by a complex series of steps, a number of which involve alpha decay. Thus helium is produced. This helium should diffuse out of the zircon rapidly, so if the zircons were really hundreds of millions or even billions of years old, there should be no helium left in them from such decay. However, a significant amount of helium has been found in some zircons that give isotopic dates of 1.5 billion years. Until recently, no one had measured the rate of diffusion of helium in zircons. In 2000 the RATE project [RATE 00] began experiments to measure the diffusion rates of helium in zircon and biotite. Using this data, ages were computed for these zircons that are consistent with the amount of helium remaining in them [Humphreys et al 03]. The ages computed in this way are between 4,000 and 14,000 years! These results support the hypothesis of accelerated nuclear decay and represent strong scientific evidence for the young world of Scripture. They show that alleged isotopic dates of 1.5 billion years for these particular zircons correspond to true dates of between 4,000 and 14,000 years, and they suggest that old isotopic dates in general correspond to very young true dates. However, these results do not yet show that even older dates fall in this time range. It would be interesting to test zircons having even older isotopic dates to see how much helium they contain, and to test more zircons to see if this helium retention is a universal phenomenon.

The next evidence for a recent creation is provided by carbon 14 dates. Carbon 14 is produced in the upper atmosphere by cosmic rays and then slowly decays. The older an organic sample is, the less carbon 14 it will contain, because the sample stops absorbing new carbon 14 after it dies. An astonishing discovery made over the past twenty years is that, almost without exception, when tested by highly sensitive accelerator mass spectrometer (AMS) methods, organic samples from every portion of the fossil record show detectable amounts of 14C! Giem reviewed the literature and tabulated about seventy reported AMS measurements of 14C in organic materials from the geologic record that, according to the conventional geologic time-scale, should be 14C ‘dead.’ For the measurements considered most reliable, the 14C/C ratios appear to fall in the range 0.1-0.5 percent of the modern 14C/C ratio (percent modern carbon, or pmc). 0.1 percent modern carbon corresponds to a computed age of 57,000 years, and higher values correspond to even younger ages. This implies that the entire geologic column from the Cambrian period onward is less than 57,000 years old. Some of the researchers tried to explain this carbon 14 as contamination, but none of their attempts to remove it were successful, and other evidence indicated that this carbon 14 was not contamination.
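As a rough illustration of how a pmc value converts to a computed age (a sketch assuming only the standard decay law and the 5,730 year half life; actual radiocarbon laboratories apply further calibrations):

```python
import math

def radiocarbon_age(pmc, half_life=5730.0):
    """Apparent age, in years, of a sample whose 14C/C ratio is the
    given percentage of the modern ratio (percent modern carbon)."""
    mean_life = half_life / math.log(2)   # about 8,267 years
    return mean_life * math.log(100.0 / pmc)

# 0.1 pmc gives a computed age of roughly 57,000 years; 0.5 pmc, the
# top of the range quoted above, gives a younger computed age still.
```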

Organic matter consistently has a higher 14C ratio than Precambrian inorganic matter. This shows that this carbon 14 is not noise and not contamination. If the carbon 14 arose from noise in the measurement process or from contamination, then one would not expect to find such systematic differences. The amount of carbon 14 must therefore indicate that these samples are very young.

Here we have additional evidence that samples alleged to be hundreds of millions of years old are in fact 60,000 years old or less. If decay were accelerated in the past, the true age would be even less than 60,000 years. There is also reason to believe that the biomass before the flood may have been 100 times larger than it is today, which would dilute carbon 14 by a factor of 100 or more. This dilution corresponds to six or seven half lives of carbon 14, or to an apparent age of about 40,000 years. Thus the ages of these samples would be brought down to the 10,000 to 20,000 year range, and with accelerated decay the ages would be even less, consistent with the Biblical account. Another factor to consider is that there may have been less carbon 14 before the flood; the amount of carbon 14 in the atmosphere appears to be increasing even today. This would make the ages even younger.
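The dilution arithmetic in this paragraph can be checked directly. A sketch, under the stated assumption of a 100-fold larger pre-flood biomass:

```python
import math

MEAN_LIFE = 5730.0 / math.log(2)   # 14C mean life, about 8,267 years

def apparent_age_shift(dilution_factor):
    """Extra apparent 14C age produced by starting with 14C diluted
    by the given factor (e.g. by a much larger pre-flood biomass)."""
    return MEAN_LIFE * math.log(dilution_factor)

def half_lives_equivalent(dilution_factor):
    """Number of 14C half lives the dilution mimics."""
    return math.log(dilution_factor) / math.log(2)

# A 100-fold dilution mimics about 6.6 half lives, or roughly 38,000
# years of apparent age, bringing a 57,000 year computed age down
# toward the 10,000 to 20,000 year range mentioned above.
```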

There is even measurable carbon 14 in diamonds! Dr. Baumgardner [Wieland 03] sent a diamond for C-14 dating. It was the first time this had been attempted, and the answer came back positive—i.e. the diamond, formed deep inside the earth in a ‘Precambrian’ layer, nevertheless contained radioactive carbon, even though it ‘shouldn’t have’. This is exceptionally striking evidence, because a diamond has remarkably powerful lattice bonds, so there is no way that subsequent biological contamination can be expected to find its way into the interior. The diamond’s carbon-dated ‘age’ of less than 58,000 years is thus an upper limit for the age of the geological column from the Cambrian period onwards. And this age is brought down still further now that the helium diffusion results have so strongly affirmed dramatic past acceleration of radioactive decay.

The fact that isotopic dates are generally too old by hundreds of millions of years, but Carbon 14 dates are only too old by thousands of years, is also evidence for accelerated decay, because Carbon 14 decays much faster. In general, one would expect that if decay were accelerated, all radioactive decay systems would have about the same amount of extra decay. This is especially true if the cause of the accelerated decay was a large amount of radiation hitting the earth, because a nucleus that was hit by radiation would receive a large amount of energy and would be likely to decay, regardless of its half life. Carbon 14 has a short half life, meaning that it is relatively unstable and decays rapidly, so the number of atoms per unit time that decay is large. Uranium, thorium, and other substances used for isotopic dating have much larger half lives, almost all of them in the billions of years range. This means that these substances are comparatively stable and decay events are very rare, so the number of atoms per unit time that decay is very small. Therefore, if there are N extra decay events in a unit of time, these extra decay events would proportionally affect the number of Carbon 14 decays by a much smaller amount than the number of uranium and thorium decays. This means that the age computed from Carbon 14 would be increased by a much smaller proportion than the ages computed from uranium-lead and thorium-lead decay. In fact, this is what is observed, with Carbon 14 ages typically in the 60,000 year range or less, but uranium and thorium ages typically in the hundreds of millions of years.
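A toy model makes the proportionality argument concrete. Suppose, purely hypothetically, that a burst of radiation caused the same small extra fraction of parent atoms of every nuclide to decay; the apparent age added then scales with the half life:

```python
import math

def apparent_age_added(extra_fraction_decayed, half_life_years):
    """Apparent age added when an extra fraction f of the parent atoms
    decays: the surviving fraction (1 - f) looks like normal decay over
    a time t satisfying exp(-lambda * t) = 1 - f."""
    mean_life = half_life_years / math.log(2)
    return -mean_life * math.log(1 - extra_fraction_decayed)

f = 1e-4  # hypothetical burst decaying 0.01% of each parent nuclide
c14_shift = apparent_age_added(f, 5730.0)    # well under a year
u238_shift = apparent_age_added(f, 4.47e9)   # hundreds of thousands of years
```

The same burst barely moves a carbon 14 age but adds hundreds of thousands of years to a uranium 238 age, which is the pattern the text describes.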

Here is a table of some common half lives, showing how much longer many half lives are than the half life of carbon 14:

Radioactive Parent    Stable Daughter    Half life
Potassium 40          Argon 40           1.25 billion years
Rubidium 87           Strontium 87       48.8 billion years
Thorium 232           Lead 208           14 billion years
Uranium 235           Lead 207           704 million years
Uranium 238           Lead 206           4.47 billion years
Carbon 14             Nitrogen 14        5,730 years

Also, alpha decay and beta decay proceed by different processes, so they may not be affected to the same degree by an increase in the decay rate, and discordances between alpha and beta decay ages are therefore evidence of disturbed decay. To sum up, the following are the evidences one would expect from accelerated decay in the past: Carbon 14 ages should be much younger than other isotopic ages such as K-Ar and U-Pb; alpha and beta ages should differ; and ages computed from elements with long half lives should be more affected than ages computed from elements with short half lives.

In fact, these evidences are reported in [Austin et al 03]. This paper considers ages computed from "isochrons." An isochron is a method for computing the amount of daughter product Y that was initially present in a system. This is done by taking several samples from the same area and measuring the amount of parent and daughter substance in each sample. Another isotope of Y, not produced by radioactive decay, is also measured. It is reasonable to assume that initially, all isotopes of Y were distributed in a similar manner among the samples. Thus one can estimate how much Y was present initially in each sample, at least up to a constant factor. Knowing the amount of daughter product that was initially present, one can compute the age of the samples. It is also possible using isochrons to detect whether the system has been disturbed since its origin, which means that isochrons are self-checking. There are two kinds of isochrons: whole rock isochrons, in which each sample combines many different minerals, and mineral isochrons, which use a different mineral for each sample. Whole rock isochrons can give wrong ages due to mixing, but this is not a problem for mineral isochrons, so mineral isochrons, though somewhat more expensive, are more reliable. In particular, the agreement of a whole rock isochron and a mineral isochron gives excellent evidence that the date obtained is good and that the system has not been disturbed since it formed. Most isotopic dates are model ages computed simply by measuring the amount of parent and daughter substance in a sample; only a small fraction of isotopic dates are obtained using isochrons, and even then only a small portion are mineral isochrons. Therefore, only a small fraction of isotopic dates have such reliability factors built in; the remainder are subject to various errors.
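The isochron procedure amounts to fitting a straight line to the samples. A minimal sketch with synthetic data (real isochrons use weighted regressions and report uncertainties):

```python
import math

def isochron_age(x, y, half_life_years):
    """Least-squares line through points (parent/Di, daughter/Di),
    where Di is a non-radiogenic isotope of the daughter element.
    The slope equals exp(lambda*t) - 1, and the intercept is the
    initial daughter ratio; scatter about the line would signal a
    disturbed system."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = sum((a - mx) * (b - my) for a, b in zip(x, y)) / \
            sum((a - mx) ** 2 for a in x)
    lam = math.log(2) / half_life_years
    return math.log(1 + slope) / lam

# Synthetic rubidium-strontium samples sharing one initial 87Sr/86Sr:
lam = math.log(2) / 48.8e9
true_age = 1.0e9
xs = [0.5, 1.0, 2.0, 4.0]                       # 87Rb/86Sr per sample
ys = [0.705 + x * (math.exp(lam * true_age) - 1) for x in xs]
age = isochron_age(xs, ys, 48.8e9)              # recovers the true age
```

Because the intercept supplies the initial daughter ratio, no assumption of zero initial daughter is needed, which is the self-checking property described above.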

However, even when extra reliability factors are built into dating methods, the dates generally still do not agree with one another. In [Austin et al 03] an example is given where two different systems (that is, ages measured by two different decay processes) each have internal evidence for consistency, in that whole rock and mineral isochrons agree for each system, but the dates obtained for the two systems disagree. That is, one computes two ages, A1 and A2, for the formation. Both A1 and A2 have excellent evidence for their correctness, based on the agreement of a whole rock isochron and a mineral isochron for each. But the ages A1 and A2 disagree! The only reasonable explanation is that there was a change in the decay rate, and the decay measured for age A1 was increased by a different amount than the decay measured for age A2. Furthermore, these data are consistent with alpha decay having been accelerated more than beta decay, and with the acceleration factor being greater the longer the present half life. Thus there is excellent evidence that decay rates were increased in the past. In fact, according to Austin [Austin 00], such disagreements between "good" dates (dates computed using whole rock or mineral isochrons) are very common in the literature. Thus there is abundant evidence for a change in the decay rates.

Isotopic dates on earth obtained by different methods are typically discordant (in disagreement), but this is not true of the meteorites. Certain meteorites consistently give dates of about 4.5 billion years by many different methods. Therefore a different process must have been at work in these meteorites than on earth. Perhaps the 4.5 billion year age of these meteorites is a result of an old universe, or perhaps it is a result of changes in the physical constants very early in the creation, causing all decay processes to run faster by the same amount. Another consideration is that the same processes leading to discordant dates on earth should have led to discordant dates on the meteorites, but this did not occur. One possible explanation is that radiation hitting the earth largely missed the meteorites, or else they were shielded from it in some way. Another possibility is that the radiation had its source in the sun. Objects farther from the sun would have received less radiation; an object ten times farther away than the earth would have received only one percent of the radiation. This would have resulted in a much smaller speedup in the decay rate and much smaller discordances in the ages obtained by different methods. A variation of isochrons, called isochrones, is used to measure the ages of stars; the ages obtained are typically in the billions of years. Perhaps these ages are also the result of an old universe or a change in the decay rates very early in the creation.

There is also evidence for a speedup in mutation rates in the past, based on genetic diversity. The genetic diversity of a species measures the probability that two randomly chosen individuals will differ at a given base pair of their DNA. If a species is large, its genetic diversity will continue to increase over time, as mutations occur and different individuals in the species become more and more different in their DNA. Thus, assuming a large species, one can give an upper bound on the age of the species knowing the genetic diversity and the mutation rate. This either bounds the time since the species originated, or else measures the time since the species population was very small. This method was applied to the human race, using mitochondrial DNA. Mitochondria are the "energy factories" of the cell; they convert ADP to ATP, which the cell uses as its energy source. Mitochondria have their own DNA and divide independently of the cell; each cell typically has many mitochondria. Also, mitochondria typically pass exclusively from mothers to their children, although there may be exceptions. By measuring the rate of mutation of mitochondrial DNA and computing the genetic diversity of the human race, one obtains [Gibbons 98] an age of somewhat over 6000 years since the common maternal ancestor of the human race (mitochondrial Eve). Biologists attempt to explain this young age by assuming that the rate of mutation of mitochondrial DNA was much slower in the past, for some unexplained reason.

It is not only the human race whose age, measured this way, is young, but many other species as well, including wolves, coyotes, dogs, ducks, birds, E. coli, and Drosophila (fruit flies). Most of these ages are based on the assumption that mitochondria in other organisms mutate at about the same rate as they do in humans. Biologists are puzzled by this low genetic diversity in many organisms. This is spectacular evidence for a recent creation, but it has largely been ignored by creationists.

It is also possible to compute ages based on nuclear DNA diversity. Most of the DNA of an organism is in the nucleus, and this nuclear DNA mutates more slowly than mitochondrial DNA. The nuclear DNA diversity due to SNPs (single nucleotide polymorphisms) is given in [Nature 01], and is about 7.51 × 10^-4; for the Y chromosome the diversity is about 1.5 × 10^-4. Ages computed from the Y chromosome diversity (which would have been zero at the creation) tend to be somewhat larger than those computed from mitochondrial DNA diversity; based on a Y chromosome mutation rate of 6 × 10^-8 per base pair per generation of 20 years, they are about 25,000 years. (There is reason to believe [Crow 97] that the Y chromosome mutates about twice as fast as the other chromosomes. The overall human mutation rate is estimated at about 3 × 10^-8 per base pair per generation and may be higher.) Even this 25,000 year estimate is not too far from the Biblical time frame and supports the creationary view. However, this calculation is based on a mutation rate that is itself partially derived from evolutionary assumptions. As with radioactive decay, this longer age for nuclear DNA is evidence for a speedup in the mutation rate in the past. Because nuclear DNA mutates much more slowly, any increase in the mutation rate would have a much larger effect on ages computed from nuclear DNA diversity than on ages computed from mitochondrial DNA diversity.
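The 25,000 year figure follows from a simple calculation. A sketch, which ignores population structure and back-mutation:

```python
def diversity_age(diversity, mutation_rate, generation_years):
    """Time back to a single ancestral sequence: two lineages drift
    apart at a combined 2 * mutation_rate differences per site per
    generation, so diversity is roughly 2 * mu * generations."""
    generations = diversity / (2.0 * mutation_rate)
    return generations * generation_years

# Y-chromosome figures quoted above: diversity 1.5e-4, mutation rate
# 6e-8 per base pair per 20-year generation, giving about 25,000 years.
age = diversity_age(1.5e-4, 6e-8, 20.0)
```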

Furthermore, if decay was faster in the past, it could have increased the mutation rate, because the level of radiation would have been higher, and radiation causes mutations. There is evidence that small doses of radiation can lead to unexpectedly high mutation rates in humans [Science 02a]: "researchers led by geneticist Yuri Dubrova of the University of Leicester, United Kingdom, describe a compelling connection between radioactive fallout and elevated mutation rates in families living downwind of the Semipalatinsk nuclear facility ... The findings bolster a controversial 1996 report by Dubrova and a different group of colleagues that linked germ line mutations to fallout from the 1986 Chernobyl explosion. That study, published in Nature, described double the usual mutation rate in the children of men living in a region of Belarus heavily contaminated with cesium 137. In each subject they examined eight minisatellite DNA regions that are prone to mutations. ... Compared to control families in a nonirradiated part of Kazakhstan, individuals exposed to fallout had a roughly 80% increase in mutation rate, and their children showed an average rise of 50%."

So it all fits together: increased decay leads to higher levels of radiation and also increases mutation rates in humans! And there is some evidence that the rate of decay may vary. Slusher [Slusher 81] reports: "Anderson and Spangler maintain that their several observations of statistically significant deviations from the (random) expectation strongly suggests that an unreliability factor must be incorporated into age-dating calculations. Such irregularities were observed for carbon 14, cobalt 60, and cesium 137." The source for this information is [Anderson and Spangler]. Even Dalrymple [Dalrymple 84] recognizes such irregularities: "Under certain environmental conditions, the decay characteristics of 14C, 60Co, and 137Cs, all of which decay by beta emission, do deviate slightly from the ideal random distribution predicted by current theory ... , but changes in the decay constants have not been detected." Dalrymple cites the references [Anderson 72] and [Anderson and Spangler 73]. Though he claims no changes in the decay constants have been detected, he admits to puzzling irregularities in decay.

What could have sped up decay rates? Some creationists, including Chaffin [Chaffin 00] and Barry Setterfield, postulate a change in the basic physical constants at the time of the creation and during the flood, resulting in an accelerated burst of decay very early in the creation and also during the flood. Early in the creation the constants, including the speed of light, may indeed have been different; even secular scientists have suggested this. However, a change in the constants at the time of the flood would have had many consequences, and may have made the basic biology of life impossible. But there is another possible mechanism.

The following comment by Keith Wanser [Wanser 99], a creationist physicist, is significant: "Actually, it turns out that when you get the nucleus "excited", decay is going to be much quicker, making things look vastly "older". People have been talking recently about magnetic stars giving off big bursts of gamma rays; there are all sorts of ways that radiometric "clocks" could have been reset catastrophically, during the Flood, for example." In fact, when the nucleus gets excited, it takes time for it to settle down. This means that rates of decay may have been faster for some time after the Flood. Another mechanism for an increase in the decay rate is presented in [Science 00]. This article shows how interactions with elementary particles can cause decay rates to increase. One such particle is the neutrino. A recent result [Science 02b] implies that neutrinos interact with matter much more readily than previously thought: "The results also show that another property of neutrinos, related to how they interact with matter, known as the mixing angle, must be large, rather than small, contrary to what physicists believed until quite recently." So radiation, possibly gamma radiation or possibly neutrinos, could have sped up decay rates.

But where would this radiation have come from? One possibility is a supernova. Many supernovae are known. The Crab Nebula is the remnant of a supernova explosion that was seen on Earth in 1054 AD. It is 6000 light years from Earth. At the center of the bright nebula is a rapidly spinning neutron star, or pulsar, that emits pulses of radiation 30 times a second. In X-ray pictures of the Crab Nebula, one can see a ring structure and beams of radiation coming out from the poles. Another supernova, SN 1987A, appeared on February 23, 1987. Supernovae typically leave behind rapidly spinning neutron stars, or pulsars. And there is evidence that supernovae occurred near the earth in the past.

An article [New Scientist 03] in the September 2003 New Scientist states, "A devastating burst of gamma rays may have caused one of Earth's worst mass extinctions, 443 million years ago. A team of astrophysicists and palaeontologists says the pattern of trilobite extinctions at that time resembles the expected effects of a nearby gamma-ray burst (GRB). GRBs are the most powerful explosions known. As giant stars collapse into black holes at the end of their lives, they fire incredibly intense pulses of gamma rays from their poles that can be detected even from across the universe for 10 seconds or so. … Now Melott believes he has palaeontological evidence that this actually happened at the end of the Ordovician period 443 million years ago, causing one of the five largest extinctions of the past 500 million years. The researchers found that species of trilobite that spent some of their lives in the plankton layer near the ocean surface were much harder hit than deep-water dwellers, which tended to stay put within quite restricted areas. Melott says this unusual pattern could be explained by a GRB, which would probably devastate creatures living on land and near the ocean surface, but leave deep-sea creatures relatively unharmed."

Another article [New Scientist 02a] in the January 2002 New Scientist gives additional evidence for a recent supernova near the earth. The researchers found atoms of a very rare isotope of iron, 60Fe, in cores taken from the ocean floor. 60Fe is rare in the solar system because it has a half-life of 1.5 million years. The group suggested that the iron arrived on Earth as fallout from a nearby supernova about two million years ago. This is about the time that fossil records indicate that many marine molluscs went extinct. Donald Clayton, an astronomer at Clemson University, says the story appears consistent: "The amount of 60Fe found in deposits is about what you might expect from a supernova going off about 100 light-years away." Clayton says 60Fe would be blasted towards Earth when high energy neutrons from the supernova core smack into iron atoms in its outer shell.

An additional evidence is given in the May 2002 New Scientist [New Scientist 02b]. "A student at Harvard University has stumbled across the terrifying spectacle of a star in our galactic backyard that is on the brink of exploding in a supernova. It is so close that if it were to blow up before moving away from us, it could wipe out life on Earth. We are only 150 light years away from HR 8210 at present - well short of the 160 to 200 light years thought to be the minimum safe distance from a supernova. If it did let fly, the high-energy electromagnetic radiation and cosmic rays it released would destroy Earth's ozone layer within minutes, giving life little chance of survival. "The fact that there's such a system so close to us suggests maybe these objects are not so rare," says Latham." The fact that supernovae are common near the earth makes it more likely that one occurred in the past. Of course, the evidence for supernovae in the past is valid even if the assumption that they occurred hundreds of million years ago is in error.

So there is reason to believe that a supernova occurred near the earth, and we have reason to believe that radiation from a supernova would increase decay rates. But which supernova might have been responsible for the increase in decay rates?

The Gum Nebula is a huge nebula in the Southern hemisphere, about 1000 light years away, extending over at least 40 degrees of the sky. It is thought to be the remnant of one or more ancient supernovae. One pulsar in this region, perhaps not associated with the Gum Nebula, is the Vela Pulsar, which is about 800 light years away and estimated to be about 11,000 years old. However, if the dating of pulsars is wrong, as has recently been suggested [Science 01], then the Vela Pulsar could be much younger, and may have arisen only 4,500 years ago, or about the time of the Flood. The author of [Science 01] does not feel that these results apply to the Vela Pulsar, but this does show that dating methods can change with time. The Vela supernova remnant is now about 230 light years across and covers over 100 times the sky area of the full moon. The Vela Pulsar is still the brightest gamma-ray source in the sky above 100 MeV. It’s a "smoking gun" and a logical choice for the supernova that increased decay rates in the past. Jueneman was the first to suggest a link between this pulsar and an acceleration of decay [Jueneman 72]. A recent X-ray picture of the Vela Pulsar shows the typical ring structure with a beam of radiation exiting from a pole.

Another evidence of a recent creation is comets. Comets are essentially frozen mud; that is, they are believed to be composed of dust combined with water, ammonia, methane, or other frozen liquids. When a comet is heated by the sun, some of the ice vaporizes and dust escapes. This is what makes comets visible to us. Each time a comet orbits close to the sun, it loses 5 to 10 percent of its material. Astronomers have even seen comets break up into pieces as they go around the sun. At this rate they could not last more than 100,000 years, and some of the short-orbit comets could not last more than 10,000 years. If so, how could there be any comets left after 5 billion years? The Kuiper belt is the supposed origin of the short period comets, and the Oort cloud is also believed to be a source of comets. But the Kuiper belt was recently found [Science 03] to have only 4 percent of the necessary objects! Comets must have been recently produced, then, by some kind of a catastrophe. Perhaps a planet between Mars and Jupiter exploded when the decay rate increased, thereby generating comets, producing the asteroid belt, and also explaining many asteroid impacts on the earth at the time of the flood.
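The comet-lifetime arithmetic is easy to check. A sketch, assuming for illustration a fixed 5 percent mass loss per pass and a 75 year orbital period (the period is our assumption, not a figure from the text):

```python
import math

def comet_lifetime_years(loss_per_orbit, period_years, surviving_fraction=0.01):
    """Years until only surviving_fraction of the comet's mass remains,
    losing a fixed fraction of its material on each pass near the sun."""
    orbits = math.log(surviving_fraction) / math.log(1.0 - loss_per_orbit)
    return orbits * period_years

# About 90 orbits, i.e. under 10,000 years for a 75-year orbit --
# consistent with the figures quoted above for short-orbit comets.
lifetime = comet_lifetime_years(0.05, 75.0)
```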

Yet another evidence for an increase in the decay rate in the past is the correlation between surface heat flow and the radioactivity of surface rocks [RATE 00b]. Geologists have found a puzzling correlation between heat flow out of the ground and the presence of radioactive elements near the surface. This should not be so if decay has proceeded slowly for millions of years, because the heat would have long since dissipated. A better explanation is that the pulse of heat from an interval of accelerated decay in the past has not yet entirely dissipated. It is also possible that in the "wild," decay is taking place faster than we realize, generating extra heat.
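The claim that heat "would have long since dissipated" can be given a rough order-of-magnitude check with the standard thermal diffusion timescale t ≈ L²/κ. The values below are textbook assumptions (a typical rock thermal diffusivity and a 10 km depth), not figures from [RATE 00b]:

```python
# Rough diffusion timescale for heat conducted out of the crust,
# t ~ L^2 / kappa.  Both inputs are generic assumed values.

kappa = 1e-6            # m^2/s, typical thermal diffusivity of rock (assumed)
depth_m = 10_000        # 10 km characteristic depth (assumed)
seconds_per_year = 3.156e7

timescale_years = depth_m ** 2 / kappa / seconds_per_year
print(f"{timescale_years:.1e} years")
```

With these assumptions the timescale comes out to a few million years, far shorter than the hundreds of millions of years of the conventional timescale, which is the point the correlation argument turns on.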

Finally, Robert Gentry claims [Gentry 76] to have found "squashed" polonium haloes, as well as embryonic uranium radiohaloes, in coal deposits from many geological layers claimed to be hundreds of millions of years old. The ages given by the squashed polonium haloes for several adjacent geological periods are nearly identical.

Many evidences have already been presented by creationists that indicate something is wrong with the long ages of radiometric dating on earth. These include a rate of erosion that is too high for the assumed age of the continents, too much salt entering the ocean, too little sediment on the ocean floor, many evidences of catastrophe in the geological column, too little erosion in many places in the geological column, evidence of sudden burial of fossils in large numbers, turbidites in the geological column, missing periods in places in the geological column, the lack of erosion at unconformities, polystrate fossils, overthrusts, and others. Another evidence that appeared [Kerr 03] in a recent issue of Science is the survival of remnants of meteoritic fragments for (supposedly) 251 million years. Scientists say these remnants would long since have been destroyed by chemical reactions over such a long time period: "Meteoriticists and impact geologists are stunned that tiny, fresh-looking, unaltered fragments of a meteorite should have survived burial for 251 million years." Though there is disagreement about the date of origin of these fragments, one possibility is that the accepted date of 251 million years corresponds to an actual date of a few thousand years.

In addition to these evidences, there are now many new evidences of increased decay rates in the past, indicating that isotopic dates of hundreds of millions of years were produced in thousands or tens of thousands of years: helium retention in zircons, young carbon-14 dates, and disagreements between well-justified isotopic dates. There is also evidence of an acceleration of the mutation rate in the past, which would have been the result of increased decay; evidence of a nearby supernova in the past, and evidence that the radiation from such a supernova would have increased the decay rate; the lack of expected objects in the Kuiper belt; and the correlation between surface heat flow and the radioactivity of surface rocks. And of course there is the mitochondrial DNA mutation evidence indicating that man and many other species had a very recent origin. Not only do all these evidences fit together, but several of them seem impossible to explain in the long-ages geological framework. This justifies a repetition of the question posed at the beginning of this article: How much evidence is necessary before a paradigm shift occurs? How much evidence is needed before geologists will seriously consider the possibility that the geological column was laid down in thousands, rather than millions, of years? When will those who hold this view be regarded with respect by the scientific establishment rather than being considered religious fanatics? Only time will tell.

[Austin 00] Steven A. Austin, "Mineral Isochron Method Applied as a Test of the Assumptions of Radiometric Dating," in Radioisotopes and the Age of the Earth: A Young-Earth Creationist Research Initiative, ICR and CRS, 2000, pp. 95-121.

[Dalrymple 84] G. B. Dalrymple, "How Old is the Earth?: A Reply to 'Scientific' Creationism," in Proceedings of the 63rd Annual Meeting of the Pacific Division, American Association for the Advancement of Science, vol. 1, pt. 3, Frank Awbrey and William Thwaites (eds.), 1984.