Fire and Cooking in Human Evolution,
Rates of Genetic Adaptation to Change, Hunter-Gatherers, and Diseases in the Wild (LAST UPDATED 11/3/97)

Note: A number of the research updates listed below refer to the internet PALEODIET listgroup archives for further information and scientific references. The search engine for locating material in the PALEODIET archives can be found at:

You can also find instructions for subscribing to the PALEODIET list at the same link.

At the time the postscripts to the interviews were first written, the Beyond Veg site had not yet been conceived, and I did not foresee the need to provide a full set of scientific references for the updates. Inclusion of references linked directly to the updates here on the site is planned as time allows.

Overall Note: Some of my views on the adaptation-to-cooking question have changed since the Health & Beyond interviews were published, due to new information brought to my attention by Paleodiet researcher Loren Cordain, Ph.D. While the underlying data cited in the interview regarding inception dates for the discovery and control of fire remain largely accurate, the interpretations and conclusions I drew need revision in several cases. A number of the notes below discuss this.

(EDITORIAL NOTE: Triple-asterisked items in boldface below refer to passages in the interview as originally published, which are followed by updated comments based on additional observations or more recent scientific research.)

Uncertainties about earliest
use of fire for cooking

*** "Given the evidence available at this time, most of it would probably indicate that 125,000 years ago is the earliest reasonable estimate for widespread control."

*** "...given the time span involved (likely 125,000 years since fire and cooking became widespread), the chances are very high that we are in fact adapted to the cooking of whatever foods were consistently cooked."

*** "...it is this advantage in expanding the range of utilizable foods in an uncertain environment that was the evolutionary advantage that helped bring cooking about and enhanced survival."
In hindsight, it is worth pointing out that the above estimate of 125,000 years ago for widespread cooking rests on very little evidence--as does everything concerning the earliest origins of fire and cooking. Much of what is inferred about the earliest use of fire is deduction applied to very limited findings (unlike the considerably more well-established, consistent, and converging lines of evidence for early meat consumption). Some believe a more reasonable estimate for widespread use of fire might be 40,000-60,000 years ago, based on the wider distribution of hearths from that period; but in my conversations with those perhaps more familiar with the paleontological evidence for cooking and fire than I am, it still seems apparent that, given the current state of the paleontological findings, probably nobody knows for sure.

While some believe the rarity of early evidence such as hearths points toward fire being used at first mostly for heating and protection from predators rather than for cooking, discussions on the internet's PALEODIET listgroup have raised the point that some modern hunter-gatherers are known, on occasion, to use cooking techniques requiring neither hearths nor cooking vessels--techniques that leave behind no fossil evidence. Although modern hunter-gatherers have evolved behaviorally since ancient times, this suggests it may not be out of the realm of possibility that early humans used fire for cooking by methods other than hearths.

*** "...the most interesting of these [stages of sequence for control] is that fire for cooking would almost inevitably have been one of the first uses it was put to by humans, rather than some later-stage use."
Again, I would now be more cautious in drawing this inference (which was based on Johan Goudsblom's 1992 book, Fire and Civilization), after further discussions with Paleodiet researchers. As the preceding paragraph mentions, the stage of using fire for warmth and protection from predators may not have progressed so quickly to the stage of using fire for cooking. But from what I can tell, no one really knows one way or the other.

*** "A brief look at the Australian Aborigines [who utilize cooking for survival purposes] might be illustrative here."
Studies of other hunter-gatherer tribes suggest that many of them cook about half their food. Again, whether this particular behavior of modern hunter-gatherers is representative of behavior in the more distant evolutionary past should perhaps be regarded with caution.

Incompatibilities between dairy
consumption and human physiology

*** "...for our purposes here, the example [of lactose tolerance having developed within 1,150 years in some segments of the population] does powerfully illustrate that genetic adaptations for digestive changes can take place with much more rapidity than was perhaps previously thought."
The estimate of 1,150 years is from the Cavalli-Sforza data. A somewhat more conservative estimate based on the prevalence of lactose tolerance in those of Northern European extraction is that the gene for adult lactose tolerance would have increased from 5% to 70% prevalence within about 5,000 years (approx. 250 generations). [Aoki 1991]
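As a rough illustration of what such a rate of change implies, the allele-frequency shift can be sketched with a textbook population-genetics recurrence for a dominant advantageous allele. The 5%-to-70%-in-250-generations figures come from the estimate above; the simple fitness model and the bisection solver below are illustrative assumptions of mine, not part of the cited analysis.

```python
def next_freq(p, s):
    # One generation of selection favoring a dominant allele A:
    # genotypes AA and Aa have fitness 1+s, aa has fitness 1.
    q = 1.0 - p
    w_bar = 1.0 + s * (1.0 - q * q)      # mean population fitness
    return p * (1.0 + s) / w_bar

def freq_after(p0, s, generations):
    # Iterate the recurrence for the given number of generations.
    p = p0
    for _ in range(generations):
        p = next_freq(p, s)
    return p

def solve_s(p0=0.05, p_target=0.70, generations=250):
    # Bisect for the selection coefficient s that carries the allele
    # from p0 to p_target in the given number of generations.
    lo, hi = 0.0, 1.0
    for _ in range(60):
        mid = (lo + hi) / 2.0
        if freq_after(p0, mid, generations) < p_target:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2.0

s = solve_s()
print(f"Implied selection coefficient: s = {s:.3f} per generation")
```

Under these assumptions, the implied fitness advantage works out to only a few percent per generation--a reminder that even "rapid" genetic change of this kind reflects modest per-generation selection compounded over many generations.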

Genetic changes due to "neoteny" (such as adult lactose tolerance) not indicative of overall rates of adaptation. While these data for relatively quick evolutionary change resulting in adult lactase production remain essentially true, an important point to clarify is that the gene for lactase production is already present and expressed in all humans (for breastfeeding) up through the time of weaning. For lactase production to continue into adulthood would therefore require only relatively small genetic changes, e.g., via the process known as "neotenization" (the retention of juvenile traits into adulthood). "Brand-new" traits, so to speak--unlike already-existing polymorphisms such as the lactase gene, even if not previously expressed in adults--would take much longer to evolve.

Additional indications of incongruence between dairy and human physiology. Further, beyond the question of lactose tolerance, I have since learned there would be many additional genetic changes required (than just that for lactose tolerance) to result in more complete adaptation to milk consumption. A number of recent studies demonstrate problems of milk consumption that go considerably beyond whether or not a person is capable of handling lactose:

Lactose and heart disease. One is that lactose itself is a risk factor for heart disease, since in appreciable quantities it induces copper deficiency which, in turn, can lead through additional mechanisms to heart pathologies and mortality as observed in lab animals.

Poor Ca:Mg ratio which can skew overall dietary ratio. Another problem is the calcium-to-magnesium ratio of dairy products, approximately 12:1, which is directly at odds with the roughly 1:1 ratio of a Paleolithic diet composed of meats, fruits, and vegetables. Depending on the amount of milk in the diet, the resulting overall dietary ratio can go as high as 4:1 or 5:1. Such a high ratio depletes magnesium stores, which in turn increases the risk of coronary heart disease, since magnesium helps to lower levels of blood lipids (cholesterol), lower the potential for cardiac arrhythmias, lower the oxidation of LDL and VLDL cholesterol (oxidation of cholesterol has been linked to atherosclerosis), and prevent hyperinsulinism. (More about hyperinsulinism shortly below.)
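To make the ratio arithmetic concrete: blending a roughly 1:1 baseline diet with a 12:1 food pulls the overall ratio toward dairy as intake rises. The milligram figures below are hypothetical round numbers chosen only to show the mechanics of the calculation, not measured values from the studies cited.

```python
def overall_ratio(base_ca, base_mg, dairy_ca, dairy_mg):
    """Overall dietary Ca:Mg ratio after combining two food groups (all mg/day)."""
    return (base_ca + dairy_ca) / (base_mg + dairy_mg)

# Hypothetical baseline: meat/fruit/vegetable diet supplying Ca and Mg at ~1:1.
BASE_CA, BASE_MG = 600.0, 600.0

# Hypothetical dairy figures: ~1200 mg Ca and ~100 mg Mg per liter of milk (~12:1).
CA_PER_L, MG_PER_L = 1200.0, 100.0

for liters in (0, 1, 2, 3):
    r = overall_ratio(BASE_CA, BASE_MG, liters * CA_PER_L, liters * MG_PER_L)
    print(f"{liters} L milk/day -> overall Ca:Mg about {r:.1f}:1")
```

With these illustrative numbers, heavy milk intake pushes the blended ratio into the 4:1-5:1 range described above, even though the non-dairy portion of the diet stays at 1:1.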

Saturated fat. Milk has also been linked to coronary heart disease because of its very high saturated fat content.

Molecular mimicry/autoimmune response issues. Additionally, autoimmune responses are being increasingly recognized as a factor in the development of atherosclerosis. In relation to this, research has shown milk to cause exceptionally high production of certain antibodies which cross-react with some of the body's own tissues (an autoimmune response), specifically an immune response directed against the lining of the blood vessels. This process is thought to lead to atherosclerotic lesions, the first step that paves the way for consequent buildup of plaque.

[See Part 2 of Loren Cordain, Ph.D.'s posting of 10/9/97 to the PALEODIET list (relayed to the list and posted by Dean Esmay) for details and references relating to the above points about dairy consumption.]

Signs of evolutionary mismatch between
grains and human physiology

*** "Nobody yet, at least so far as I can tell, really knows whether or not the observed genetic changes relating to the spread of milk-drinking and grain-consumption are enough to confer a reasonable level of adaptation to these foods among populations who have the genetic changes, and the picture seems mixed."
The most succinct addendum to this assessment: not any more, as we have seen above with milk. Where grains are concerned, several similar problems have since come to light. To list a few:

Certain wheat peptides appear to significantly increase the risk of diabetes through molecular mimicry of the body's own tissues, leading to autoimmune responses destructive of cells that produce insulin. [See Loren Cordain, Ph.D.'s post of 6/23/97 on the PALEODIET listgroup for reference citations, and also his article on this site about the evolutionary discordance of grains and legumes in the human diet, for details.]

Increasing amounts of research suggest that celiac disease is probably also caused by autoimmune responses generated through molecular mimicry by certain peptides in wheat and other grains (known collectively as "glutens"). Additional studies on autoimmune problems have led some researchers to believe numerous additional chronic conditions are also traceable to autoimmune responses generated by the glutens in grains. [See Ron Hoggan's various postings in the archives of the PALEODIET listgroup for information and reference citations about this.]

It is well documented that the phytates in grains bind the minerals iron, zinc, magnesium, and calcium, impairing their absorption, which can disturb bone growth and mineral metabolism, among other problems. Antinutrients in grains also interfere with vitamin D metabolism, which at sufficient levels of intake can lead to rickets. Grain consumption also generates biotin deficiency in experimental animal models, and lack of biotin impairs fatty acid synthesis. [See Loren Cordain's post of 10/1/97 on PALEODIET for a brief summary and references pertaining to the preceding points, as well as the article on the evolutionary discordance of grains and legumes mentioned in the first bullet point just above.]

Hyperinsulinism and excess carbohydrate consumption. Lastly, and most importantly, significant grain consumption considerably increases the carbohydrate content of the diet, and excessive carbohydrate consumption is the primary factor driving what has come to be known as the hyperinsulinism syndrome. Hyperinsulinism and its associated constellation of symptoms, collectively known as "Syndrome X," are not yet well accepted in the medical and mainstream nutritional communities, but recent research has increasingly pointed to hyperinsulinism as a potential common underlying factor in the development of many "diseases of civilization."

The etiology of hyperinsulinism leading to the symptomatology of Syndrome X is as follows:

All carbohydrates, from whatever source (natural or artificial), whether simple or complex, are ultimately broken down into glucose, or blood sugar. (And remember here that grains contribute the largest percentage of carbohydrates in most modern diets.)

For glucose to be taken up by the cells of the body and used as fuel, the hormone insulin must be secreted by the pancreas.

When chronically excessive levels of carbohydrates are eaten, insulin is overproduced, and the body eventually becomes dulled--less responsive--to insulin. This can become a vicious circle: since the body is less responsive to insulin, more must be produced, which further dulls sensitivity, leading the body to produce still more.

Biomarkers indicating hyperinsulinism. High levels of insulin have been correlated with high blood pressure, high cholesterol, and high triglycerides. If the relationship is causative, then by extension hyperinsulinism would presumably also contribute to the health problems these symptoms themselves lead to (i.e., heart disease, etc.). High insulin levels also promote obesity, since in addition to enabling glucose to be used as fuel, insulin promotes fat storage while inhibiting the burning of fat.

Hyperinsulinism and diabetes. The foregoing constellation of symptoms constitutes what has come to be called "Syndrome X." The extreme end of Syndrome X is probably Type II diabetes, in which the body has become so insulin-resistant it can no longer control its blood sugar levels, or even Type I diabetes, in which the pancreas overloads and is no longer able to produce insulin at all.

Recent studies indicate diets higher in protein reduce symptoms of Syndrome X. Dovetailing with this research on Syndrome X are findings that diets higher in protein--or diets that lower carbohydrate levels by substituting animal protein--improve blood lipid (cholesterol) profiles by lowering LDL, VLDL (the "bad" cholesterols), and total cholesterol, while raising HDL ("good" cholesterol) and improving other symptoms of Syndrome X. Conversely, repeated studies show that the low-fat, high-carbohydrate diets popular today do the opposite. [See Loren Cordain, Ph.D.'s posting of 3/26/97 to the PALEODIET list, relayed to the list by Dean Esmay, for reference citations on this.] Note particularly that these dietary changes that improve hyperinsulinism parallel the macronutrient composition of the evolutionary diet of humans during Paleolithic times--i.e., fairly low carbohydrate intake combined with relatively higher levels of fat and protein, due to the prevalence of animal flesh in the diet and the more limited availability of high-sugar fruits.