A series of three scientific papers was published in the Proceedings of the National Academy of Sciences (1-3) evaluating the diets of numerous species of fossilized hominins (bipedal, or upright-walking, apes) that lived in Africa from 4.1 to 1.4 million years ago. Additionally, the diet of a grass-eating baboon was examined (4). Many of the authors of these papers are friends and colleagues whose data contribute to our understanding of our remote African ancestors’ diets. Collectively, the following hominin genera, and the time frames in which they lived, were examined: Australopithecus (circa 4 million years ago [MYA]), Kenyanthropus (circa 3-3.6 MYA), Paranthropus (circa 2.5-1.4 MYA), and early Homo (circa 2.3-1.5 MYA).

Before I get into the details of these studies, let me first openly reprimand some of the popular press who have incorrectly interpreted these studies by suggesting that our distant ancestors were regular consumers of grass and grass seeds (cereal grains). For instance, popular blogger Carrie Arnold titles her write-up (5) of these three scientific studies “Even Our Ancestors Never Really Ate the ‘Paleo Diet’” and goes on to say, “Researchers are just beginning to understand what ancient humans ate, and these recent studies show that grasses and grains have been part of the human diet for millions of years.” As I will shortly show you, this statement represents sensationalistic journalism and is patently false, as nowhere in any of these three papers (1-3) is this conclusion reached by any of the authors.

Another piece of inaccurate and hyped journalism (6), by author Chris Joyce at NPR, is titled “Grass: It’s What’s For Dinner (3.5 Million Years Ago)”. Joyce tells us, “What the tale of the teeth reveals is this: About 3.5 million years ago, our ancestors started switching from the ape diet – leaves and fruit – to grasses and grass-like sedges.” This statement is false, and again, nowhere in any of these three papers (1-3) is this assumption made by the scientists who wrote these manuscripts. Joyce finally gets it right in his following statement: “Now, one thing this carbon isotope technique can’t tell is whether Australopithecus just grazed like a bunch of antelope, or whether they ate the antelope that did the grazing”. However, in his final paragraph his conclusion is again erroneous, when he tells us, “So, what to make of this? Well, for one, those who favor a “Paleo diet” that resembles what our early ancestors lived on might consider investing in a lawn mower. After all, lawn grass is probably America’s largest un-harvested crop – there’s plenty to go around. Why not go back to our roots?”

Catherine Griffin, a writer for Science World Reports, obviously did not carefully read any of these three papers (1-3), judging by the incorrect statements she makes in her brief article (7), “Human Ancestors’ Ape-like Diet Changed 3.5 Million Years Ago to Grass”. Griffin informs us, “Feel like eating some grass? Didn’t think so – but our ancient ancestors did. About 3.5 million years ago, our human forebears added tropical grasses and sedges to an ape-like diet of leaves and fruits from trees and shrubs”. She goes on to make other statements, such as “In the end, the scientists found a surprising increase in the consumption of grasses and sedges” and “The earliest ancestors that consumed substantial amounts of grass foods . . .”, that were never made in the original scientific papers.

In science the devil is almost always in the details. Accordingly, all three of these popular science writers have done their readers a disservice by inaccurately reporting the details of these three studies (1-3) and making assumptions about ancient hominin diets that the scientists themselves did not make.

In all three papers, measurements of two stable isotopes of carbon (13C and 12C) were made from samples of tooth enamel of extinct hominins. From the 13C/12C ratio, a difference (delta) value (δ13C) is calculated relative to a standard value (8). δ13C values can then be used to determine whether the carbon isotopes in the enamel ultimately originated from plants using the C3 or the C4 photosynthetic pathway.
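For readers who want to see the arithmetic, the conventional delta calculation can be sketched as follows. This is an illustration only: the VPDB standard ratio is the commonly used reference value, and the sample ratios are invented for demonstration, not data taken from these papers.

```python
# Illustrative sketch of the delta-13C calculation described above.
# The VPDB standard ratio is the conventional reference; the sample
# ratios below are invented for demonstration, not data from these papers.

R_VPDB = 0.0111802  # approximate 13C/12C ratio of the VPDB standard

def delta_13c(r_sample: float, r_standard: float = R_VPDB) -> float:
    """Return delta-13C in per mil (parts per thousand) for a 13C/12C ratio."""
    return (r_sample / r_standard - 1.0) * 1000.0

# C4-derived carbon is relatively enriched in 13C, so it yields a less
# negative delta-13C value than C3-derived carbon.
print(round(delta_13c(0.01105), 1))  # a 13C-depleted (C3-like) sample
print(round(delta_13c(0.01117), 1))  # a 13C-enriched (C4-like) sample
```

The standard itself evaluates to 0‰ by construction, and more 13C-enriched samples produce less negative values, which is the basis for distinguishing C3 from C4 dietary signatures.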

In Africa and elsewhere, C4 plants include grasses and sedges and little else, whereas C3 plants include trees, shrubs, herbs and bushes. C4 plants incorporate relatively more 13C into their tissues during photosynthesis than do C3 plants. Hence, δ13C values extracted from enamel can reveal the dietary source of the isotopic signature, be it: 1) grasses and sedges; 2) trees, bushes, shrubs and herbs; or 3) a combination of both categories of plants.

Unfortunately, a number of fundamental limitations exist when using δ13C analysis to evaluate diet. δ13C measurements cannot determine the exact species of either C3 or C4 plants that were consumed but, more importantly, δ13C values cannot distinguish whether the C3 or C4 signatures originated from the direct consumption of plants or from the indirect consumption of animals that consumed these plants. This essential concept bears emphasizing. In all three studies (1-3), this crucial point was brought out again and again by the authors. Apparently, the popular science writers covering these papers missed it. The data from all three papers (1-3) corroborate the increasing body of literature (8) demonstrating an increased C4 signature in the enamel of African hominins starting about 3.5 MYA, but whether it resulted from increased consumption of animal foods, plant foods, or both is unknown. The authors of one of these three scientific papers (1) put it best: “The 13C-enriched resources that hominins ate remain unknown and must await additional integration of existing paleodietary proxy data and new research on the distribution, abundance, nutrition and mechanical properties of C4 (and CAM) plants.”

I would like to point out a number of logical shortcomings with any interpretation of the hominin C4 data suggesting that it originated primarily from increased consumption of grass leaves, grass seeds (cereal grains) or sedges rather than from consumption of animals (grazers) that ate these plants. The point in time (~3.5 MYA) at which the C4 signature begins to increase coincides with the earliest known use (before 3.39 MYA) of stone tools to cut flesh from animal carcasses and to extract marrow from their bones (9). Such hominin dietary practices have also been documented by 2.5 MYA (10) and appear to have been widely employed by 2.0 MYA (11) and 1.5 MYA (12). Hence, by triangulating these indisputable archaeological facts with the stable carbon isotope data, it is virtually certain that δ13C values in hominin enamel were enriched partially, or perhaps mainly, by increasing consumption of animals that ate C4 plants.

Other lines of evidence indicate that early African hominins were consuming increasingly more animal foods during the same time interval (3.5 MYA to 1.5 MYA) in which δ13C became enriched. Aiello and Wheeler (13) have shown that the mass of the human gastrointestinal tract is only about 60% of that expected for a similar-sized primate. Consequently, the increase in brain size that occurred in hominins starting ~2.5 MYA was balanced by an almost identical reduction in the size of the gastrointestinal tract (13). The selective pressures that simultaneously allowed for both a reduction in gut size and an increase in brain size are attributed to an improvement in dietary quality (DQ) that occurred largely as a result of increased consumption of animal foods by Australopithecine species prior to the emergence of the first members of Homo (13-15). Because a diet with a higher DQ contains fewer structural plant parts and more animal material (16), its nutrient and energy density is greater. Hence the greater DQ of animal foods permitted relaxation of the selective pressures that had formerly maintained a large, metabolically active gut necessary to process low-DQ foods, which in turn permitted the natural selection of a large, metabolically active brain (13, 14). Grass leaves and seeds maintain a low DQ (15): they are high in fiber and cellulose and are indigestible in their raw, unprocessed state in modern humans (17). Accordingly, the proposition that increased consumption of grass leaves and seeds was the C4 source in hominin enamel is inconsistent with the evolutionary gut/brain metabolic tradeoff (13-15). Selective pressures that reduce the size and metabolic activity of the gut require more energetically dense foods like meat and marrow – not energy-poor, high-cellulose, high-fiber foods like grasses and sedges.

In addition to their low DQ, grass leaves and seeds are devoid of the long chain fatty acids of both the omega 6 family (arachidonic acid, 20:4n6) and the omega 3 family (docosahexaenoic acid, 22:6n3), as are all plant foods (15). These fatty acids are structural elements required for the synthesis of brain and neural tissues and cannot be produced endogenously in sufficient quantities to relax the selective pressures normally constraining encephalization (brain volume expansion relative to body weight). Therefore, exogenous sources of these two fatty acids had to be obtained through the diet for hominins to evolve large, metabolically active brains (15, 18-21). Likely candidate animal foods which simultaneously increased the DQ and provided arachidonic acid (AA) and docosahexaenoic acid (DHA) were scavenged, de-fleshed long bones (which contain marrow – a high-fat food) and skulls (which contain brains – high in AA and DHA) from carnivore kills (15). These foods, along with meat from grazing animals, likely represent the dominant dietary source of the increasing C4 signature in our African ancestors.

Another nutritional point lends little support to the notion that the increasing C4 signature in hominins starting 3.5 MYA resulted from direct consumption of grass leaves or seeds. All apes (chimps, gorillas, orangutans and gibbons) living in their native environments bear δ13C values indicative of near-total reliance upon C3 plants. Only a single higher primate, a baboon species (Theropithecus gelada), consumes grass leaves and seeds as its primary dietary source. Accordingly, this baboon maintains a carbon isotopic signature that is nearly 100% C4-derived (4).

High reliance upon grass and grass seeds in Theropithecus gelada, or in any hominin, requires a number of evolutionary adaptations in the digestive tract to accommodate these low-quality, high-cellulose foods – none of which have been observed in contemporary humans. All vertebrates lack cellulase, the enzyme required to break down the cellulose and hemicellulose found in grass leaves and seeds into glucose. Mammals that rely heavily upon grass and grass seed consumption for their sustenance have evolved large caecums (hindguts) or a four-compartment stomach (ruminants) containing enormous quantities of microflora with the capacity to ferment and break down cellulose, hemicellulose, starches and proteins into simpler compounds which can then be assimilated and metabolized by the host animal. Theropithecus gelada, the grass-eating baboon, has evolved a large hindgut where microbial fermentation of grass takes place (22). In contemporary humans, and in the hominin line that led to Homo, there is no credible evidence that gut morphology became larger and more metabolically active to support fermentation of cellulose in the caecum – rather, the opposite occurred (13, 14). Hence, without the evolution of hindgut fermentation, efficient consumption of grass and grass seeds would have been impossible in any hominin species.

Other comparative physiological data between modern humans and the grass-eating baboon (Theropithecus gelada) support the notion that the increasing C4 signature in evolving African hominins was not a result of grass or sedge consumption. Dicots, or C3 plants, produce compounds called tannins which act as a chemical defense system to discourage animals from eating them. Monocots, or C4 plants (such as grasses and sedges), do not synthesize tannins (23). Over the course of evolution, mammals that consume tannin-containing C3 plants have evolved measures to counter the adverse effects of tannins. The most important of these mechanisms are salivary proteins that act as a defense against dietary tannins (24). These proline-rich salivary proteins (PRPs) bind tannins and form stable complexes which prevent tannins from producing adverse health effects (24-27).

Species that usually ingest tannin-containing foods as part of their natural diets produce high levels of PRPs, whereas species not exposed to tannins produce little or no PRPs (24). In this regard, the saliva of the grass-eating (C4) baboon (Theropithecus gelada) is devoid of PRPs (23). In contrast, modern humans synthesize saliva containing abundant concentrations of PRPs (25-27), which has been suggested to result from the long evolutionary history of fruit and vegetable (C3 plant) consumption in human ancestors (25). If ancestral African hominins had intensely exploited C4 plants (grasses and sedges) for millions of years, then it might be expected that the line of hominins that led to Homo and modern humans would also maintain low concentrations of salivary PRPs, similar to Theropithecus gelada. Data in contemporary Homo sapiens do not support this conclusion.

In summary, recent comprehensive analyses (1-3) of δ13C values in the enamel of African hominins from 4.1 to 1.5 MYA support the conclusion that plants of C4 origin were ultimately responsible for this isotopic signature. Nevertheless, when the isotopic data are triangulated with archaeological, physiological and nutritional evidence, it is apparent that the C4 signature in ancestral African hominin enamel almost certainly resulted from increased consumption of animals that consumed C4 plants.


Several recent epidemiological studies have associated red and processed meat intakes with increased risks of total mortality, heart failure, diabetes and cancer. Statements such as “men with the highest average intake of red meat (almost 10 servings per week) were at a 24 per cent higher risk of heart failure than men with the lowest average weekly intakes” make newspaper headlines every day and make everyone aware of the potential adverse health effects of red and/or processed meats. These apparently impressive results are often used by the media to alarm the public and catch attention. However, several key points must be considered before making general population recommendations from these epidemiological studies.

In one of these recent studies, “Meat intake and mortality: a prospective study of over half a million people” (Sinha et al.) [1], the authors’ main conclusion was that “red and processed meat intakes were associated with modest increases in total mortality, cancer mortality, and cardiovascular disease mortality”. This was a prospective study of, as the title indicates, more than half a million people, aged 50-71 years at baseline and followed from 1995 to 2005, in which meat intake was estimated with a food frequency questionnaire and analyzed according to quintiles of red meat, white meat and processed meat intake. A casual look at this study leads one to believe that red meat was analyzed separately from processed meat, but this is not the case: the red meat group also included processed red meats from grain-fed animals.

Red meat intake ranged from 9.3 g/1,000 kcal in the first quintile to 68.1 g/1,000 kcal in the fifth quintile. The multivariate model was adjusted for several dietary and lifestyle covariates: age, education, marital status, family history of cancer, race, body mass index, 31-level smoking history, physical activity, energy intake, alcohol intake, vitamin supplement use, fruit consumption, vegetable consumption and, among women, menopausal hormone therapy. Notice that the authors assumed the saturated fat (SAFA) and cholesterol content of red meat would be responsible for the development of cardiovascular disease (CVD) and cancer, hence they may have been biased in their hypothesis. Such preconceived ideas frequently influence the selection of adjustment parameters, leaving other, potentially more important factors ignored.

Relative and absolute risks

The results of “Meat intake and mortality: a prospective study of over half a million people” (Sinha et al.) [1] showed that the maximum relative increase in all-cause mortality for men, that between the extreme quintiles of red meat intake, was apparently 31%. Also, according to [6], “if men and women in the studied age group (50-71) would reduce their intakes of red meat to that of the group with the lowest intake, then the mortality risk is expected to be reduced by 11% in men and 16% in women over the observed period of time. The portion of cardiovascular disease mortality of total mortality could be reduced by 11% for men and 21% for women. By reducing the intake of processed meat, the cardiovascular disease mortality for women over the period of study of 10 years could be reduced to 20%.”

At first glance, all these statistics look very impressive, and thus convincing, but they are relative measures that do not take into account the underlying risk in the observed population. Strictly speaking, the ratios in the Sinha et al. study are hazard ratios estimated by Cox proportional hazards regression; like relative risks, they ignore the absolute rates involved. Although ratio measures are commonly reported in the medical literature, the underlying absolute risks are not. In a review of ratio measures in six major medical journals, Schwartz et al. [12] found that “the underlying absolute risks were often difficult to access or were missing altogether”. These researchers explain that “the lack of accessibility of these fundamental data may well lead journal readers (doctors, policy makers, journalists, and patients) to have exaggerated perceptions of the reported effect sizes”.

Absolute risks and the so-called “numbers needed to harm/treat” provide a much better picture of the (dis)advantages of a certain food intake and/or lifestyle change, in this case red meat intake. When it is said that reducing red meat intake to that of the group with the lowest intake would reduce the (relative) mortality risks by 11% in men and 16% in women, we estimate that the corresponding absolute risk reductions would be approximately 2.1% and 1.4%. As concerns total mortality, we estimate that the equivalent absolute risk difference is about 4% from the lowest to the highest quintile of red meat intake. Similarly, the maximum absolute risks of cancer and CVD deaths were even smaller, probably close to 1.3% and 1.0% respectively. Note also that these are merely statistical/observational associations; establishing causality requires that a plausible biological/biochemical mechanism be identified in carefully designed interventional studies. Such studies proving causality for red meat intake simply do not exist to date.
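The conversion from relative to absolute terms is simple arithmetic, sketched below with deliberately hypothetical round numbers (not figures taken from Sinha et al.) to show how a headline-sized relative reduction can shrink to a few percentage points in absolute terms.

```python
# A hedged, illustrative sketch of the relative- vs absolute-risk arithmetic
# discussed above. The baseline risk is a hypothetical round number, NOT a
# figure from Sinha et al.; the point is only to show how a large relative
# risk reduction can translate into a small absolute one.

def absolute_risk_reduction(baseline_risk: float, relative_reduction: float) -> float:
    """ARR = baseline risk x relative risk reduction."""
    return baseline_risk * relative_reduction

def number_needed_to_treat(arr: float) -> float:
    """NNT (or 'number needed to harm') = 1 / ARR."""
    return 1.0 / arr

# Suppose (hypothetically) that 19% of men in the highest red-meat quintile
# die during follow-up, and moving to the lowest quintile cuts that risk
# by a relative 11%.
arr = absolute_risk_reduction(0.19, 0.11)   # ~0.021, i.e. about 2.1 points
nnt = number_needed_to_treat(arr)           # ~48 men per death "averted"
print(f"ARR = {arr:.3f}, NNT = {nnt:.0f}")
```

An 11% relative reduction thus sounds far larger than the roughly two-percentage-point absolute change it implies under this assumed baseline, which is precisely the distortion the paragraph above describes.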

Despite these unfavorable statistical associations between red meat intake and several unhealthy conditions that lead to increased mortality, even if direct biological causality existed, these results might still not be very relevant for most people – those in the many countries where red meat intakes fall in the lower quintiles of this American study. For example, in the case of Germany, as discussed in a review by the Federal Institute for Risk Assessment, the average intake of red meat for men lies between the 2nd and 3rd quintiles of the US data, and that of women between the 1st and 2nd quintiles [6]. So the current absolute amounts of red meat intake in each country should also be taken into account before assuming that recommending eating less red meat would provide any relevant mortality risk reduction.

Questionnaire inaccuracies

According to Fraser [14], “the potential correlations between nutrients, and to a lesser extent foods, make it difficult to know whether the nominated variable is actually the active principle or whether there is some other dietary risk factor that is closely associated. It is not generally recognized that all traditional analyses of this sort are based on a powerful but incorrect assumption: that there are no errors in dietary assessment. If the incorrect assumption is not satisfied, relative risk estimates become distorted—reduced by one-half or more in some cases.”
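Fraser's point about the hidden assumption of error-free dietary assessment can be illustrated with a toy simulation of "regression dilution": random error in a measured exposure biases the estimated exposure-outcome slope toward zero. All numbers here are invented for illustration.

```python
# A small, illustrative simulation of the attenuation Fraser describes:
# random noise in a dietary questionnaire answer biases the estimated
# exposure-outcome slope toward zero ("regression dilution").
# All quantities are made up for demonstration.

import random

random.seed(42)
n = 10_000
true_slope = 1.0

true_intake = [random.gauss(0, 1) for _ in range(n)]
reported    = [x + random.gauss(0, 1) for x in true_intake]   # noisy FFQ answer
outcome     = [true_slope * x + random.gauss(0, 1) for x in true_intake]

def ols_slope(xs, ys):
    """Simple least-squares slope of ys regressed on xs."""
    mx = sum(xs) / len(xs)
    my = sum(ys) / len(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var = sum((x - mx) ** 2 for x in xs)
    return cov / var

# With equal signal and noise variance, the expected slope is halved.
print(round(ols_slope(true_intake, outcome), 2))  # close to 1.0
print(round(ols_slope(reported, outcome), 2))     # close to 0.5
```

With measurement noise as large as the true signal, the estimated effect is cut roughly in half, matching Fraser's "reduced by one-half or more in some cases".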


According to Chris Masterjohn, a researcher affiliated with the Weston A. Price Foundation, because of important questionnaire inaccuracies the Sinha et al. study found “a correlation between increased mortality and a population's propensity to report eating meat, not a correlation between mortality and true meat intake”. In an extensive blog article, Masterjohn explains that the food frequency questionnaire (FFQ) contained 124 questions, each about a particular food or food group. Participants were asked how often they had consumed those foods over the course of the previous year, with ten options to choose from. They were then asked how large a serving size they consumed, usually with three or four options. Sometimes additional instructions were given, like including sandwiches in some cases or excluding them in others.

The problem is that any individual trying to quantify his or her average intake of 124 foods over an entire year is going to have to engage in a lot of guesswork. Even 24-hour recalls of what a person ate the day before are subject to a great deal of error. For this reason, researchers will commonly "validate" an FFQ or a 24-hour recall to test whether it accurately measures the intake of the foods of interest. To do this, they have the participants keep a weighed dietary record, in which they meticulously weigh everything they eat with a dietetic scale and record it as they prepare each meal. The researchers then compare the FFQ or 24-hour recall to the weighed dietary record, taking the weighed dietary record as the best indicator of true dietary intake.

According to Masterjohn’s article, this type of validation was not applied; instead, a 24-hour recall was used. With this simplified procedure, “the author's validation study found that the true intake of protein, carbohydrate, fat, cholesterol, fiber, vitamins, minerals, fruits, and vegetables could explain between 5 percent and 45 percent of the variation in the participants' answers on the FFQ, but they never validated the FFQ's ability to predict the true intake of meat”. Working with this type of questionnaire, other researchers have found that FFQs predict the true intake of some foods very well and the true intake of other foods very poorly. The ability of FFQs to predict the true intake of meats is probably very poor. When certain foods are, in a given cultural context, socially and emotionally charged, participants are more likely to lie about their intake of those foods, or more likely to deceive themselves about how much of them they really consume.

As an example of the inevitable inaccuracies of FFQs, consider what the researchers who validated the Nurses' Health Study FFQ had to say: “Focusing on the second questionnaire, we found that butter, whole milk, eggs, processed meat, and cold breakfast cereal were underestimated by 10 to 30% on the questionnaire. In contrast, a number of fruits and vegetables, yoghurt and fish were overestimated by at least 50%. These findings for specific foods suggest that participants over-reported consumption of foods often considered desirable or healthy, such as fruit and vegetables, and underestimated foods considered less desirable. This general tendency to over-report socially desirable foods, whether conscious or unconscious, will probably be difficult to eliminate by an alteration of questionnaire design.”
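To see how such systematic misreporting distorts comparisons, here is a back-of-the-envelope sketch. The intakes are hypothetical; the misreporting factors (20% under-reporting of processed meat, taken from the middle of the quoted 10-30% range, and 50% over-reporting of vegetables) follow the validation quote above.

```python
# Illustrative arithmetic for the misreporting pattern described in the
# Nurses' Health Study validation quote. The intakes are hypothetical;
# the 20% and 50% factors follow the quoted validation findings.

true_processed_meat = 100.0   # g/day, hypothetical
true_vegetables     = 300.0   # g/day, hypothetical

reported_meat = true_processed_meat * (1 - 0.20)   # under-reported by 20%
reported_veg  = true_vegetables * (1 + 0.50)       # over-reported by 50%

# The reported vegetable-to-meat ratio is inflated relative to the true one,
# so meat eaters look "less healthy" on paper than they really are.
print(true_vegetables / true_processed_meat)   # true ratio: 3.0
print(reported_veg / reported_meat)            # reported ratio: 5.625
```

Even modest, socially driven misreporting nearly doubles the apparent vegetable-to-meat ratio, illustrating why correlations built on FFQ answers can diverge sharply from correlations with true intake.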

Confounding factors (and results)

Should we consider these results meaningful enough to establish solid recommendations for the general population? Are all these epidemiological data sufficiently adjusted for the several lifestyle confounding factors that exist, and thus trustworthy? When dealing with large populations it is difficult to control for all possible causal factors and, although the results are adjusted for several factors, others cannot be ruled out, as the authors themselves state. For example, the way meat is produced and cooked may affect the production of carcinogenic compounds such as heterocyclic amines, polycyclic aromatic hydrocarbons, nitrates, nitrites and N-nitroso compounds [15]. Other factors which could be involved in the potential disease-promoting properties of meat, depending on the way it is produced and/or cooked [16], are the time meat stays in the intestines, fruit and vegetable consumption, hormone residues and salt.

Individuals consuming red and processed meats also have lifestyle characteristics that may significantly affect mortality, such as education, physical activity, smoking, alcohol use, adiposity and fruit/vegetable intake. According to Mozaffarian [2], the model of Sinha et al. did not adjust for parameters like income, air pollution exposure, higher intake of high-glycemic-index starches, sugars and processed foods, or lower intake of dietary fiber, whole grains, nuts, seeds and legumes. There was also no record of personal history of CVD or related conditions like hypertension, diabetes or dyslipidemia, nor of the use of medications, and thus these factors were likewise not included in the multivariate analysis [4]. As noted above, the authors assumed that the SAFA and cholesterol content of red meat would be related to CVD but, despite this, they do not report differences in saturated fat intake between quintiles, or whether saturated fat was included at all in the multivariate analysis.

Given the known relationship between glycated hemoglobin (HbA1c), diabetes and heart disease, without information on carbohydrate intake (modern high-glycemic-index starches and sugars tend to raise HbA1c) this study could not evaluate whether it was the possibly higher amounts of carbohydrates, consumed along with the larger portions of meat, that raised mortality. Also, since high blood sugar tends to suppress the immune system while feeding glucose to cancer cells, this might help explain the link found between increased cancer deaths and higher red meat intakes. Other similar observational studies, in which meat intake was more accurately estimated, did not find such an association.

As Mozaffarian [2] notes in his comments, in such a large population, with broad social/economic, geographic, ethnic/cultural and lifestyle diversity, adequate control for confounding factors assumes even greater importance. In fact, the small observed risk differences (e.g., relative risks of 1.1-1.3) that are rendered statistically significant by the large cohort are precisely those most susceptible to bias. In such cases, the use of a “negative control” – an outcome for which the exposure has no plausible mechanism – is recommended; the multivariate model should be adjusted/calibrated until the observed variables show no association with the negative control. In the Sinha et al. study such a negative control actually exists: the “all other deaths” category. Since the association of meat intake with this category was the strongest found in the study, this strongly suggests insufficient adjustment for residual confounding.

Healthy cohort effect

Regarding other factors involved in the mortality associated with meat intake, it is worth noting that, in this Sinha et al. study, the participants in the top quintile of red meat consumption were 3 times more likely to smoke, half as likely to exercise regularly, much less likely to have a college degree, were substantially heavier and also had higher caloric intakes than low red-meat consumers [3].

Furthermore, people who believe that red meat is unhealthy commonly adopt other lifestyle factors that favor their health, like exercising more, not smoking, not drinking alcohol, not frying food and taking supplements. In summary, cause and effect cannot be deduced from observational studies, as cancer and CVD are multifactorial diseases in which several genetic and environmental factors are involved. If some relevant (perhaps unknown) confounding factors are not adjusted for in the multivariate model, then the risk factors are often overestimated. Large samples such as this one, with great ethnic, socioeconomic, geographic, cultural and lifestyle diversity, need to be carefully controlled for many confounding factors.

Also, when health authorities decide that a certain food and/or lifestyle behavior is healthy, from that point on it becomes increasingly difficult to measure its impact on public health with epidemiological studies. This happens because health-conscious people tend to gravitate toward the official, mainstream recommendations. As Dr. Stephan Guyenet, a researcher at the University of Washington, explains in his blog Whole Health Source, “if a theory manages to become implanted early on, it will become a self-fulfilling prophecy as healthy, conscientious people adopt the behavior and are detected by subsequent observational studies. People who don't care about their health or aren't motivated enough to make a change will keep living how they used to, and that will also be detected”.

Dr. Guyenet then adds that “you can't measure all the little things that accompany a health-conscious lifestyle. Do the participants take the stairs or the elevator? Do they take supplements, and if so, which ones? How much sunlight do they get? Do they have positive relationships with their friends and family? (…) What is the quality of the foods they buy? How often do they visit the doctor, and how often do they follow her advice? I believe there are too many confounds to measure and correct for. In my opinion, this means that observational data gathered from populations that already have opinions about the factor you're trying to study are unreliable and will tend to reinforce prevailing notions.”

Evolutionary evidence above epidemiological uncertainties

As a whole, epidemiological/observational studies can only provide statistical associations, and an epidemiological association does not equal causation. As we well know, epidemiological studies very often show an effect and randomized controlled trials then show no association whatsoever, or even the opposite. This is not to say such studies are worthless, but by themselves we believe they are not sufficient to establish solid recommendations and dietary guidelines for the general population.

Given all these epidemiological limitations, it is worthwhile not to forget the most powerful paradigm in human health: evolution. It is strange that a food which has been part of the human diet for 2.6 [17], or even 3.4 [18], million years would now be the cause of CVD or cancer, among the other diseases associated with meat intake [19]. Our genome was shaped during the long Paleolithic era and has changed little (0.005%) since the agricultural revolution 10,000 years ago, despite an enormous change in human diet and other lifestyle factors.

Nowadays, more than 70% of the calories of the typical Western diet come from foods (cereal grains, dairy products, refined vegetable oils and refined sugars) that were unavailable to our ancestors during the Paleolithic era. This discordance between our ancient, genetically determined biology and the nutritional characteristics of the current diet may be the real cause of the so-called diseases of civilization, including CVD and cancer [17].

The writer of this article suggests that the Paleo Diet has only been scientifically tested in “one tiny study”. This claim is incorrect: five studies (1-7), four of them published since 2007, have experimentally tested contemporary versions of ancestral human diets and have found them to be superior to Mediterranean diets, diabetic diets and typical western diets with regard to weight loss, cardiovascular disease risk factors and risk factors for type 2 diabetes.

The first study to experimentally test diets devoid of grains, dairy and processed foods was performed by Dr. Kerin O’Dea at the University of Melbourne and published in the journal Diabetes in 1984 (6). In this study Dr. O’Dea gathered together 10 middle-aged Australian Aborigines who had been born in the “Outback”. They had lived their early days primarily as hunter-gatherers until they had no choice but to settle into a rural community with access to western goods. Predictably, all ten subjects eventually became overweight and developed type 2 diabetes as they adopted sedentary western lifestyles in the community of Mowwanjum in the northern Kimberley region of Western Australia. However, inherent in their upbringing was the knowledge to live and survive in this seemingly desolate land without any of the trappings of the modern world.

Dr. O’Dea asked these 10 middle-aged subjects to revert to their former lives as hunter-gatherers for a seven-week period. All agreed and traveled back into the isolated land from which they originated. Their daily sustenance came only from native foods that could be foraged, hunted or gathered. Instead of white bread, corn, sugar, powdered milk and canned foods, they ate the traditional fresh foods of their ancestral past: kangaroos, birds, crocodiles, turtles, shellfish, yams, figs, yabbies (freshwater crayfish), freshwater bream and bush honey. At the experiment’s conclusion, the results were spectacular, but not altogether unexpected given what was known about Paleo diets, even then. The average weight loss in the group was 16.5 lbs; blood cholesterol dropped by 12% and triglycerides were reduced by a whopping 72%. Insulin and glucose metabolism became normal, and their diabetes effectively disappeared.

The first recent study to experimentally test contemporary Paleo diets was published in 2007 (5). Dr. Lindeberg and associates placed 29 patients with type 2 diabetes and heart disease on either a Paleo diet or a Mediterranean diet based upon whole grains, low-fat dairy products, vegetables, fruits, fish, oils, and margarines. Note that the Paleo diet excludes grains, dairy products and margarines while encouraging greater consumption of meat and fish. After 12 weeks on either diet, blood glucose tolerance (a risk factor for heart disease) improved in both groups, but improved more in the Paleo dieters. In a 2010 follow-up publication of this same experiment, the Paleo diet was shown to be more satiating on a calorie-by-calorie basis than the Mediterranean diet because it caused greater changes in leptin, a hormone which regulates appetite and body weight.

In the second modern study (2008) of Paleo diets, Dr. Osterdahl and co-workers (7) put 14 healthy subjects on a Paleo diet. After only three weeks the subjects lost weight, reduced their waist size and experienced significant reductions in blood pressure and plasminogen activator inhibitor (a substance in blood which promotes clotting and accelerates artery clogging). Because no control group was employed in this study, some scientists would argue that the beneficial changes might not necessarily be due to the Paleo diet. However, better controlled, more recent experiments have shown similar results.

In 2009, Dr. Frasetto and co-workers (1) put nine inactive subjects on a Paleo diet for just 10 days. In this experiment, the Paleo diet was exactly matched in calories to the subjects’ usual diet. Anytime people eat calorically reduced diets, no matter what foods are involved, they exhibit beneficial health effects. So the beauty of this experiment was that any therapeutic changes in the subjects’ health could not be credited to reductions in calories, but rather to changes in the types of food eaten. While on the Paleo diet, either eight or all nine participants (depending on the marker) experienced improvements in blood pressure, arterial function, insulin, total cholesterol, LDL cholesterol and triglycerides. What is striking about this experiment is how rapidly so many markers of health improved, and that the improvements occurred in virtually every patient.

In an even more convincing recent (2009) experiment, Dr. Lindeberg and colleagues (2) compared the effects of a Paleo diet to a diet generally recommended for patients with type 2 diabetes. The diabetes diet was intended to reduce total fat by increasing whole grain bread and cereals, low-fat dairy products, fruits and vegetables while restricting animal foods. In contrast, the Paleo diet was lower in cereals, dairy products, potatoes, beans, and bakery foods but higher in fruits, vegetables, meat, and eggs compared to the diabetes diet. The strength of this experiment was its crossover design, in which all 13 diabetes patients first ate one diet for three months and then crossed over and ate the other diet for three months. Compared to the diabetes diet, the Paleo diet resulted in improved weight loss, waist size, blood pressure, HDL cholesterol, triglycerides, blood glucose and hemoglobin A1c (a marker of long-term blood glucose control). This experiment represents the most powerful example to date of the Paleo diet’s effectiveness in treating people with serious health problems.

So, now that I have summarized the experimental evidence supporting the health and weight loss benefits of Paleo Diets, I would like to directly respond to the errors in the U.S. News and World Report article.

1. “Will you lose weight? No way to tell.”

Obviously, the author of this article did not read either the study by O’Dea (6) or the more powerful three month crossover experiment by Jonsson and colleagues (9) which demonstrated the superior weight loss potential of high protein, low glycemic load Paleo diets. Similar results of high protein, low glycemic load diets have recently been reported in the largest randomized controlled trials ever undertaken in both adults and children.

A 2010 randomized trial involving 773 subjects and published in the New England Journal of Medicine (8) confirmed that high protein, low glycemic index diets were the most effective strategy for keeping weight off. The same beneficial effects of high protein, low glycemic index diets were dramatically demonstrated in the DiOGenes Study (9), the largest nutritional trial ever conducted in children, with a sample of 827 children. Children assigned to low protein, high glycemic diets became significantly fatter over the 6-month experiment, whereas overweight and obese children assigned to the high protein, low glycemic nutritional plan lost significant weight.

2. “Does it have cardiovascular benefits? Unknown.”

This comment shows just how uninformed this writer really is. Clearly, this person hasn’t read the following papers (1-6), which unequivocally show the therapeutic effects of Paleo diets upon cardiovascular risk factors. Moreover, as we have already reviewed elsewhere (10-12), high protein diets have been shown to improve dyslipidemia and insulin sensitivity, and are potentially effective strategies for improving metabolic syndrome. Furthermore, mounting evidence suggests that a reduced-carbohydrate diet (which is obviously lower in sugars and cereal grains) may be superior to a western-type low-fat, high-carbohydrate diet, especially in metabolic syndrome patients, because it may lead to better improvement in insulin resistance, postprandial lipemia, serum fasting triglycerides and HDL-C, total cholesterol/HDL-C ratio, LDL particle distribution, apo B/apo A-1 ratio, postprandial vascular function, and various inflammatory biomarkers (13, 14).

Finally, the evidence for recommending whole grains to reduce cardiovascular disease risk is based on epidemiological studies or intervention trials with soft end-points, while randomized controlled trials with hard end-points do not seem to support it. For instance, the DART study found a tendency towards increased cardiovascular mortality in the group advised to eat more fiber, the majority of which was derived from cereal grains (15). Of relevance, this non-significant effect became statistically significant after adjustment for possible confounding factors, such as medication and health status (16).

“And all that fat would worry most experts.”

This statement represents a “scare tactic” unsubstantiated by the data. As I, and almost the entire nutritional community, have previously pointed out, it is not the quantity of fat that increases the risk for cardiovascular disease, cancer, or any other health problem, but rather its quality. Contemporary Paleo diets contain high concentrations of healthful omega-3 fatty acids and monounsaturated fatty acids that actually reduce the risk for chronic disease (10-12, 17-22).

3. “Can it prevent or control diabetes? Unknown.”

Here is another example of irresponsible and biased journalism, which doesn’t let the facts speak for themselves. Obviously, the author did not read the study by O’Dea (6) or Jonsson et al. (2), which showed dramatic improvements in type 2 diabetics consuming Paleo diets.

“but most diabetes experts recommend a diet that includes whole grains and dairy products.”

If the truth be known, in a randomized controlled trial, 24 eight-year-old boys were asked to take 53 g of protein daily as either milk or meat (23). After only 7 days on the high-milk diet, the boys became insulin resistant, a condition that precedes the development of type 2 diabetes. In contrast, in the meat group there was no increase in insulin or insulin resistance. Furthermore, in the Jonsson et al. study (2), milk- and grain-free diets were shown to produce superior results in improving disease symptoms in type 2 diabetics.

Finally, in an interventional study including 2,263 postmenopausal women, participants were assigned either to a low-fat (<20% of energy), high whole-grain fiber (>6 servings per day), high fruit (>5 servings per day) and high vegetable (>5 servings per day) diet or to a comparison group given no advice. After 6 years of follow-up, those women who had diabetes at the start of the study and were allocated to the low-fat/high whole-grain fiber diet actually worsened their glucose control (24). Notwithstanding, the majority of the evidence supports the beneficial effect of soluble fiber, found mainly in vegetables and fruits, while the evidence supporting the beneficial effects of insoluble fiber, found in whole grains, is less convincing (25-28).

4. “Are there health risks? Possibly. By shunning dairy and grains, you’re at risk of missing out on a lot of nutrients.”

Once again, this statement shows the writer’s ignorance and blatant disregard for the facts. Because contemporary ancestral diets exclude processed foods, dairy and grains, they are actually more nutrient-dense (in vitamins, minerals and phytochemicals) than government-recommended diets such as the food pyramid. I pointed out these facts in a paper I published in the American Journal of Clinical Nutrition in 2005 (11), along with another paper in which I analyzed the nutrient content of modern-day Paleo diets (19). In addition, micronutrient analysis derived from the two studies performed by Lindeberg et al. (5) and Jönsson et al. (2) shows that, except for calcium, a Paleolithic-type diet not only meets all of the micronutrient DRIs, but in some cases exceeds the levels provided by diets containing whole grains and dairy foods. Regarding vitamin D, as we have already pointed out in a recent paper (12), except for fatty ocean fish there is very little vitamin D in any commonly consumed natural (that is, not artificially fortified) food, and throughout history almost all hominins (except for those living in the far North, such as the Inuit people) depended on the sun to satisfy their vitamin D requirements.

Moreover, most nutritionists are aware that processed foods made with refined grains, sugars and vegetable oils have low concentrations of vitamins and minerals, but not all have realized that dairy products and whole grains contain significantly lower concentrations of the 13 vitamins and minerals most lacking in the U.S. diet compared to lean meats, fish and fresh fruits and vegetables (11, 19). Interestingly, although micronutrient intake is important, intestinal absorption is even more impactful. It is widely known that certain antinutrients contained in cereal grains, such as phytate, bind to divalent minerals (e.g., zinc, iron, calcium and magnesium), compromising their absorption (29).

“Also, if you’re not careful about making lean meat choices, you’ll quickly ratchet up your risk for heart problems”.

Actually, the most recent comprehensive meta-analyses and reviews do not show fresh meat consumption, whether fatty or lean, to be a significant risk factor for cardiovascular disease (30-34); only processed meats such as salami, bologna, bacon and sausages are (30).

Abstract: It is increasingly recognized that certain fundamental changes in diet and lifestyle that occurred after the Neolithic Revolution, and especially after the Industrial Revolution and the Modern Age, are too recent, on an evolutionary time scale, for the human genome to have completely adapted. This mismatch between our ancient physiology and the western diet and lifestyle underlies many so-called diseases of civilization, including coronary heart disease, obesity, hypertension, type 2 diabetes, epithelial cell cancers, autoimmune disease, and osteoporosis, which are rare or virtually absent in hunter–gatherers and other non-westernized populations. It is therefore proposed that the adoption of diet and lifestyle that mimic the beneficial characteristics of the preagricultural environment is an effective strategy to reduce the risk of chronic degenerative diseases.