Can humans really do anything to prolong life? In a recent Washington Post article, Christopher Wanjek wrote: "Humans can reap no such benefits from the continuing flood of anti-aging potions and precepts, which are at best naively optimistic and at worst fraudulent and harmful." Wanjek goes on to say that "every book, powder or pill that promises a fountain of youth ... is just plain wrong."

"There is no intervention that has been proven to slow, stop or reverse aging. Period," says Leonard Hayflick, professor of anatomy at the University of California, San Francisco, and a leader in the study of aging. Even New Age physician Deepak Chopra has chimed in, saying that legitimate anti-aging remedies can only keep a person from dying young; they don't increase the life span.1

Human growth hormone, widely extolled on radio infomercials across America for its anti-aging properties, may produce side effects, including hormone elevations that can trigger the growth of tumors. Growth hormone is more appropriate for very old adults who have lost muscle mass and can no longer get up out of a chair, not for middle-aged adults fighting the first signs of aging.2

So is there anything that adults can do to lengthen the human life span? Scientists want you to wait (as if everybody has the time) for gene therapy. The latest breakthrough is the so-called Methuselah gene, a portion of DNA that confers healthy old age on those who carry its active form.3 But don't wait around for genetics to prolong life. Gene therapy has yet to cure any disease, is likely to be too costly for the average person to afford, and any new genes are more likely to be inserted into the offspring of the next generation than into adults living today.

Why wait for an antioxidant breakthrough?

Thomas Johnson at the University of Colorado-Boulder is looking in another direction. Johnson found that by tweaking a certain gene in roundworms he could create a super-antioxidant gene that doubled the worm's life span.4

But while researchers conduct their antioxidant studies on roundworms, what can we do? Actually, youth seekers need look no further than the vitamin shelf at local stores for a well-substantiated anti-aging compound: vitamin C. While researchers attempt to make careers out of their research and thus delay any conclusions indefinitely, vitamin C may be the anti-aging miracle humanity can begin to use today. The story is not new; it's just not been widely told.

Humans have the capacity to live for hundreds of years

The good news is that there is scientific evidence that humans have the capacity to lengthen their average life span by hundreds of years. The evidence for vitamin C as a key anti-aging agent is compelling and rooted in the genetic makeup of humans. All humans are mutants. Homo sapiens, guinea pigs, monkeys, bats, some fish and many birds do not produce their own vitamin C. The rest of the animal kingdom synthesizes its own vitamin C; for those animals, ascorbic acid is a hormone, not a dietary-acquired vitamin. Animals employ different organs to produce vitamin C: some birds and reptiles use their kidneys, while perching birds and mammals make vitamin C in their liver.5

Humans once made vitamin C in their liver through the action of four enzymes that convert circulating sugars into ascorbic acid (vitamin C). Humans today make only three of the four enzymes required to convert glucose (sugar) into ascorbic acid. A mutation at some time in past generations deactivated the gene for the enzyme gulonolactone oxidase, and as the mutation spread, the synthesis of vitamin C came to an end in humans.

Mammals that make their own vitamin C can live 8-10 times beyond their age of physical maturity. Mammals without this ability have a difficult time reaching 3-4 times that age. Researchers believe the reinstallation of the gulonolactone oxidase enzyme in humans would extend the life span to hundreds of years.
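To see what these multiples would imply, here is a small back-of-the-envelope sketch. The 3-4x and 8-10x multiples are the figures cited above; the 20-year human maturity age is an assumption added for illustration, not a figure from the article.

```python
# Back-of-the-envelope sketch of the lifespan multiples cited above.
# Assumption (not from the article): human physical maturity at ~20 years.
MATURITY_YEARS = 20

def lifespan_range(low_multiple, high_multiple, maturity=MATURITY_YEARS):
    """Implied lifespan range (years) for a given multiple of maturity age."""
    return maturity * low_multiple, maturity * high_multiple

# Mammals that cannot synthesize vitamin C: 3-4x maturity (article's figure)
print(lifespan_range(3, 4))    # (60, 80)
# Mammals that synthesize vitamin C: 8-10x maturity
print(lifespan_range(8, 10))   # (160, 200)
```

On these assumptions, the difference between the two groups works out to roughly a century or more of implied life span.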

This means that humans at one time in the past, prior to this gene mutation, lived for hundreds of years. This doesn't fit with the current evolutionary scheme of biology, which postulates that humans evolved from monkeys and that early man lived no longer than 40 years.

Darwinian theory off the mark

In 1966 Irwin Stone, a chemical engineer, theorized that the mutation of the gulonolactone oxidase enzyme in humans had been part of human evolution. There had been a branching of the Prosimii and the Anthropoidea orders of monkeys. The Anthropoidea developed the inactive gene for vitamin C and that branch evolved into humans.6

Health writer Jack Challem in 1997 also hypothesized that either a virus or free radical attack caused the genetic defect that disabled the enzyme necessary for vitamin C synthesis and that this in turn led to mutations that propelled the evolution of monkeys to humans.7

Of course, mutations are destructive and regressive, not progressive advancements of the genome. According to Darwin's theory of natural selection (survival of the fittest), humans or monkeys who could not produce their own vitamin C would have been less likely to thrive. These theories are based upon suppositions which depict branch-like evolution from simple life forms to monkeys and finally humans. With no intermediate species (no missing links), these evolutionary tree charts are still nothing more than cartoons.

According to paleontologists, early humans lived for about 40 years, and only in recent times has the human life span dramatically shifted upwards. Yet we have the evidence that humans at one time in the past made their own vitamin C. What evidence do we have, if any, that humans at some time in the past lived for hundreds of years?

When did humans live for hundreds of years?

An examination of the historical records of the Holy Bible reveals that Adam was recorded to have lived for 930 years (Genesis 5:5), and Noah for 950 years (Genesis 9:29). According to the Biblical record, the human gene pool was severely narrowed at that time, down to just eight members of Noah's family as gene carriers. Thereafter the recorded human life span slowly declined. After the Flood, Bible genealogies indicate Shem lived 600 years, followed by Arphaxad who lived 438 years, and so on through other generations down to Abraham, who lived 175 years, and finally to Moses, who lived 120 years (Deuteronomy 34:7). This pattern would fit the progressive mutation of the gulonolactone oxidase gene. Humans still house this gene; it is just defunct and is called a pseudogene. Thus the Biblical genealogies may not be far-fetched fairy tales.

Can the enzyme to produce vitamin C be re-installed in humans?

What if the gulonolactone oxidase gene could somehow be re-inserted into the human genome?

We know that guinea pigs, which lack gulonolactone oxidase, have been given this enzyme by injection and are able to survive on a diet deficient in vitamin C.8

Scientists have taken the gulonolactone oxidase DNA from rat liver and successfully transplanted it into the tomato genome.9 The gulonolactone oxidase gene has also been successfully transferred into a teleost fish (Oryzias latipes) via microinjection into fertilized fish eggs.10

With all of the widely heralded prospects for gene therapy, there hasn't been a peep about the feasibility of inserting the gulonolactone oxidase gene into the human genome. Yet the impact of such a development, if successful, would be monumental. Diabetes, blood vessel disease, cataracts and gallstones, to name a few age-related maladies, would be eradicated. The breakdown of collagen with advancing age would be slowed. The world human population jumped from 1.6 to 6.1 billion in the past century, 2 billion of that growth coming since 1960, largely from improvements in sanitation, food fortification and modern medicines.11 Imagine the social, political and medical ramifications if humans could live for hundreds of years.

Does vitamin C supplementation work?

As early as 1984 researchers knew that supplementation of drinking water with vitamin C increased the average life span of mice by as much as 20 percent.12

Nobel prize winner Dr. Linus Pauling suggested that humans supplement their diet continually throughout the day to mimic what the liver would make if the gene for the gulonolactone oxidase enzyme were still active. Dr. Pauling advocated supplementation with mineral ascorbates, the same alkaline form of vitamin C the liver produces in other mammals, rather than ascorbic acid, which can sometimes irritate the stomach and can even eat away tooth enamel.13

How much vitamin C should humans ingest? If you want to get all your vitamin C from foods, consumption of the recommended 5 to 7 servings of fruits and vegetables a day is likely to provide 200-250 milligrams.

A mouse makes about 275 milligrams of vitamin C per day per kilogram (2.2 lbs) of body weight. If a mouse weighed 154 pounds, about the weight of a human, this would amount to about 19,250 milligrams of vitamin C per day. A 160-pound goat produces about 13,000 milligrams per day, and more under stress. A dog or cat will produce about 40 milligrams of vitamin C per kilogram of body weight per day, or the equivalent of about 2,800 milligrams per day if these animals were the size of humans. So, using animals as a rule of thumb, humans may benefit from consumption of anywhere from 2,000 to 20,000 milligrams per day. The only common side effect of high-dose vitamin C is a transient diarrhea-like buildup of water in the lower bowel.

Government health authorities recommend only about 90 milligrams of vitamin C a day for adults, but that's just the minimum amount to prevent scurvy and promote general health, not to achieve optimal health and longevity. Studies indicate the vitamin C intake of Americans is around 110 milligrams per day, but adequate vitamin C status, even with food fortification, is still not guaranteed. According to one study, about 1-2 percent of college students exhibited true deficiency, and marginal deficiencies were found in an additional 12-16 percent of students.14
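The animal comparisons above are simple body-weight scalings. This sketch reproduces that arithmetic using the article's own per-kilogram figures (275 mg/kg for a mouse, 40 mg/kg for a dog or cat) and its 154-pound reference human; the figures themselves are not independently verified here.

```python
# Sketch of the body-weight scaling used in the paragraph above.
# The mg/kg/day synthesis rates are the article's figures.
LB_PER_KG = 2.2

def human_equivalent_mg(mg_per_kg_per_day, human_weight_lb=154):
    """Scale an animal's per-kg daily vitamin C synthesis to a human-sized body."""
    human_weight_kg = human_weight_lb / LB_PER_KG  # 154 lb is about 70 kg
    return mg_per_kg_per_day * human_weight_kg

print(round(human_equivalent_mg(275)))  # mouse rate scaled up: 19250 mg/day
print(round(human_equivalent_mg(40)))   # dog/cat rate scaled up: 2800 mg/day
```

These two endpoints are where the article's 2,000-20,000 milligram rule-of-thumb range comes from.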

Can vitamin C prolong life?

Is there any evidence that increased vitamin C consumption can prolong the human life span? A study of 11,000 Americans over 10 years revealed that individuals with the highest level of vitamin C intake, only about 300 milligrams a day, suffered 35 percent fewer deaths than those with the lowest intake, about 50 milligrams a day. This amounts to about six added years of life for those who consume higher levels of vitamin C. Since 300 milligrams of vitamin C is difficult to obtain from dietary sources alone, the primary group that exhibited increased life span was the vitamin C supplement users.15 A person would have to consume five oranges a day to get 300 milligrams of vitamin C from diet alone.

Other studies corroborate the link between vitamin C supplementation and longevity.

A study over a 12-16-year period showed that males with the highest blood serum levels of vitamin C experienced a 57 percent drop in their risk of dying from any cause compared to males with low circulating levels of vitamin C.16

Among men and women ages 45-79 years, just a 50 milligram increase in vitamin C consumption was able to reduce the relative all-cause mortality rate by 20 percent.17

Another study published in 2001 also confirms a 25-29 percent decreased all-cause mortality rate among adults with normal to high circulating levels of vitamin C.18

It is interesting to note that vitamin C acts as an agent in various models of anti-aging. Vitamin C would be a key antioxidant in the free radical theory of aging.19 Researchers have demonstrated that vitamin C slows telomere shortening by 52-62 percent in a controlled experiment.20 Telomeres are the end caps of DNA that shorten with each cell division and limit the number of times DNA can replicate.

Is high-dose vitamin C genotoxic?

However, amid all of this positive information about vitamin C, the news media recently chose to widely circulate a misleading test-tube study claiming that high-dose vitamin C is toxic to DNA and could cause cancer. The researchers recommended that vitamin C supplements be restricted to no more than 200 milligrams per day, and the report caused the public to pause temporarily over vitamin C supplements.21 However, the 200-milligram limit conflicts with government health authorities' recommendation of 5-7 servings of fruits and vegetables per day, which would likely provide more than 200 milligrams. Virtually all evidence from dietary studies confirms the health benefits of foods that provide high amounts of vitamin C.

An earlier study published in Nature indicated that 500 milligrams of vitamin C may damage DNA in lymphocytes, a type of white blood cell.22 However, other studies reveal that vitamin C actually protects lymphocytes against DNA damage, a protective effect that is greatly enhanced when accompanied by the bioflavonoids that usually accompany vitamin C in nature.23 Bioflavonoids are plant pigments commonly found in citrus, berries, grapes and tea leaves. The better store brands of buffered vitamin C powder (mineral ascorbates) include bioflavonoids. Furthermore, five subsequent human studies using high-dose vitamin C, up to 5,000 milligrams per day, could not find evidence that vitamin C induces gene mutations.24

Then there is the aforementioned evidence from the animal kingdom, where animals produce thousands of milligrams of vitamin C daily without evidence that this induces cancer. A modern mountain gorilla living in its natural habitat, which produces no vitamin C of its own, obtains about 4,500 milligrams of vitamin C per day from native foodstuffs.25 A 15-pound howler monkey takes in 600 milligrams of vitamin C per day, and an 18-pound spider monkey consumes about 744 milligrams per day.26 There is no evidence that these levels of vitamin C from dietary sources induce DNA mutations or cancer in these animals.

Furthermore, there are studies revealing significant health benefits for humans who consume vitamin C in excess of the newly established 90-milligram reference daily intake. For example, human studies reveal that 300 milligrams of daily vitamin C appears to reduce the risk of blinding cataracts, an otherwise inevitable consequence of aging, by 77-83 percent.27 A 500-milligram daily dose of vitamin C has been found to significantly reduce blood pressure among hypertensive patients who previously had to use prescription medications.28

Anyone interested in anti-aging should begin with vitamin C, the missing human hormone.

Part II: What we can learn about anti-aging from mynah birds, fruit flies and leeches

The number of Americans age 65 or older has increased tenfold in the last century, and the elderly are living longer. In 1900 there were about 3.1 million people over the age of 65 in the USA. Today there are more than 35 million retirees, and this figure is expected to leap to 70 million by the year 2030. In 1900, 65-year-olds could expect to live another 12 years; today they can expect to live an additional 18 years.29 There are even an estimated 35,000 centenarians living in the USA today. But these achievements in longevity have only whetted the human appetite for even greater prolongation of life. Yet, while Americans are living longer, the maximum human life span hasn't budged.

Throughout history females have outlived males, by about five to eight years on average. Some females living in the same environment and eating the same diet as their spouse will outlive their husbands by a couple of decades. Those who pursue anti-aging technologies often overlook this important point.

Estrogen effect is small

The idea that estrogen is responsible for the increased longevity of females has served as a distraction. The majority of the life extension attributed to estrogen, about 2.3 years, occurs only among women with coronary artery disease who take hormones. Estrogen replacement therapy adds only about 0.3 years of additional life for healthy women if begun at age 50.30 Furthermore, the decreased mortality rates seen with replacement hormones appear to wear off over time.31

Women control iron, and live longer

Women live longer because they are better designed to withstand the rigors of life. Women, being the baby carriers of the species, must be protected from disease for human life to skirt extinction. Women better control iron in their bodies, and thus outlast men.

During the growing years both males and females require iron to produce hemoglobin for red blood cells. Because the body is growing rapidly during youth and more blood volume is needed, there is little danger of iron overload. But once full growth has been achieved, around age 18, the demand for iron relaxes, and about one excess milligram of iron per day accumulates in the body thereafter. At this point females avoid iron overload by virtue of their monthly menstrual cycle: about 80 percent of the body's iron stores are in the red blood cells, and females lose about 30-60 milligrams of iron with each monthly cycle. Males, on the other hand, have no direct route for the disposal of excess iron, and by the age of 40 have as much stored iron as a 70-year-old female, about 5,000-7,000 milligrams of excess stored iron. A 40-year-old male will have twice the iron load of a female of the same age and will experience twice the rate of diabetes, cancer, heart disease and infections. Bacteria, viruses and fungi all utilize iron as a primary growth factor, so lower iron levels in females also protect them from infection.32
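The iron-balance arithmetic above can be sketched with a simple constant-rate model: roughly 1 mg of excess iron retained per day after growth ends, versus 30-60 mg lost per menstrual cycle. This is an illustration only; on this crude model the male figure comes out near 8,000 mg by age 40, in the general neighborhood of the 5,000-7,000 mg range cited above, since real absorption rates are not constant.

```python
# Sketch of the iron-balance arithmetic described above.
# Figures are the article's estimates; actual absorption varies widely.
EXCESS_MG_PER_DAY = 1     # net iron retained daily after growth ends
MENSTRUAL_LOSS_MG = 45    # midpoint of the 30-60 mg lost per monthly cycle

def male_iron_store_mg(age, growth_ends=18):
    """Cumulative excess iron for a male with no regular outlet for iron."""
    days = max(0, age - growth_ends) * 365
    return days * EXCESS_MG_PER_DAY

def female_iron_store_mg(age, growth_ends=18):
    """Cumulative excess iron for a menstruating female: monthly loss
    offsets the ~30 mg retained per month, so stores stay near zero."""
    months = max(0, age - growth_ends) * 12
    retained = months * (30 * EXCESS_MG_PER_DAY - MENSTRUAL_LOSS_MG)
    return max(0, retained)

print(male_iron_store_mg(40))    # 8030 mg by age 40
print(female_iron_store_mg(40))  # 0 mg while monthly cycles continue
```

The point of the sketch is the asymmetry: with no outlet, the male running total only grows, while the female total is held near zero until menstruation stops.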

Females who have undergone early hysterectomy or who have entered menopause lose this control of iron and begin to experience the same rate of disease as males. At age 45, females have an advantage of about 5-8 more remaining years of life than males. But at age 80 this advantage shrinks to just two years, because by then both sexes have lost any direct outlet for iron.

Blood loss is one method of controlling stored iron levels. For example, full-grown males, or females who no longer control iron via monthly blood loss, who regularly donate blood are healthier. Even bloodletting, practiced long ago, is returning to conventional medicine as a treatment for Alzheimer's disease, Parkinson's disease, cancer and diabetes. Blood-sucking leeches could theoretically protect against age-related iron overload and thus promote longevity, though bloodletting is currently the preferred method.33

Calorie or iron restriction?

Iron is the single most important factor in the control of aging. Yet anti-aging researchers recognize calorie restriction as the only proven method of slowing the aging process. However, the calorie-restriction model of anti-aging has been proven only in rodents to date; it will take another 10-15 years to determine whether it also works in larger animals such as monkeys.34 According to the data at hand, it would take a 30 percent reduction in calories over a human lifespan to significantly slow aging in humans. Calorie restriction lowers body temperature, reduces cholesterol, triglycerides and blood pressure, elevates HDL cholesterol and reduces artery stiffness.35 But there is more to the story.

Studies of fruit flies (Drosophila melanogaster) may help explain the supremacy of iron control in the aging process. Fruit flies are often used in aging studies because of their short life span, about 50-70 days. Insects have inborn mechanisms to control iron similar to those in humans: they control iron with iron-binding proteins (ferritin, transferrin).36 Excessive iron has been found to be an initiator of aging in fruit flies.37

Researchers at the University of Texas Health Science Center in San Antonio measured the aging rate of mice at 6, 12 and 24 months by determining the level of oxidation (rusting of tissues) in various organs. The more food the animals consumed, the more iron accumulated in their tissues and the greater the oxidation (aging). As early as 1985, researchers proposed that the rate of age-related iron accumulation correlates with the life span of some species. The accumulation of iron in these rodents did not occur until growth had been completed; after that, iron levels increased by 140 percent in the liver and 44 percent in the kidney. The liver and brain experience the greatest iron buildup with advancing age. An iron-restricted diet minimized oxidation levels in the liver, kidney and brain with advancing age.38

The life span of the fruit fly has been found to be inversely related to the iron content of its diet, and life-span findings in mice and the fruit fly have been correlated with humans. Consumption of tea extracts, which bind iron and inhibit its absorption, has been found to inhibit the age-related accumulation of iron and to prolong life in the fruit fly by as much as 21 percent.39 So calorie restriction may not be the only way to prolong human life.

Progressive iron overload is universal

Another problem in understanding the iron-overload model of aging is that iron overload is often characterized as a genetically acquired disease rather than a universal aging factor. Iron overload is called a disease, hemochromatosis, and it is mistakenly believed to affect only about 1 million Americans. It is usually diagnosed between the ages of 40 and 60; the growing years and menstruation usually mask the problem until middle age. But iron overload is not just a genetic disease. Furthermore, since fertile females and growing children are often deficient in iron (anemia) due to their high demands for this mineral, nutritionists have often overemphasized the problem of anemia without warning full-grown males, and females who no longer have monthly cycles, of the dangers of iron overload.40

For example, adult males should not eat a bowl of Total cereal (General Mills) for breakfast, which provides a full 18 milligrams of iron, 100 percent of the daily requirement, per serving. Do you know any males who eat just one bowl of cereal? Many consume a couple of bowls for breakfast and, through meat eating, may get three to four times more iron than they need on a daily basis. There is no warning on the label for full-grown males to steer clear of Total cereal.

It's not that iron should be totally avoided; the body needs some replacement iron to make red blood cells. Dietary iron is OK, even from meat. Only 5 percent of the iron in plant foods is available, versus 30 to 50 percent of the iron from meat.41 Meat is not the culprit; it is the lack of iron-controlling molecules in the diet that makes iron a rusting agent (see below). A little meat is needed to keep us from becoming anemic, the flip side of iron nutrition. Supplemental iron, however, should be avoided by full-grown males and by females who no longer have monthly cycles, yet iron tablets in stores carry no warning label for these groups. Ironically, females, who have higher iron needs in their fertile years, may crave iron-rich foods such as meats. If the women in a household do the cooking and prepare foods to meet their own nutritional needs, loading up on iron-rich meat, they may hasten the demise of their male spouse.

Humans consume about 30 milligrams of iron per day from the diet, but only a small portion is absorbed. The addition of alcohol to a meal, however, greatly increases the absorption of iron, and iron overload disease is strongly associated with alcohol consumption. Compared to alcoholic spirits, red wine is low in iron and contains iron-binding pigments from grapes, which may explain some of the health benefits attributed to wine.42

Mynah birds and longevity

Any animal or human can develop iron overload disease. The study of the mynah bird reveals some of the secrets of iron and longevity. Iron overload in mynah birds has been likened to iron overload disease (hemochromatosis) in humans.43 While some mynah birds live in the wild for 20 years or more, mynah birds in captivity often die early of iron storage disease, usually by the age of 10. This problem is attributed to iron-rich bird feed. [www.mynadbird.com]

Mynah birds in the wild are fruit eaters, and their favorite fruit is figs. Figs are relatively high in iron, yet wild mynah birds don't develop iron overload, because figs are also high in iron-binding pigments (tannins) that bind iron and render it harmless. Starlings are similar to mynah birds in that they too are fruit eaters. When starlings are fed a diet high in iron along with natural iron binders (tannins) such as those found in figs, tea and grapes (wine), the starlings don't accumulate iron in their livers and none of the birds develop iron overload.44 Tannins are potent binders (chelators) of iron.45

Iron-binding plant pigments have also been shown to extend the life span of the fruit fly. In one study, the survival rate of fruit flies exposed to an herbicide (paraquat) was about 56 percent, and when iron was added the survival rate dropped even lower. But when the herbicide was fed to the fruit flies along with an iron binder commonly found in green tea (catechin), the survival rate jumped to 78-87 percent.46

Most health practices control iron

Most of the health-promoting practices of modern life unknowingly control iron. For example, taking an aspirin a day to prevent heart attacks and strokes causes blood loss via the digestive tract on the order of about a tablespoon per day, which results in iron loss.47 Aspirin also appears to increase the production of ferritin, an iron-binding protein produced in the liver that prevents iron from inducing oxidation.48 Exercise, too, causes the loss of about 1 milligram of iron through sweat.49

Fasting and vegetarian diets, both of which promote longevity in animals and humans, limit iron consumption because plant foods provide non-heme iron which is poorly absorbed.

According to one study, when you sit down to eat a meal consisting of a hamburger, string beans and mashed potatoes, the addition of coffee, which contains iron-binding pigments, will reduce iron absorption by 35 percent. Green tea will reduce iron absorption even further, by 62 percent.50
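The 35 percent (coffee) and 62 percent (green tea) reductions above can be turned into a concrete illustration. In the sketch below, the reduction percentages are the article's figures, while the 10 percent baseline absorption fraction and the 6 mg of iron in the meal are assumptions added purely for illustration.

```python
# Illustrative sketch of the absorption reductions cited above.
# Assumptions (not from the article): 10% baseline iron absorption,
# 6 mg of iron in the example meal.
BASELINE_ABSORPTION = 0.10

def iron_absorbed_mg(meal_iron_mg, inhibition=0.0, baseline=BASELINE_ABSORPTION):
    """Iron absorbed from a meal, reduced by an iron-binding beverage."""
    return meal_iron_mg * baseline * (1 - inhibition)

meal = 6.0  # mg of iron in the meal (hypothetical)
print(iron_absorbed_mg(meal))        # no beverage:   0.6 mg absorbed
print(iron_absorbed_mg(meal, 0.35))  # with coffee:   0.39 mg absorbed
print(iron_absorbed_mg(meal, 0.62))  # with green tea: ~0.23 mg absorbed
```

Whatever the true baseline, the relative effect is the same: a cup of green tea with a meal cuts the absorbed iron to roughly a third of what it would otherwise be.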

While vitamin C increases iron absorption,51 there is no evidence that vitamin C leads to iron overload. Thus vitamin C should not be avoided by meat-eaters for this reason, since studies show that high-dose vitamin C supplements are associated with a decreased risk of heart disease, cancer, cataracts and other disorders.52 A vegetarian diet does not generally cause iron-deficiency anemia because plant-food diets contain more vitamin C, which enhances iron absorption.53

Bind that iron

Grapes contain a relatively large amount of iron, but you don't see any rusty grapes. The reason is that grapes, like the figs that mynah birds eat, contain iron-binding pigments that bind tightly to iron. In healthy individuals there is little if any unbound iron circulating in the blood. In disease states, however, unbound iron (also called free iron) is released at sites of inflammation, tumors and infection, where it can spark uncontrolled oxidation (rusting) and tissue destruction.54

Fortunately, there are numerous automatic mechanisms in the body that help to control iron. For instance, melanin is an iron-binding pigment in the skin, and the liver makes binding proteins called ferritin, transferrin and lactoferrin to bind iron as it enters the circulatory system.55

The diet also provides some potent iron binders. Iron-binding pigments found in berries, coffee, green tea, pine bark, onions and the rind of citrus fruits, and phytic acid (a component of whole grains and seeds such as sesame and rice bran) bind to iron and other minerals in the gastric tract and help to limit iron availability.

If bioflavonoids and phytic acid haven't bound to minerals in the digestive tract, they will enter the bloodstream, where they can bind to free iron, acting as blood-cleansing iron chelators. Therefore, maximum iron chelation in the blood circulation is achieved when these iron binders are consumed apart from meals.

How to remove (chelate) excess iron (rust) from the body

The question is: what can adult males, or females who have not menstruated for years, do to remove excess iron from their body stores? What is needed is chelation therapy, the removal of the excess iron. Alternative medical specialists offer chelation therapy via the intravenous administration of EDTA, a mineral chelator. Intravenous chelation therapy requires many treatments, perhaps 30 or 40, and is somewhat costly ($3,000-4,000). Conventional medicine also has a mineral-chelating drug, desferrioxamine, but it is sparingly used because of side effects.

Nature's most potent rust remover is phytic acid, commonly found in whole grains, seeds and nuts. Phytic acid, also called inositol hexaphosphate or IP6, is composed of one molecule of inositol with six phosphate groups attached. IP6 is available as a food supplement extracted from rice bran (Tsuno Foods & Rice Co., Wakayama, Japan). Bran cereal contains some IP6, but it is already bound to minerals; the IP6 extract imported from Japan is 70 percent unbound, ready to selectively chelate (attach to) minerals as it enters the human circulatory system.56 IP6 doesn't strip minerals from bones or remove other needed minerals; it just removes the free, unbound iron, copper and calcium, along with heavy metals such as mercury, lead and cadmium. IP6 has little or no affinity for sodium, potassium and magnesium, the important electrolyte minerals required for proper heart rhythm. Taken between meals with water, IP6 can rid the body of excessive iron and other minerals in a short period of time, 30-90 days. Once bound to IP6, the excess minerals are excreted via the urinary flow. IP6 rice bran extract is an unheralded but potent anti-aging therapy.

The iron stores in your body will control the severity of disease and longevity. Learning how to control iron is a major, if not the primary, anti-aging factor in living organisms. The pursuit of long life requires the control of iron.

Cowley G, Church V, Live longer with vitamin C, Newsweek May 18, 1992 and Enstrom JE, et al, Vitamin C intake and mortality among a sample of the United States population, Epidemiology 3: 194-202, 1992.

Comment: On a paleo diet, you can get all vitamins and minerals from organ meats and animal sources. In fact, you get what your body really needs from eating meat to an even greater extent than from eating vegetables, including the most crucial vitamins. The exception, perhaps, is vitamin C. To be precise, there is a very small amount of vitamin C in animal foods. But perhaps that is really enough? Consider the following:

Vitamin C is needed to hydroxylate the amino acids lysine and proline into hydroxylysine and hydroxyproline, the building blocks of connective tissue. That is why scurvy is characterized by a degeneration of connective tissue. However, unknown to most, red meat already contains hydroxylysine and hydroxyproline, which are absorbed into the bloodstream when eaten. Thus, less vitamin C is needed to hydroxylate proline and lysine, because they are already present in the blood in the hydroxylated state.

However, one might ask about the role vitamin C plays in antioxidant function. Sure, a purely carnivorous diet will prevent scurvy, but will it replace the other biochemical functions of vitamin C? While those who regularly consume liver and brains do not need to concern themselves too much here, what about those carnivores who consume primarily muscle meat and eggs? They may be free of scurvy, but are there other unseen health effects, such as excessive free-radical damage from lack of vitamin C?

First, a ketogenic metabolism produces fewer free radicals than a carbohydrate-burning metabolism. Second, numerous other substances, both endogenous and dietary, act as antioxidants on a purely carnivorous diet. But what if this is insufficient? Should meat-and-egg eaters be worried?

Fortunately, the answer is no. And the answer may lie in uric acid.

Uric acid is derived from the purines in meat; it is the final metabolic end-product of purine compounds. This is because the genes encoding the enzyme uricase, needed to break down uric acid, have been absent from primate DNA for millions of years.

What makes ascorbate useful as a molecule is its property of being a strong electron donor. Uric acid is also a strong electron donor (1); in fact, it may be an even better electron donor than vitamin C (2). Because of this, uric acid is a powerful antioxidant, similar to vitamin C. Thus, it follows that the loss of the enzyme uricase, and the consequent increase in blood levels of uric acid in primates, has probably provided a substitute for ascorbate in certain biochemical functions, including antioxidant activity.

Since meat is rich in purines, uric acid is inevitably abundant in the bloodstream of someone who consumes a large amount of muscle meat and organ products. Conclusion: even if a human carnivore does not consume vitamin C-containing animal products, a purely carnivorous diet is still sufficient to supply the biochemical functions that vitamin C is normally responsible for.

In humans and higher primates, uric acid is the final oxidation (breakdown) product of purine metabolism and is excreted in urine. In most other mammals, the enzyme uricase further oxidizes uric acid to allantoin.[8] The loss of uricase in higher primates parallels the similar loss of the ability to synthesize ascorbic acid, leading to the suggestion that urate may partially substitute for ascorbate in such species.[9] Both uric acid and ascorbic acid are strong reducing agents (electron donors) and potent antioxidants. In humans, over half the antioxidant capacity of blood plasma comes from uric acid.[10] The Dalmatian dog has a genetic defect in uric acid uptake by the liver and kidneys, resulting in decreased conversion to allantoin, so this breed excretes uric acid, and not allantoin, in the urine.[11]

Uric acid is created when the body breaks down purine nucleotides. Purines are found in high concentration in meat and meat products, especially internal organs such as liver and kidney.

In addition, vitamin C uses the same transporters as glucose to enter the cell. So if we have eaten a lot of simple sugars, as in sweets and soft drinks, or complex carbohydrates, which are broken down to glucose during digestion, less vitamin C can be absorbed, because the transporters are already in use.

So it seems that the mutation happened in a population that was already eating meat, especially organ meats. With a meat diet you receive far more nutrients than you will ever get from eating fruit. Indeed, one actually needs LESS vitamin C on a meat-and-fat diet.