Peter Frost's anthropology blog, with special reference to sexual selection and the evolution of skin, hair, and eye pigmentation

Thursday, June 11, 2009

Mad dogs and ....

How can vitamin-D deficiency exist despite lengthy sun exposure? This apparent paradox was raised in my last post. The medical community now recommends bloodstream vitamin D levels of at least 75-150 nmol/L, yet these levels are not reached by many tanned, outdoorsy people.

In a study from Hawaii, vitamin D status was assessed in 93 healthy young adults who were visibly tanned and averaged 22.4 hours per week of unprotected sun exposure, with 40% reporting no use of sunscreen. Yet their mean vitamin D level was 79 nmol/L and 51% had levels below the recommended minimum of 75 nmol/L (Binkley et al., 2007).

These results are consistent with those of a study from Nebraska. The subjects were thirty healthy men who had just completed a summer of outdoor activity, e.g., landscaping, construction, farming, and recreation. One subject used sunscreen regularly and sixteen others sometimes or rarely. Their mean vitamin D level was initially 122 nmol/L. By late winter, it had fallen to 74 nmol/L (Barger-Lux & Heaney, 2002).

A study from south India found levels below 50 nmol/L in 44% of the men and 70% of the women. The subjects are described as “agricultural workers starting their day at 0800 and working outdoors until 1700 with their face, chest, back, legs, arms, and forearms exposed to sunlight.” (Harinarayan et al., 2007).

These studies lead to two conclusions. First, sun exposure seems to produce vitamin D according to a law of diminishing returns: the more we expose ourselves to the sun, the less the vitamin D in our bloodstream increases. Perhaps frequent sun exposure results in less being produced in the skin and more being broken down in the liver. This might explain why intense sun exposure leads to a lower vitamin D level in Hawaiian subjects than in Nebraskans. In the latter group, vitamin D production may be ‘calibrated’ to provide a reserve for the winter months.

Second, to stay above the recommended minimum of 75-150 nmol/L, we must take supplements in the form of vitamin pills or fortified foods. Sun exposure is not enough. Yet even dietary supplementation seems to be countered by some unknown mechanism within the body:

… what effect does a 400 IU/d dose of vitamin D for an extended time (months) have in adults? The answer is little or nothing. At this dose in an adult, the circulating 25(OH)D concentration usually remains unchanged or declines. This was first shown in both adolescent girls and young women. … mothers who were vitamin D deficient at the beginning of their pregnancies were still deficient at term after receiving supplements of 800-1600 IU vitamin D/d throughout their pregnancies. (Hollis, 2005)

Only mega-doses can overcome what seems to be a homeostatic mechanism that keeps bloodstream vitamin D within a certain range. Indeed, this range falls below the one that is now recommended. Curious isn't it? Why would natural selection design us the wrong way?

Perhaps ancestral humans got additional vitamin D from some other source, such as the food they ate. In the diets of hunter/gatherers and early agriculturalists, fatty fish are clearly the best source, as seen when we rank the vitamin D content (IU per gram) of different foods (Loomis, 1967).

Yet fatty fish were unavailable to many ancestral humans, if not most. And again, when vitamin D enters the blood from our diet, it seems to be limited by the same homeostatic mechanism that limits entry of vitamin D from sun-exposed skin.

It looks like natural selection has aimed for an optimal vitamin D level substantially lower than the recommended minimum of 75-150 nmol/L. This in turn implies some kind of disadvantage above the optimal level. Indeed, Adams and Lee (1997) found evidence of vitamin D toxicity at levels as low as 140 nmol/L. But this evidence is ridiculed by Vieth (1999):

The report of Adams and Lee, together with its accompanying editorial, suggest that serum 25(OH)D concentrations as low as 140 nmol/L are harmful. This is alarmist. Are we to start avoiding the sun for fear of raising urine calcium or increasing bone resorption?

These side effects may or may not be serious. But there are others. High vitamin D intake is associated with brain lesions in elderly subjects, possibly as a result of vascular calcification (Payne et al., 2007). Genetically modified mice with high vitamin D levels show signs of premature aging: retarded growth, osteoporosis, atherosclerosis, ectopic calcification, immunological deficiency, skin and general organ atrophy, hypogonadism, and short lifespan (Tuohimaa, 2009). Vitamin D supplementation during infancy is associated with asthma and allergic conditions in adulthood (Hyppönen et al., 2004).

In this, vitamin-D proponents are guilty of some hypocrisy. They denounced the previous recommended level, saying it was just enough to prevent rickets while ignoring the possibility that less visible harm disappears only at higher intakes. Yet the current recommended level ignores the possibility that less visible harm appears below the level of vitamin D poisoning.

This being said, the pro-vitamin-D crowd may still be partly right. The optimal level might now exceed the one the human body naturally tends to maintain. With the shift to industrial processing of cereals, we today consume more phytic acid, which makes calcium unusable and thus increases the body’s need for vitamin D. We have, so to speak, entered a new adaptive landscape and our bodies have not had time to adapt.

Or they may be completely wrong. Frankly, I’m not reassured by the pro-vitamin-D literature. It strikes me as being rife with loosely interpreted facts, like the correlation between cancer rates and distance from the equator (and hence insufficient vitamin D). Cancer rates also correlate with the presence of manufacturing, which is concentrated at temperate latitudes for a number of historical and cultural reasons, notably the absence of slavery and plantation economies.

Then there’s this gem:

The concentrations of 25(OH)D observed today are arbitrary and based on contemporary cultural norms (clothing, sun avoidance, food choices, and legislation) and the range of vitamin D intakes being compared may not encompass what is natural or optimal for humans as a species (Vieth, 1999)

Actually, cultural norms are much more heliophilic today than during most of our past. In a wide range of traditional societies, people avoided the sun as much as possible, especially during the hours of peak UV (Frost, 2005, pp. 60-62). Midday was a time for staying in the shade, having the main meal, and taking a nap. Nor is there reason to believe that sun avoidance and clothing were absent among early modern humans. Upper Paleolithic sites have yielded plenty of eyed needles, awls, and other tools for making tight-fitting, tailored clothes (Hoffecker, 2002).

Heliophilia is the historical outlier, not heliophobia. It was the sunshine movement of the 1920s that first persuaded people to cast off hats, cut down shade trees, and lie on beaches for hours on end. This cultural revolution was still recent when Noël Coward wrote his 1931 piece ‘Mad Dogs and Englishmen’:

In tropical climes there are certain times of day
When all the citizens retire to tear their clothes off and perspire.
It's one of the rules that the greatest fools obey,
Because the sun is much too sultry
And one must avoid its ultry-violet ray.
The natives grieve when the white men leave their huts,
Because they're obviously, definitely nuts!
Mad dogs and Englishmen go out in the midday sun,
The Japanese don't care to, the Chinese wouldn't dare to,
Hindus and Argentines sleep firmly from twelve to one
But Englishmen detest a siesta.
In the Philippines there are lovely screens to protect you from the glare.
In the Malay States there are hats like plates which the Britishers won't wear.
At twelve noon the natives swoon and no further work is done,
But mad dogs and Englishmen go out in the midday sun.

20 comments:

Tod said...

I am not sure about industrial workers getting more cancer; that's the same kind of thinking that got people ingesting antioxidants. There might be positive effects of hormesis from carcinogen exposure that reduce cancers.(1) UVA may have a hormetic effect,(2) and it could be that the hormetic effect of ingesting vitamin D cancels out its negative effects, which would explain why a study found that taking vitamin D pills does nothing, for good or ill.

Vitamin D supplementation does no good after 7 years: "In the WHI CaD trial, supplementation did not have a statistically significant effect on mortality rates." 'Calcium Plus Vitamin D Supplementation and Mortality in Postmenopausal Women: The Women's Health Initiative Calcium–Vitamin D Randomized Controlled Trial'.(3)

In another study, natural levels were found to be not significant: 'Serum vitamin D and risk of prostate cancer in a case-control analysis' (2009). "No significant association was found between 25-hydroxyvitamin D and risk of prostate cancer (highest vs. lowest quintile: odds ratio = 1.28, 95% confidence interval: 0.88, 1.88; P for trend = 0.188). Subgroup analyses showed no significant heterogeneity by cancer stage or grade, age at diagnosis, body mass index, time from blood collection to diagnosis, or calcium intake. In summary, the results of this large nested case-control study provide no evidence in support of a protective effect of circulating concentrations of vitamin D on the risk of prostate cancer".(4)

But that was after a bit of statistical magic -

"Men with the highest blood levels of vitamin D were 28 percent more likely to develop prostate cancer than those with the lowest levels. However, this difference was not statistically significant. There was still no association between vitamin D levels and prostate cancer risk after adjusting for body mass index, age of diagnosis, cancer stage or grade, calcium intake and time from blood collection to diagnosis".

Here is a more alarming study: 'Both High and Low Levels of Blood Vitamin D Are Associated with a Higher Prostate Cancer Risk: A Longitudinal, Nested Case-Control Study in the Nordic Countries', Int. J. Cancer: 108, 104-108 (2004).

(The author accepted Vieth's criticism, I believe; see below.) The paper gives an interesting hypothesis on how high blood vitamin D could disregulate receptors and the active form in the cells. (Trevor Marshall's theory is similar, but Marshall is an electrical engineer by training.)

Vieth's explanation is wide-ranging and may account for higher rates of many cancers at northern latitudes, where there happen to be a lot of industrial workers.

"Winter at high latitudes produces a prolonged, gradual decline in 25(OH)D levels, and during this decline, autocrine 1,25(OH)2D cannot be maintained at its long-term setpoint. According to this hypothesis, high 25(OH)D concentrations are not problematic per se. Instead, it is the process of declining 25(OH)D concentrations that contributes to increased risk of prostate cancer.[...]

This hypothesis predicts that the U shaped curve of prostate cancer risk is distinct to high latitudes. It predicts that in regions where average 25(OH)D concentrations are higher and more constant throughout the year, greater risk of prostate cancer is associated only with low 25(OH)D levels".(5)

(1) 'Stress-Response Hormesis and Aging: "That which Does Not Kill Us Makes Us Stronger"', David Gems and Linda Partridge.

(2) 'Impact of UVA exposure on psychological parameters and circulating serotonin and melatonin.'

"The changes of circulating neuroendocrine mediators found after UVA exposure at T2 may be due to an UVA-induced effect via a cutaneous pathway. Nevertheless, the positive psychological effects observed in our study cannot be attributed to circulating serotonin or melatonin."

I remember reading a study that found a link between manufacturing and the incidence of cancer. The authors mapped U.S. cancer rates on a county-by-county basis and found a close fit with the presence of manufacturing, especially of the 'heavy industry' type.

I'm wary of latitudinal correlations where no effort is made to consider cultural and ethnic differences. There is, for instance, evidence that frequent masturbation during adolescence reduces one's likelihood of developing prostate cancer later in life. Since different cultures have different attitudes toward the sinfulness of masturbation, this factor would have to be controlled.

"the presence of manufacturing, especially of the 'heavy industry' type."

I was thinking of workers, but point taken; for someone who grew up around industry, being exposed to pollution from birth (and maybe before) would increase the likelihood of cancer.

Here's Dr. Vieth on human evolution: "Humans are optimized through evolution to be a tropical species. Annual cycles of fluctuating 25(OH)D concentrations are not physiologic"?!

Understanding how the system of vitamin D control in Europeans is designed to deal only with excess (nobody understands that better than Dr. Vieth) forces one to accept that the evolution of the 'D' system ceased when Europeans started to evolve.

"What about harm? Here you have to look at long-term harm, not acute-phase harm. Acute-phase harm is hypercalcemia. Acute-phase deficiency is rickets. We now know that optimal D levels are much higher than needed to prevent rickets. But what about long-term harm? Please look up Melamed et al., Archives of Internal Medicine, 8/08. You have to get the whole article. They use the NHANES III database (15,500 Americans, 26 centers across the US), took their blood out of the freezer, and plotted all-cause mortality on the Y axis versus 25(OH)D levels on the X axis. All-cause mortality goes down as D goes up through the teens, 20s, 30s, 40s. It levels off in the mid-40s, THEN STARTS GOING UP AGAIN in the mid-50s. Oops... Perhaps I am not smarter than my skin."

"To estimate the circulating 25(OH)D concentrations prevalent in humans of the late Paleolithic period, we need to focus on people in sun-rich environments who regularly expose most of their skin surface to the sun".

[But modern humans have been living in northern Europe for 30,000 years]

"Since early human evolution occurred under UV-rich conditions, typical 25(OH)D concentrations were surely higher than 100 nmol/L. Levels like this are now seen only in lifeguards and farmers".

[Do farmers go about half naked?]

"This range of 25(OH)D concentration reflects an adult vitamin D input of 200-500 µg/day (Vieth, 1999). Since our genome was selected through evolution under these conditions, it should be evident that our biology was optimized for a vitamin D supply far higher than what we currently regard as normal."

[Exactly, the amount synthesised from UVB was limited by natural selection because there was a potentially massive excess of UV in Africa. As humans moved further away from the equator and wore clothes a potential excess was still being dealt with; that's why the limit hasn't changed despite all the other changes since leaving Africa.]
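The doses quoted in this exchange switch between IU (400 IU/d, 800-1600 IU/d, 10,000 IU) and micrograms (Vieth's 200-500 µg/day). A quick sketch using the standard conversion of 40 IU per microgram of vitamin D shows how these figures line up; the function names are my own, only the conversion factor comes from standard nutrition references.

```python
# Standard conversion for vitamin D: 1 microgram = 40 IU.
IU_PER_MICROGRAM = 40

def iu_to_micrograms(iu):
    """Convert a vitamin D dose from IU to micrograms."""
    return iu / IU_PER_MICROGRAM

def micrograms_to_iu(micrograms):
    """Convert a vitamin D dose from micrograms to IU."""
    return micrograms * IU_PER_MICROGRAM

# The 400 IU/d dose Hollis calls ineffective is only 10 micrograms/day,
# while Vieth's estimated ancestral input of 200-500 micrograms/day
# works out to 8,000-20,000 IU/day.
print(iu_to_micrograms(400))                          # 10.0
print(micrograms_to_iu(200), micrograms_to_iu(500))   # 8000 20000
```

On this arithmetic, the supplement doses discussed in the post are one to two orders of magnitude below Vieth's estimate of ancestral intake, which is the gap the argument turns on.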

"Among White children living in Great Britain, rickets was observed in at least one third of children tested by all large public health studies reported between 1868 and 1935 (Harris, 1956)".

"Finally, this study was conducted following the Hawaiian equivalent of winter during which time there is reduced capability for cutaneous vitamin D production."

They then go on to say that 60 ng/ml is the maximum safe upper limit based on their own data. But they need to take the same measurements at the end of Hawaiian summer before they conclude that 60 ng/ml is the maximum attained by sun exposure.

In the Nebraska study, you find the high levels at the end of the summer; then they fall to low levels. It is natural for levels to fall throughout the dark months as the body uses vitamin D while producing little if any.

In the India study, you have people living largely on grains, which induce vitamin D deficiencies and bone mineral abnormalities in a wide variety of animals (1-3), including primates (4).

It appears that cereal grains have an ability to interfere with the entero-hepatic circulation of vitamin D or its metabolites (4,5), or to increase the rate of inactivation of vitamin D in the liver (6).

Which raises the question: what was the cereal and fiber intake of the people in Hawaii? And in Nebraska?

Your conclusion that only fatty fish supply substantial vitamin D is marred by reliance on data collected on animal products from animals raised in confinement operations. Calf liver is a particularly poor example since calves are raised for veal in a way that would prevent the animal from producing any vitamin D. An animal spending its time grazing outdoors will produce D and store it in liver and fatty tissues. You need assays of vitamin D on strictly pastured or wild animals to substantiate your suggestion that previous generations could not have gotten much D from foods. Dr. Weston Price found that foods of preagricultural tribes supplied substantial amounts of D (Nutrition and Physical Degeneration).

Regarding calcification side-effects, these may be signs of inadequate vitamin K2 in relation to vitamin D. People used to eat the meats, fats, and organs of grass-fed animals, which contain substantial amounts of K2 (particularly liver), and K2 prevents pathological calcification.

I will agree that heliophobia appears expressed more often historically than heliophilia. However, this may ignore context. The sun avoidance you note occurred among people who had substantial indirect and direct sun exposure in outdoor work. You say "they avoided the sun as much as possible." Maybe so, but how much avoidance is possible when you have outdoor work to do every day?

Above this claim you yourself quote the Harinarayan study : “agricultural workers starting their day at 0800 and working outdoors until 1700 with their face, chest, back, legs, arms, and forearms exposed to sunlight.” That's nine hours in the sun, through the mid-day, with much skin exposed. They might prefer to avoid the sun when possible, but the nature of their work forced exposure on a regular basis.

Which reminds me that in many cultures pale skin due to sun avoidance has been associated with wealth--not needing to do physical labor in the fields.

My experience also suggests people variably express heliophobia or heliophilia depending on their environment and season. I lived in Seattle for 7 years, during which I would say most people exhibited heliophilia -- we didn't have many sunny days, but when we did, people by and large wanted to be outside to enjoy it. Now I live in Phoenix, AZ, and by and large people exhibit more heliophobia.

I grew up in northern Ohio. By the end of the winter, most people would be complaining about the long dark months and when the brighter days arrived during the winter or in spring, many would exhibit heliophilic behavior. However, after a few months of sun, people would exhibit heliophobic behavior in the middle of the summer.

"The subjects are described as “agricultural workers starting their day at 0800 and working outdoors until 1700 with their face, chest, back, legs, arms, and forearms exposed to sunlight.” (Harinarayan et al., 2007)."

Binkley et al.: "this study was conducted following the Hawaiian equivalent of winter during which time there is reduced capability for cutaneous vitamin D production. Despite this limitation, given the low latitude of Hawaii, substantial UV exposure and, therefore, vitamin D production are possible year round (38)" - "The Boulder site had 83% lower vitamin D UV between July and January while the Hawaii site had only a 43% drop for the same time period".

Anyway, the main point is that the intensity or duration of UVB/sun exposure isn't the primary determinant of the vitamin D level attained in an individual (Holick, 1995; Vieth, 1999). Vitamin D synthesis shuts off after 20 minutes with full-body exposure (swim trunks), giving 10,000 IU. Black Africans take six times longer but make the same amount over a day. Black Africans, with 100,000 years of adaptation to extremely strong UV and full-body exposure year round, don't get any more vitamin D than northern Europeans, who have spent 30,000 years wearing clothes and living where the UV is never very strong and is seasonally absent for several months (Hoffecker, 2002).

Now why is that - coincidence? Or could it be that the 10,000 IU limit on synthesis continued to work fine because actual UVB, and hence potential vitamin D levels, were still excessive in northern Europe over a year (even though vitamin D had to be stored for the UVB-less period)?

The various mechanisms that determine vitamin D levels do not show a pattern of evolution towards maximization of those levels in north Europeans compared to Africans.

* the skin synthesis limit

* the system of Vitamin D enzyme kinetics and control ("remarkably inefficient" says Vieth)

* Genetic polymorphisms of the vitamin D binding protein (refs 1, 2). (Circulating 25(OH)D concentrations are strongly related to DBP polymorphisms.) SNPs in the DBP are just as variable in Europeans.

Lifeguards wear fewer clothes than farmers, yet their vitamin D levels are comparable. This is further evidence of a cut-off point that stops serum vitamin D levels from being directly proportional to UVB exposure over time.

Phytates reduce the amount of calcium absorbed and lower vitamin D. In those who have a poor diet (such as vegans, or the Victorian working class) phytate consumption can result in rickets.(3) All very true.

European Paleolithic hunters ate more meat than anyone (Frost, 2008, p. 4 - hunting bands of the continental Arctic). Dr. Cordain's ideas make a lot of sense. Evolution over a thousand generations at northern latitudes adapted Europeans to attain certain optimum vitamin D levels, which were reached by sun exposure while wearing clothes. Eating lots of meat is not unnatural; taking vitamin D pills in the overwhelming doses required to raise serum levels is.

Even in winter, Hawaii receives high levels of UV, certainly enough for production of vitamin D:

"Hawaii’s UV index is higher than any location in the U.S. with an average index of 6-7 in the winter and 11-12 for summer months."

http://www.hawaiigaga.com/HealthGuide.aspx

I cited the Nebraska study to show that even intense sun exposure is not enough to maintain the minimum recommended level of 75 nmol/L over the year. Nor is it enough to attain the recommended optimal level of 150 nmol/L even at the end of summer.

As I understand the literature on the subject, a high cereal diet depletes the body's supply of usable calcium and thus induces a higher level of vitamin D in the blood. So the south Indian subjects would have even lower vitamin D levels in the absence of a high cereal diet.

I can't comment on your statement that free-roaming cattle should have much higher levels of vitamin D in their livers. Perhaps they do. I'd like to see some evidence for such a claim.

Only mega-doses can overcome what seems to be a homeostatic mechanism that keeps bloodstream vitamin D within a certain range. Indeed, this range falls below the one that is now recommended. Curious isn't it? Why would natural selection design us the wrong way?

I wonder if you have seen this paper: http://autoimmunityresearch.org/transcripts/AR-Albert-VitD.pdf

J. Cannell, M.D. (of the Vitamin D Council) made a comment on the Binkley et al. study in an earlier issue of his newsletter.

He wrote that low vitamin D levels despite high sun exposure might be due to showering with soap in the hours following exposure. Soap washes off the oily surface layer on the skin in which a large proportion of vitamin D is made and temporarily stored.

Needless to say, prehistoric man did not use soap. Therefore, according to Cannell, it is not possible to estimate the optimal level for vitamin D based on high sun exposure combined with modern full-body soap use. To know, we would need a study in which participants were not allowed to shower with soap for a month or so.


Clearly you have not done your research. The question posed is a valid one but the answers are far from straight forward and have puzzled researchers for a long time.

If 40% of the Hawaiian participants didn't use sunscreen, it means 60% DID. This could significantly skew the results, making them lower than they would be for the other 40%.

Nebraska is above 40 degrees latitude. Many studies have shown that anything above 31 degrees latitude will result in no vitamin D production for most of the day during winter and only marginal production in summer.

It gets even more complicated when one considers how vit D production actually takes place. Levels will continue to rise for up to 24 hours after exposure. Sunbathers may think they rank high in production but many will likely go for a swim afterward or take a shower a bit later. Tests have shown that even normal water will wash away significant amounts of vit D from the skin.

As if this were not complicated enough, after about 20-30 minutes of sun exposure the sun starts to break down the vitamin D in the skin, until as much is destroyed as is created. You may think you are getting a whole day's super dose, but in reality you are only increasing your sun exposure and tan with no real benefit. The best exposure is about 20 minutes at a time over a large area every day, without washing it off directly afterwards. Ironically, the best times are at midday, exactly the times we have been told to avoid the sun.

Lifeguards regularly test with levels at 100 ng/ml with no ill effects so there must be some correlation between the sun and high levels. Signs of adverse effects have only been shown to appear at ~150 ng/ml and toxicity only between 200-300 ng/ml. Overdosing is not so easy at standard doses and to date I have yet to see one such verified case.

It is not necessarily true that mega doses are required. Even 2,000 IU/day will eventually get most people to the optimum level. The exception is people with malabsorption issues. There is also a difference between vit D2 and D3. D2 is up to 5 times less effective than the natural D3 our bodies make and is also much more prone to show signs of toxicity even though blood levels may remain normal.

One researcher has put it like this: we take a bunch of people who are apparently "healthy" from a population that's vitamin D deficient and use them to arrive at an "optimal" level. It's really a crazy way to do things, and no other blood test uses this method.

Many studies have shown that levels above the standard optimum correlate with decreased incidence of the serious common cancers, even deadly melanoma skin cancer. The situation is the same with cardiovascular diseases. Mild supplementation has been shown to reduce osteoporosis, including falls and fractures.

There are many studies worth looking at as a whole instead of picking one or two that are often shown to be flawed or set up for failure. The health benefits of high levels are real. The question we should now ask is what changes we have made in the last couple of centuries that has caused us to have these low levels.

No, the highest for a lifeguard (Hawaii) was 80 ng/ml; the average is 50 or 60 ng/ml. The world record for a vitamin D level from the sun is 90 ng/ml.

People do have levels of 100ng/ml nowadays but they're taking massive doses in pills which overwhelm the evolved homeostasis. 100ng/ml is an abnormally high concentration of vitamin D to have in one's blood; I'm pretty sure it's deleterious.
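The post reports serum levels in nmol/L while the comments use ng/mL. A minimal sketch of the standard conversion (about 2.5 nmol/L per ng/mL, from the molar mass of 25-hydroxyvitamin D, roughly 400.6 g/mol) makes the two sets of numbers comparable; the function names are illustrative, not from any cited source.

```python
# Standard conversion for serum 25(OH)D: 1 ng/mL is about 2.496 nmol/L.
NMOL_PER_NG = 2.496

def ng_to_nmol(ng_per_ml):
    """Convert a 25(OH)D level from ng/mL to nmol/L."""
    return ng_per_ml * NMOL_PER_NG

def nmol_to_ng(nmol_per_l):
    """Convert a 25(OH)D level from nmol/L to ng/mL."""
    return nmol_per_l / NMOL_PER_NG

# The post's recommended range of 75-150 nmol/L is roughly 30-60 ng/mL,
# so a lifeguard at 100 ng/mL is sitting at about 250 nmol/L.
print(round(nmol_to_ng(75)), round(nmol_to_ng(150)))  # 30 60
print(round(ng_to_nmol(100)))                         # 250
```

Seen this way, the 100 ng/mL figure debated above is well above even the upper end of the recommended range cited in the post.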

The study on asthma from supplementation of infants with vitamin D (Hyppönen et al., 2004) apparently used cod liver oil as the supplement. Cod liver oil also contains vitamin A, which, according to Dr. Cannell of the Vitamin D Council, interferes with vitamin D use and may have other damaging health effects in excess. So the conclusion drawn from that study may be questionable. See also: http://www.vitamindcouncil.org/newsletter/2008-december.shtml

However, I encourage you to continue researching the limits of vitamin D. I supplement with it myself, but I think it is important to have skeptics.

As implied in the comments, I also wonder whether some of the good effects of vitamin D supplementation at higher levels relate to compensating for a modern diet heavy with grains and even meat. So, if people eat more vegetables (and also get more vitamin K2, a cofactor), maybe their need for vitamin D decreases? I'd encourage you to explore those sorts of possible interactions.
