Testosterone replacement therapy (TRT) is associated with a lower risk of adverse cardiovascular (CV) events among men with testosterone deficiency, according to a new study.

Researchers led by T. Craig Cheetham, PharmD, MS, of the Southern California Permanente Medical Group, identified a retrospective cohort of 44,335 men aged 40 years and older with evidence of testosterone deficiency. The cohort included 8808 men who had ever been dispensed testosterone (ever-TRT group) and 35,527 men never dispensed testosterone (never-TRT group). The primary outcome was a composite of cardiovascular endpoints that included acute myocardial infarction (AMI), coronary revascularization, unstable angina, stroke, transient ischemic attack (TIA), and sudden cardiac death (SCD).

After a median follow-up of 3.2 years in the never-TRT group and 4.2 years in the ever-TRT group, the rates of the composite endpoint were significantly higher in the never-TRT than ever-TRT group (23.9 vs 16.9 per 1000 person-years), Dr Cheetham and colleagues reported online ahead of print in JAMA Internal Medicine. After adjusting for potential confounders, the ever-TRT group had a significant 33% lower risk of the primary outcome compared with the never-TRT group. The investigators found similar results when looking separately at combined cardiac events (AMI, SCD, unstable angina, coronary revascularization) and combined stroke events (stroke and TIA). The ever-TRT group had a significant 34% and 28% lower risk of cardiac events and stroke events compared with the never-TRT group, respectively.
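As a hedged illustration of the units reported above (the event counts and follow-up totals in this example are invented for illustration, not taken from the study), a rate "per 1000 person-years" is simply events divided by total follow-up time:

```python
# Hedged illustration of how incidence rates "per 1000 person-years"
# (the units reported in the study) are computed. The event count and
# follow-up total below are invented for the example, not study data.

def rate_per_1000_py(events: int, person_years: float) -> float:
    """Incidence rate expressed per 1000 person-years of follow-up."""
    return events / person_years * 1000

# e.g. 1690 events accumulated over 100,000 person-years of follow-up:
print(round(rate_per_1000_py(1690, 100_000), 1))  # 16.9
```

Person-years let studies compare groups with different follow-up durations, which matters here because the two groups were followed for different median times.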

“While these findings differ from those of recently published observational studies of TRT, they are consistent with other evidence of CV risk and the benefits of TRT in androgen-deficient men,” the investigators wrote.

Previous studies have found an association between low serum testosterone levels in aging men and an increased risk of coronary artery disease as well as an inverse relationship between serum testosterone and carotid intima thickness, Dr Cheetham’s team pointed out.

The never-TRT and ever-TRT groups had a mean age of 59.8 and 58.4 years, respectively. In the never-TRT group, 13,824 men (38.9%) were aged 40 to 55 years, 10,902 (30.7%) were aged 56 to 65 years, and 10,801 (30.4%) were older than 65 years. In the ever-TRT group, 3746 men (42.5%) were aged 40 to 55 years, 2899 (32.9%) were aged 56 to 65 years, and 2163 (24.6%) were older than 65 years. The groups were similar with respect to race and ethnicity composition.

With regard to study limitations, the investigators noted that their criterion for identifying men with testosterone deficiency (a diagnosis or at least 1 morning testosterone measurement) does not meet the strict criteria established by the Endocrine Society. “Therefore some individuals in the study could be misclassified as being androgen-deficient.” In addition, as the study was observational in design, “unmeasured confounding may have had an influence on the results; unmeasured confounders could possibly influence clinicians to selectively use testosterone in healthier patients.”

In an accompanying editorial, Eric Orwoll, MD, of Oregon Health & Science University in Portland, commented that while the study by Dr Cheetham’s group “provides reassuring data concerning the effects of testosterone on cardiovascular health, convincing answers about this question—and other safety issues like prostate health—remain elusive and will require large, prospective randomized trials.

“At this point, clinicians and their patients should remain aware that the cardiovascular risks and benefits of testosterone replacement in older hypogonadal men have not been adequately resolved.”

Estrogen Dominance: How It Is Ruining Your Health and Virility, and What You Need to Do to Prevent It

Estrogen dominance is often thought of as a female-only disorder, but men suffer from it as well, and overexposure to estrogen-like compounds (xenoestrogens) has made it increasingly common.
Understanding Male Estrogen Dominance

The healthy range of estradiol is between 15 and 60 pg/ml. When estradiol climbs higher than that, or when testosterone levels fall too low to balance out estrogen, estrogen dominance occurs.
Estrogen Dominance Symptoms

Estrogen dominance can cause: a loss of libido, an inability to get and/or maintain an erection, low sperm count, infertility, an inability to orgasm, and male breast enlargement.

Estrogen Dominance and Prostate Health

As estradiol levels climb, both prostate size and fibrous tissues increase. This makes it hard to urinate and increases the risk for prostate cancer and benign prostatic hyperplasia (BPH).
Four Main Causes of Estrogen Dominance in Men

Cause #1: Diet

Animal products are major estrogen dominance contributors. Non-organic produce, and processed foods made from it, can also contribute to estrogen dominance because such crops are grown with herbicides and pesticides that mimic estrogen.

Cause #2: Excess Body Weight

Fat tissue is rich in aromatase, an enzyme that converts testosterone into estradiol; the more fat you have, the higher your estradiol levels will be. Estrogen is also stored in fat cells, so if you’re overweight you’ll need to lose the excess fat to reverse estrogen dominance.

Cause #3: Caffeine and Alcohol

Caffeinated beverages are major estrogen dominance triggers. Alcohol is also problematic because the plants used to produce alcoholic beverages contain compounds that mimic estrogen in the body.
Cause #4: Tight Underwear

Tight underwear forces the testicles to be squeezed up against the body, which reduces the flow of blood to the testicles and causes them to overheat. These two factors lead to an increase in estradiol and a decrease in testosterone.

A genomic study of baldness identified more than 200 genetic regions involved in this common but potentially embarrassing condition. These genetic variants could be used to predict a man’s chance of severe hair loss. The study, led by Saskia Hagenaars and W. David Hill of The University of Edinburgh, United Kingdom, is published February 14th, 2017 in PLOS Genetics.

Before this new study, only a handful of genes related to baldness had been identified. The University of Edinburgh scientists examined genomic and health data from over 52,000 male participants of the UK Biobank, performing a genome-wide association study of baldness. They pinpointed 287 genetic regions linked to the condition. The researchers created a formula to try and predict the chance that a person will go bald, based on the presence or absence of certain genetic markers. Accurate predictions for an individual are still some way off, but the results can help to identify sub-groups of the population for which the risk of hair loss is much higher.

The study is the largest genetic analysis of male pattern baldness to date. Many of the identified genes are related to hair structure and development. They could provide possible targets for drug development to treat baldness or related conditions.

Saskia Hagenaars, a PhD student from The University of Edinburgh’s Centre for Cognitive Ageing and Cognitive Epidemiology, who jointly led the research, said: “We identified hundreds of new genetic signals. It was interesting to find that many of the genetic signals for male pattern baldness came from the X chromosome, which men inherit from their mothers.”

Dr David Hill, who co-led the research, said: “In this study, data were collected on hair loss pattern but not age of onset; we would expect to see an even stronger genetic signal if we were able to identify those with early-onset hair loss.”

The study’s principal investigator, Dr Riccardo Marioni, from The University of Edinburgh’s Centre for Genomic and Experimental Medicine, said: “We are still a long way from making an accurate prediction for an individual’s hair loss pattern. However, these results take us one step closer. The findings pave the way for an improved understanding of the genetic causes of hair loss.”

As 2016 draws to an end, I believe that a change is in the air. The dietary guidelines, or perhaps I should call them the ‘dietary misguidedlines’, are under a sustained attack. This, finally, may actually result in success. We will be able to move on from believing that fat, or saturated fat, in the diet is responsible for cardiovascular disease or, indeed, any form of disease.

But where to then? The current dogma is that saturated fat in the diet raises cholesterol levels and this, in turn, leads to cardiovascular disease. However, as many of you may have spotted earlier this year, in the Minnesota Coronary Experiment (MCE), substituting saturated fat with polyunsaturated fat was effective at lowering cholesterol levels. However, it had absolutely no effect on deaths from heart disease, and greatly increased the overall risk of death.

The low saturated fat group had a significant reduction in serum cholesterol compared with controls.

There was no evidence of benefit in the intervention group for coronary atherosclerosis or myocardial infarcts.

For every 0.78 mmol/l reduction in serum cholesterol [around a 20% reduction], there was a 22% higher risk of death.
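For readers more used to mg/dl, the 0.78 mmol/l figure can be converted with the standard cholesterol conversion factor of 38.67 mg/dl per mmol/l; a quick sketch:

```python
# Quick sketch converting the 0.78 mmol/l cholesterol reduction into
# mg/dl, using the standard conversion factor for cholesterol
# (38.67 mg/dl per mmol/l).

MGDL_PER_MMOLL = 38.67

def mmoll_to_mgdl(mmol_l: float) -> float:
    """Convert a cholesterol concentration from mmol/l to mg/dl."""
    return mmol_l * MGDL_PER_MMOLL

print(round(mmoll_to_mgdl(0.78), 1))  # 30.2, i.e. roughly 30 mg/dl
```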

Big deal, you might think. This is just one trial, so what difference does it make? However, this was no ordinary trial. It was absolutely pivotal, for four main reasons:

It was the largest controlled trial of its kind ever done; that is, substituting saturated with polyunsaturated fats.

It was done by Ancel Keys (who started the entire diet-heart hypothesis in the first place)

It was finished before the main clinical nutritional guidelines were developed

It was not published at the time, for reasons that have never been explained by anyone.

As the authors of the re-analysis note:

“Whatever the explanation for key MCE data not being published, there is growing recognition that incomplete publication of negative or inconclusive results can contribute to skewed research priorities and public health initiatives. Recovery of unpublished data can alter the balance of evidence and, in some instances, can lead to reversal of established policy or clinical practice positions.” 1

Which is a polite way of saying that a bunch of liars hid the results. Almost certainly because the results contradicted their self-promoted message that saturated fats are unhealthy. It is clear that these researchers, in particular Ancel Keys, did this quite deliberately, and then continued to promote their own dietary dogma.

I think it is almost impossible to overestimate the long-term impact of the non-publication of this trial.

For want of a nail the shoe was lost.

For want of a shoe the horse was lost.

For want of a horse the rider was lost.

For want of a rider the message was lost.

For want of a message the battle was lost.

For want of a battle the kingdom was lost.

And all for the want of a horseshoe nail.

Here is my updated version:

For want of the MCE trial evidence the McGovern hearings were lost

For want of the hearings the guidelines were lost

For want of the guidelines the message was lost

For want of the message the battle was lost

For want of the battle saturated fat was lost

All for the want of the MCE trial data.

The McGovern hearings, which set the entire direction of nutritional thinking and guidelines, took place in 1977. The MCE trial ran from 1968 to 1973. Had the data from this study been made available, the dietary guidelines in the US, the UK and the rest of the world (in their current form, demonising saturated fat) simply could not have been written.

If those guidelines had not been written, then the entire world of cardiovascular research would almost certainly have gone off in a different direction. The role of LDL in causing CVD would have been consigned to the dustbin of history. Goldstein and Brown wouldn’t have done their research on Familial Hypercholesterolaemia, statins would never have been developed, and we would not have been forced to endure fifty years of the damaging, destructive diet-heart/cholesterol hypothesis.

The fact that the diet-heart/cholesterol hypothesis is complete nonsense has been clear as day to many people for many years. In 1977 George Mann, a co-director of the Framingham Study, writing in the New England Journal of Medicine, called it ‘the greatest scam in the history of medicine.’ In my view, anyone with a moderately functioning brain can easily see that it is nonsense.

So, if not fat and cholesterol, what does cause cardiovascular disease, and more importantly, what can be done to prevent it, or at least delay it? At last (some of you are thinking) I will state what I believe to be one of the most important things you can do to reduce the risk.

Returning to the central process of cardiovascular disease (CVD), for a moment. If you are going to reduce the risk of cardiovascular disease, you must do, at least, one of three things:

Protect the endothelium from damage

Reduce the size and tenacity (difficulty of being broken down) of the blood clots that develop

If you can do all three, you will reduce your risk of dying of a heart attack, or stroke, to virtually zero.

What protects the endothelium?

There are many things that can do this, but the number one agent that protects the endothelium is nitric oxide (NO). Thus, anything that stimulates NO synthesis will be protective against CVD. Which brings us to sunshine and vitamin D.

Sunlight on the skin directly stimulates NO synthesis, which has been shown to reduce blood pressure, improve arterial elasticity, and a whole host of other beneficial things for your cardiovascular system, not least a reduction in blood clot formation.

Sunlight on the skin also creates vitamin D, which has a significant impact on NO synthesis in endothelial cells, alongside many other actions. It also protects against cancer, so you get a double benefit.

Therefore, my first piece of direct advice for those who want to prevent heart disease is to sunbathe. In the winter, when the sun is not shining, take vitamin D supplementation. Alternatively, go on holiday to somewhere sunny, or get a UVB sunbed and use it.

My only note of warning here: don’t burn. It is painful, and you don’t need to.

By the way, don’t worry about skin cancer. Sun exposure protects against all forms of cancer to a far greater degree than it may cause any specific cancer. To give you reassurance on this point, here is a Medscape article, quoting from a long-term Swedish study on sun exposure:

‘Nonsmokers who stayed out of the sun had a life expectancy similar to smokers who soaked up the most rays, according to researchers who studied nearly 30,000 Swedish women over 20 years.

This indicates that avoiding the sun “is a risk factor for death of a similar magnitude as smoking,” write the authors of the article, published March 21 in the Journal of Internal Medicine. Compared with those with the highest sun exposure, life expectancy for those who avoided sun dropped by 0.6 to 2.1 years.

Pelle Lindqvist, MD, of Karolinska University Hospital in Huddinge, Sweden, and colleagues found that women who seek out the sun were generally at lower risk for cardiovascular disease (CVD) and noncancer/non-CVD diseases such as diabetes, multiple sclerosis, and pulmonary diseases, than those who avoided sun exposure.

And one of the strengths of the study was that results were dose-specific — sunshine benefits went up with amount of exposure. The researchers acknowledge that longer life expectancy for sunbathers seems paradoxical to the common thinking that sun exposure increases risk for skin cancer.

“We did find an increased risk of…skin cancer. However, the skin cancers that occurred in those exposing themselves to the sun had better prognosis,” Dr Lindqvist said.”2

In short, avoiding the sun is as bad for you as smoking. In my opinion, ordering people to avoid the sun is possibly the single most dangerous and damaging piece of health prevention advice there has ever been. The sun has been up there, shining down, for over four billion years. Only very recently have we hidden from it. If you believe in evolution, you must also believe that sunshine provides significant health benefits. It cannot be otherwise.

Much of the $23 billion spent each year on statin drugs is really targeting the treatment of “high cholesterol” (actually unhealthy distortions in lipoproteins) created by consuming grains.

Most people, unfortunately, continue to focus on fat consumption, especially saturated fat, as the cause of high cholesterol, and have been led to believe that cutting saturated fat and taking statin drugs are the solution. So let me try to clear up this somewhat confusing issue and show you that 1) there is no real benefit to cutting saturated fat, 2) grains and sugars cause distortions that increase cardiovascular disease, and 3) statin drugs do not fully address the causes of cardiovascular disease, accounting for their relatively trivial benefits.

A typical “high cholesterol” panel shows lowish HDL cholesterol, high triglycerides, and high LDL and total cholesterol. What does this mean? Let’s take each, one by one. It’s a bit complex, but stick with it and you will emerge smarter than 95% of doctors who “treat” high cholesterol.

Triglycerides are the byproduct of two digestive processes: 1) De novo lipogenesis or the liver’s conversion of the amylopectin of grains and other sugars into triglyceride-rich VLDL particles that enter the bloodstream, and 2) absorption of dietary fats (which are triglycerides themselves). De novo lipogenesis dominates triglyceride levels in the bloodstream, far outstripping consumption of fat as a determinant of triglyceride levels. This simple fact was only identified recently, as the rise in triglycerides that occurs after consuming fats and oils develops within 2-4 hours, but the much larger rise in triglycerides from carbohydrate-to-triglyceride conversion starts 6-8 hours later, a fact not uncovered in older studies that failed to track this far out in time. (And, in certain genetic types, such as apo E2, the rise from carbohydrates in grains and sugars can last for days to weeks.)

LDL cholesterol is calculated, not measured. The Friedewald calculation, developed in 1972 to provide an easy but crude means of estimating the quantity of cholesterol in the low-density lipoprotein fraction of the blood, rests on several basic assumptions: 1) that everyone consumes an average diet of average macronutrient composition; 2) that the triglyceride content of lipoproteins is constant from person to person (it is not; it is wildly variable); and 3) that all LDL particles are the same (also not true, as LDL particles vary in size, conformation, surface characteristics, etc.).
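In mg/dl units, the Friedewald estimate subtracts HDL cholesterol and an assumed VLDL cholesterol (triglycerides divided by 5) from total cholesterol. A minimal sketch, including the conventional triglyceride cutoff above which the estimate is generally considered unreliable:

```python
# A minimal sketch of the Friedewald estimate (all values in mg/dl).
# The triglycerides/5 term is the assumed VLDL cholesterol content,
# which is exactly where the fixed-composition assumptions break down.

def friedewald_ldl(total_chol: float, hdl: float, triglycerides: float) -> float:
    """Estimate LDL cholesterol: total minus HDL minus assumed VLDL (TG/5)."""
    if triglycerides > 400:
        # The estimate is conventionally considered unreliable above
        # 400 mg/dl triglycerides.
        raise ValueError("Friedewald estimate invalid for triglycerides > 400 mg/dl")
    return total_chol - hdl - triglycerides / 5

print(friedewald_ldl(200, 50, 150))  # 120.0
```

The TG/5 term encodes the fixed-composition assumption: that every person’s VLDL carries the same ratio of triglyceride to cholesterol.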

Grain consumption, thanks to the process of de novo lipogenesis, increases blood levels of triglycerides and VLDL particles. VLDL particles interact with LDL particles, enriching LDL particle triglyceride content and reducing cholesterol content. This leads to a process of LDL particle “remodeling” that creates small LDL particles–glycation-prone, oxidizable, adherent to inflammatory blood cells, and persistent in the bloodstream for 7 days, rather than the 24 hours of more benign large LDL particles. Grains thereby trigger the process creating persistent and damaging small LDL particles; fats trigger the process that does not.

When we cut out grains and sugars, the Friedewald calculation is therefore no longer valid, as its assumptions (weak to begin with) are disrupted. LDL cholesterol, this crude, surrogate effort to indirectly quantify LDL particles, becomes completely useless: the calculation of LDL cholesterol is INVALID. This has not, unfortunately, dampened enthusiasm among my colleagues or the drug industry for treating this number with statin drugs to the tune of $23 billion per year.

Better ways to quantify LDL particles: NMR LDL particle number (which includes quantification of small and large LDL particles) or an apoprotein B measurement. (Each LDL particle contains one apo B, which thereby provides a virtual count of LDL particles, though no breakdown into small vs. large.) Lipoprotein testing has been around for over 20 years and is inexpensive and available, but it requires an informed doctor to interpret.

HDL cholesterol is, unlike LDL cholesterol, a measured and reliable value. Ironically, it is among the most ignored. Grain-consuming humans tend to have low HDL because the high triglyceride/VLDL particles interact in the bloodstream with HDL particles, enriching HDL particles in triglycerides and reducing cholesterol content. This leads to a reduction in HDL size and HDL quantity, thus low HDL cholesterol values. The lower the HDL, the higher the cardiovascular risk.

Given this mix of values, total cholesterol is essentially useless. A large increase in HDL, for instance (a GOOD thing), will raise total cholesterol; a large reduction in HDL (a BAD thing) will reduce total cholesterol: the opposite of what you would expect. Total cholesterol can indeed yield useful prognostic information when applied to a population, though the relationship is weak. But it is useless when applied to an individual.

If we reject the silly and simple-minded notions of cholesterol panels, and apply the greater insights provided by advanced lipoprotein analysis, several nutritional observations can be made.

You can begin to appreciate how overly simplistic this notion of “reducing cholesterol” using statin drugs really is. You can also appreciate that the real situation is a bit more complicated and beyond the reach of most busy primary care physicians, while being outside the interests of most cardiologists, obsessed as they are with revenue-producing activities like heart catheterizations, stents, and defibrillator implantation.

A typical cholesterol panel in someone who has eliminated all wheat, grains, and sugars shows these distortions reversing: triglycerides fall and HDL cholesterol rises.

LDL cholesterol, meanwhile, can do just about anything: go up, go down, remain unchanged. But it doesn’t matter, because it is inaccurate, unreliable, invalid. If you were to measure advanced lipoproteins, however, you would see a dramatic reduction or elimination of small LDL particles and a reduction in the total count of LDL particles (since the small LDL component has been reduced or eliminated), with large LDL particles remaining.

Common distortions of cholesterol panels can be easily explained by the chain of events that emerge from a diet rich in “healthy whole grains.” The relatively trivial benefits of statin cholesterol drugs (about a 1% reduction in real risk, not the inflated “relative risks” quoted in ads and statistically-manipulated studies) should come as no surprise, since high cholesterol is not the cause for cardiovascular disease.

Iron is essential for human life, as it is a key part of various proteins and enzymes, involved in the transport of oxygen and the regulation of cell growth and differentiation, among many other uses.

One of the most important roles of iron is in hemoglobin, the protein in red blood cells through which it binds oxygen and carries it throughout your tissues; without proper oxygenation, your cells quickly start dying.

If you have too little iron, you may experience fatigue, decreased immunity or iron-deficiency anemia, which can be serious if left untreated. This is common in children and premenopausal women.

But what many people fail to realize is that too much iron can be equally deadly, and is actually far more common than iron deficiency, thanks to a hereditary disease known as hemochromatosis.

This Health Issue Has Been of Major Importance to Me and My Family

This test saved my dad’s life 20 years ago, when I discovered he had a ferritin level close to 1000, caused by his beta-thalassemia. With regular phlebotomies his iron levels normalized, and now the only side effect he has is type 1 diabetes. The high iron levels damaged his pancreatic islet cells, leaving him with what is called “bronze” diabetes, which requires the use of insulin.

I also inherited this from him, so this is a personal issue. Thankfully, I am able to keep my iron levels normal by removing about a quart of blood a year, not all at once but over several smaller draws.

I screened all my patients with ferritin levels and noticed nearly one-fourth of them had elevated levels. So I would strongly encourage you and your family to be screened annually for this, as it is SO MUCH easier to prevent iron overload than it is to treat it.

Ferritin Screen – One of Your Most Important Health Tests

Checking your iron levels is easy and can be done with a simple blood test called a serum ferritin test. I believe this is one of the most important tests that everyone should have done on a regular basis as part of a preventive, proactive health screen.

The test measures ferritin, the iron-storage protein found inside cells that serves as iron’s carrier molecule. If your ferritin levels are low, it means your iron levels are also low.

The healthy range of serum ferritin lies between 20 and 80 ng/ml. Below 20 is a strong indicator that you are iron deficient, and above 80 suggests you have an iron surplus. The ideal range is between 40-60 ng/ml.

The higher the number climbs over 100, the worse the iron overload; levels over 300 are particularly toxic and will eventually cause serious damage in nearly everyone who sustains them long term. It’s important to find out if your levels are high because your body has a limited capacity to excrete iron, which means it can easily build up in organs like your liver, heart and pancreas. This is dangerous because iron is a potent oxidizer that can damage your body tissues, contributing to serious health issues, including:

Cirrhosis

Liver cancer

Cardiac arrhythmias

Type 1 diabetes

Alzheimer’s disease

Bacterial and viral infections

Cancer researchers have also found new evidence that bowel cancers are two to three times more likely to develop when dietary iron is too high in your body.1
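As an illustration only of the ferritin ranges described above (the category labels are my own shorthand, not clinical terminology):

```python
# Illustration only: the ferritin ranges described above (ng/ml),
# expressed as a simple lookup. The category labels are my own
# shorthand, not clinical terminology.

def ferritin_category(ferritin_ng_ml: float) -> str:
    if ferritin_ng_ml < 20:
        return "likely iron deficient"
    if ferritin_ng_ml <= 80:
        return "healthy range (ideal: 40-60)"
    if ferritin_ng_ml <= 300:
        return "iron surplus"
    return "particularly toxic range"

print(ferritin_category(50))   # healthy range (ideal: 40-60)
print(ferritin_category(350))  # particularly toxic range
```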

Risk Factors for Iron Overload

People with hemochromatosis are not the only ones who may accumulate more iron than is healthy. While premenopausal women who menstruate regularly rarely suffer from iron overload, most adult men and postmenopausal women tend to be at higher risk, as they have no monthly blood loss (bleeding is one of the best ways to get rid of excess iron).

Another common cause of excess iron is the regular consumption of alcohol, which will increase the absorption of any iron in your diet. For instance, if you drink wine with your steak, you will likely be absorbing more iron than you need. Other potential causes of high iron levels include:

Cooking in iron pots or pans. Cooking acidic foods in these types of pots or pans will cause even higher levels of iron absorption.

Eating processed food products like cereals and white breads that are “fortified” with iron. The iron used in these products is inorganic iron, not much different from rust, and it is far more dangerous than the iron in meat.

Drinking well water that is high in iron. The key here is to make sure you have some type of iron precipitator and/or a reverse osmosis water filter.

Taking multiple vitamins and mineral supplements, as both of these frequently have iron in them.

Could Reducing Your Iron Level Be a Safer Alternative to Statins?

We may have garnered some valuable information about how iron drives inflammation from studying statin drugs, of all things. Statins are, of course, cholesterol drugs, but they also have an anti-inflammatory effect on your body by reducing oxidative stress, which is something the drug companies tend not to disclose. The fact that statin drugs reduce inflammation, and reduce inflammatory markers like C-reactive protein, may explain why statins decrease heart attacks in some people. This benefit has nothing to do with lowering cholesterol, but rather with the reduction of inflammation.

In a study published in the April 2013 issue of American Journal of Public Health2, researchers found that statins improved cardiovascular outcomes at least partially by countering the proinflammatory effects of excess iron stores. In this study, the improved outcomes were associated with lower ferritin levels but not with “improved” lipid status. Researchers concluded iron reduction might be a safe and low-cost alternative to statins. An earlier study in the American Heart Journal3 also showed that people with a lower iron burden had less risk for heart attack and stroke.

These studies add credence to what I learned a few years ago from Dr. Steven Sinatra, one of the leading natural cardiologists in the world, that statins’ only health benefit is that of reducing inflammation.

This may be helpful for a small percentage of individuals who have a very high risk of dying from a heart attack, but NOT for those who simply have “high” cholesterol levels. Statin drugs are vastly overprescribed and are not worth the risk for the vast majority of you. In some cases, they may actually increase your risk of stroke. If elevated iron is the driving force behind your inflammation and cardiovascular disease, then it makes far more sense to simply reduce your iron level, as opposed to taking a statin drug that has the potential for many adverse effects.

What to Do if Your Iron Levels Are Too High

The good news, if you find out that your iron levels are elevated or you have hemochromatosis, is that remedying the condition is relatively simple. Some people advise using iron chelators like phytic acid or IP6, but I tried that with my dad and it failed miserably, so I would not recommend it. Donating your blood is a far safer, more effective and inexpensive approach to this problem.

If, for some reason, a blood donor center is unable to accept your blood for donation, you can obtain a prescription for therapeutic phlebotomy. At the same time, you will want to be sure to avoid consuming excess iron in the form of supplements, in your drinking water (well water), from iron cookware, or in fortified processed foods.

Additionally:

Certain phenolic-rich herbs and spices, such as green tea and rosemary, can reduce iron absorption4

The primary polyphenol in turmeric known as curcumin actually acts as an iron chelator, and in mice studies, diets supplemented with this spice extract exhibited a decline in levels of ferritin in the liver5

Astaxanthin, which has been researched to have over 100 potential health benefits, has been shown to reduce iron-induced oxidative damage6

The Ancient Origins of Iron Overload

How and why hemochromatosis – now one of the most prevalent genetic diseases in the United States – emerged is the subject of numerous theories and speculation, but its true history remains a complex mystery. In a fascinating article on the topic, The Atlantic7 recently highlighted the notion that everyone who inherited the C282Y mutation responsible for the majority of hemochromatosis cases got it from the same person. In other words, one distant ancestor passed on the mutation, which now predominates in people of Northern European descent.

No one knows the precise identity of the founder, but initial speculation that it was someone of Irish descent has given way to the possibility that it may have actually arisen in a Viking civilization or, even earlier, in a central European hunter-gatherer.

It takes two inherited copies of the mutation (one from the mother and one from the father) to cause the disease (and even then only some people will actually get sick). If you have just one mutation, you won’t become ill but you will absorb slightly more iron than the rest of the population, a trait that may have given people an advantage when dietary sources of iron were scarce.

Did the Hemochromatosis Mutation Emerge to Protect Humans from a Carb-Heavy Agricultural Diet?

There is speculation that the hemochromatosis mutation may have spread in ancient Europe around the time that man transitioned from hunter-gatherer to farmer. Unlike the Paleolithic diet of the “cavemen,” which by necessity included a relatively balanced diet of iron-rich meat, fish and plant foods, farming may have led humans to rely on an overabundance of grain carbohydrates. The featured article reported:

“Fossil evidence indicates early European farmers stood roughly 6 inches shorter than their hunter-gatherer ancestors, a possible indication of malnutrition… Average height and life expectancy fell, as bone infections, dental cavities, and skeletal malformations associated with anemia rose. While the exact composition of the Paleolithic plate remains debated, most agree that European hunter-gatherers ate more meat than those in modern farming communities. And this animal protein was an excellent source of one familiar micronutrient: iron.

The World Health Organization estimates that 1.6 billion persons worldwide currently suffer from the lack of red blood cells known as anemia — half of which may be caused by iron deficiency. One’s inner paleo might wonder whether this pandemic of iron deficiency began in the Neolithic era as diets bloated with carbohydrates replaced those rich in meat and fish.

Anemia decreases the oxygen carrying capacity of the blood; if marked, this will hinder an individual’s ability to stay healthy, find food, and reproduce. The C282Y mutation increases iron absorption, and it may have inadvertently protected carriers against this threat.”

The Hemochromatosis Link to the Plague

Another intriguing theory suggests that the hemochromatosis mutation may have protected against the Black Death of the 14th century, by preventing the Yersinia pestis bacteria from reproducing inside of human immune cells.

“During the Black Death, mortality may have been highest, up to 50-66 percent, in the British Isles — a future hotbed of hereditary hemochromatosis … In this most unsympathetic environment, minute DNA differences may decide survival or death. A genetic advantage would quickly spread through the island population — it would have less value on the mainland where plague mortality may have been lower,” the featured article reported.

But this theory is challenged by conflicting information suggesting that the plague bacterium uses iron from its host to enhance its ability to infect cells. People with hemochromatosis may therefore be among the most vulnerable to succumbing to plague infections, which suggests the mutation may have nothing to do with survival. It could, perhaps, be an artifact of natural selection, or there may be a different explanation entirely…

Written by: Colin Champ, MD

Several years back, a scientific article revealed that those of us with high “muscular strength” have a lower risk of dying from cancer – a 40% lower risk, to be exact.1 After assessing almost 9,000 men aged 20-82, scientists found that men with a stronger one-rep max on the bench press and leg press had a 40% reduction in their risk of dying from cancer. They adjusted for body mass index (BMI), body fat, and cardiorespiratory fitness, and the results still held strong (pun intended).2 In other words, there is something about simply being stronger that can lower our risk of getting cancer. Many felt there was something innately healthy about having more muscle, but another study associated weak hand grip strength with an increased risk of cancer, regardless of muscle size.3 So is it all about strength, or do muscles fight cancer?

Strength goes beyond lowering our risk of dying from cancer; it lowers our risk of dying from most major health issues. For instance, men exhibiting a lower vertical leap, fewer sit-ups, and decreased grip strength have a higher risk of dying, period.4 Men and women with moderate and high bench press and sit-up scores have lower risks of death,5 while men with a higher 1-repetition bench and leg press apparently live longer (even when we account for other health issues, like cardiovascular disease, smoking, and obesity).6

Muscles Fight Cancer – More Muscles = More Health?

The first thought that comes to mind is that more muscles means more strength, and both are a result of more exercise. Sure enough, when we take a close look through these studies, we do see that the strongest among us have less body fat, are in better shape, and have better “good” cholesterol values with lower blood sugar and triglycerides.1 This is not surprising.

However, in nearly all of these “muscles fight cancer” studies, other health issues were adjusted for and the findings still held. In other words, these studies seem to suggest that strength is independently associated with a lower risk of cancer and a higher chance of avoiding an untimely death, regardless of age, smoking, alcohol usage, or other health issues. But as we know, associations can only take us so far before we must explore the mechanisms that support them.

Muscles Fight Cancer – It’s the Muscles!

In the study above, the scientists found some intriguing results: the benefits of muscular strength overlap with those of cardiovascular fitness, but the benefits of muscular strength in decreasing the risk of cancer death work through different mechanisms.1 Perhaps a synergy exists – in other words, having more muscle and strength is good, and exercising them is better. Exercising our muscles leads to:

More sugar extracted from our blood by skeletal muscle and used for energy during exercise

Less cancer-promoting sugar and insulin floating around our blood

A decrease in the levels of hormones that, over a prolonged period, can lead to cancer. For instance, resistance training increases IGFBP-3, which binds to insulin-like growth factor (IGF), decreasing its ability to promote cancer (growth factors are normal within the human body, but too many can lead to excessive cellular growth, including cancer growth)7

Decreased inflammation (which when present, serves as a fertilizer for cancer)

However, recent studies have changed much of our thinking when it comes to muscle. There are many organs in our body that respond to stimuli and secrete hormones, which serve as messages to direct remote parts of the body. We are only recently starting to find more unconventional organ-like structures in the body. For instance, it is now well-established that our adipose tissue works like an endocrine organ – albeit a bad one – secreting inflammatory hormones and an excess of potentially cancer-stimulating hormones.8 Take estrogen, for example, a hormone that both men and women require to function normally. However, when supplied in higher than physiologically normal amounts from excess body fat, it can increase a woman’s risk of breast cancer. When women lose these additional pounds through dietary changes and exercise, estrogen levels decrease.9

Studies have now shown that fat is not the only recently discovered endocrine organ. Muscle may act similarly, though this time to the benefit of our health. The metabolic muscular organ within us secretes IL-6, an important cytokine that was once felt to be a bad guy that caused inflammation. Newer studies reveal that IL-6 has a healthy role and is actually a myokine, which is an endocrine hormone produced by muscle (myo = muscle) and released during contraction. In other words, while fat secretes harmful hormones, muscles squeeze out some healthy hormones during lifting.

Muscles Fight Cancer – The Physiologic Benefits of Having More Muscle

As discussed above, exercise has plenty of benefits. However, contracting our muscles during running, resistance training, or simply heavy lifting provides benefits that are entirely separate from those of exercise.

For instance, while fat tissue secretes the pro-inflammatory cytokine TNF-α (short for tumor necrosis factor, so named because our immune cells secrete it in the presence of tumor cells), our muscles secrete IL-6, which fights inflammation. As bad as fat is generally considered, muscle seems to stand in direct opposition to fat physiologically, and TNF versus IL-6 further embodies this difference.

Adipose-derived TNF is inflammatory, while muscle-derived IL-6 is anti-inflammatory.

Muscle-derived IL-6 signals to our body to break down lipids and burn fat.10

AMPK, or AMP-activated protein kinase, is an enzyme extensively expressed in our muscles, liver, and brain. It serves as an energy sensor and regulator, closely monitoring changes in energy status based on our dietary and lifestyle habits. ATP, the energy currency of our cells, is broken down to AMP by our cells. ATP has three phosphates (the T is for tri); when it loses one it becomes ADP (the D is for di, or two), and when it loses two phosphates it becomes AMP (the M is for mono). Without dipping too deep into boredom territory:

ATP → ADP + P

ATP → AMP + 2P

AMPK works to supply more ATP and increase our available energy molecules. AMPK achieves this through several mechanisms: breaking down glucose (sugar) to burn for energy, in part by pulling glucose out of our bloodstream and into our cells to be consumed; breaking down cholesterol and fat to be used as an efficient source of energy; building more mitochondria to use these fats and sugars to make more energy; and turning off cell building and replication.

Basically, AMPK signals to our body and cells that it is not a time for building, but rather for breaking down.
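The energy-sensing logic described above can be sketched as a toy model. This is purely an illustration of the idea – AMPK activity tracking the AMP:ATP ratio and flipping the cell from "building" to "breaking down" – not a physiological simulation; the threshold value and state labels are arbitrary assumptions for the example.

```python
# Toy illustration of AMPK as an energy sensor: activity rises as the
# AMP:ATP ratio rises, signaling the cell to break fuel down rather than build.
# The 0.1 threshold and the state labels are arbitrary, for illustration only.

def ampk_state(atp: float, amp: float, threshold: float = 0.1) -> str:
    """Return a simplified 'cellular mode' based on the AMP:ATP ratio."""
    if atp <= 0:
        raise ValueError("ATP must be positive")
    ratio = amp / atp
    # A high AMP level relative to ATP means energy is scarce: AMPK activates,
    # promoting glucose uptake and fat breakdown while suppressing growth.
    if ratio > threshold:
        return "catabolic (AMPK active)"
    return "anabolic (AMPK inactive)"

print(ampk_state(atp=10.0, amp=0.5))  # plentiful energy
print(ampk_state(atp=10.0, amp=3.0))  # energy scarcity, e.g. during exercise
```

In the second call the AMP:ATP ratio (0.3) crosses the threshold, so the sketch reports the "breaking down" state – mirroring what muscle contraction does to cellular energy status.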

AMPK and Cancer

AMPK is, in essence, the antithesis of cancer. While cancer cells burn large amounts of glucose and nutrients, this is mostly to build up biomass – or, simply put, to keep growing and spreading. AMPK, on the other hand, shuts off this process, blocking cancer growth so we can feed our own cells.12,13 AMPK actually blocks mTOR, a pathway that leads to cancer survival and growth14 – the same pathway that is blocked with targeted cancer drugs. These pathways are all affected by intermittent fasting, as this is a state of energy scarcity. Increased insulin sensitivity, which happens through exercise and muscle contraction, also appears to upregulate AMPK.

AMPK and Warburg

The Warburg hypothesis is something that comes up often when dealing with cancer and metabolism. Briefly put, Warburg showed that regardless of the presence of oxygen, cancer cells prefer to use glucose for energy derivation (through a process known as glycolysis). In our normal cells, preference is given to the mitochondria for energy production, as they are significantly more efficient. While AMPK may stop several pro-cancer pathways, newer data shows that it actually blocks the Warburg effect by blocking the ability of cancer cells to use sugar for energy.15

AMPK is upregulated via several mechanisms (in no apparent order):

Muscle contraction during exercise,16,17 with more intense exercise resulting in increased expression of AMPK18

Carbohydrate restriction (with or without fasting and even in the face of an increase in calories)19

Intermittent fasting20

Inflammation is the fertilizer of cancer cells; it fosters an environment where normal cells can turn cancerous and cancer cells can grow with less effort. Inflammation has recently been labeled a “hallmark of cancer cells.”21 Any method of decreasing this inflammation can provide health benefits and even decrease the risk of cancer. When muscles contract, they release IL-6 and several other hormonal signals that act to decrease inflammation. These “signals” alert other organs that energy status is down, stimulating pathways like AMPK,22 leading to a state of breaking down components for energy instead of stimulating growth processes like cancer. In other words, our muscles create signals that act at distant places within the body. These signals are plentiful, but one of the more famous examples is muscles signaling our bones to grow stronger23 – one of the many reasons why weight training strengthens bones.24 In a sense, the way our muscles “talk” with the rest of our body is only one of the many ways in which they improve our health and, ultimately, help in the fight against cancer.

Muscles Fight Cancer – The Physiologic Benefits of Lifting Weights

While our muscle cells (myocytes) secrete IL-6 at baseline, exercise increases this release up to 100-fold.25 Those of us who exercise and contract our muscles frequently experience a sensitization to IL-6 when at rest and not exercising.26 While excess fat tissue desensitizes us to the action of insulin (i.e., more insulin is needed to get rid of extra blood sugar), driving blood sugar to harmful levels, contracting our muscles sensitizes us to the benefits of muscle-derived IL-6.

The amount of IL-6 produced depends on several factors,27 including:

Intensity of the exercise

Duration of the exercise

Endurance capacity

Size of muscle contracting

As a side note, carb-loading before exercise appears to oppose this effect, blunting IL-6 release from the muscle – perhaps paying homage to ancient times, when exercise often meant hunting wild game on an empty stomach.28

Countering the benefits of weight lifting are the harms of inactivity, which, much like excess body fat, increases background inflammation.29 Exercise is such a powerful anti-inflammatory that it offsets the potential inflammatory damage from injecting E. coli endotoxin into healthy volunteers. For instance, while the endotoxin normally causes a doubling or tripling of harmful TNF, when injected during exercise, no increase occurs.30 Not surprisingly, trained athletes have lower levels of several inflammatory factors.31

Inflammation is the likely cause of or contributor to many diseases, including atherosclerosis, diabetes, and cancer. Oxidative (free radical) damage is also considered a major cause of disease and cancer.32 Much like inflammation, high levels of free radicals can damage our cells and DNA, exposing us to a higher risk of cancer. To counter this potential damage, our cells have spent millions of years developing a defense mechanism against free radicals – known as the antioxidant defense system – that creates a plethora of antioxidant compounds that can offset the harm of radicals.

When men are placed on a regimen of muscle-activating resistance training twice a week, many of these antioxidant defense mechanisms are activated. For instance, glutathione peroxidase, which defuses the potential damage from free radicals bound to lipids, is increased. Mitochondrial and cytosolic superoxide dismutase – which break apart, or dismutate, the potentially harmful free radical superoxide – are amplified. Interestingly, when weight lifting was compared to endurance training, the latter antioxidant mechanism was increased only by weight training.33 Muscle biopsies of legs after unilateral resistance training show similar findings: antioxidant defense mechanisms are boosted.34

Finally, while muscle and fat can be considered opposites by the hormones they produce, the same can be said about stimulated muscle versus inactivity. Muscle contraction releases large amounts of IL-6, which sensitizes our cells to its effect, resulting in less IL-6 circulating at rest. In other words, our cells get better at dealing with IL-6 and inflammation from exercise. Muscle-derived IL-6 is beneficial, but a constantly elevated amount of IL-6 can be inflammatory.

High levels of adipose tissue and inactivity lead to an opposite state when it comes to insulin. Both decrease insulin sensitivity, or in other words, more insulin is required to rid the blood of sugar, which eventually results in chronically elevated levels of circulating insulin and sugar within our blood. Both are unhealthy and can lead to cancer.35 Further closing the loop of association, exercise-derived IL-6 increases insulin sensitivity and can prevent this damaging state from inactivity and excessive body fat.

Muscles Fight Cancer – A Final Comment on Exercise, Blood Sugar and Cancer

Many people have recently questioned the benefit of exercise before or after a cancer diagnosis since it can result in elevated levels of blood glucose. This occurs when our body mobilizes available stores of glucose (from glycogen within the liver and muscles). As increased blood glucose levels correlate with an increased risk of several cancers,36 this may seem concerning on the surface. Furthermore, while IL-6 secreted from muscle increases the breakdown of fats and activation of the AMPK energy sensor can reduce the risk of cancer,37–39 the increase in PI3K, another pro-cancer pathway, is concerning.

Yet these changes primarily occur in the muscle, which is using the mobilized glucose. Furthermore, the rise in blood sugar is transient (glucose levels drop within 30 minutes afterward40), and as exercise and resistance training increase insulin sensitivity, overall we are left with lower blood glucose and insulin levels.41 The multitude of other physiologic changes listed above provides an overwhelming anti-cancer benefit. This has played out in several recent studies showing a decreased risk of breast cancer in women who exercise, with some data suggesting additional benefit from strenuous exercise.42,43 The benefits appear to be similar for women already diagnosed with breast cancer.44

Muscles Fight Cancer – Conclusions

Muscles fight cancer, and strength is associated with a decreased risk of cancer. The conclusions are obvious: if you are physically able, lift more weights, build more muscle, and increase your strength. Do it safely, do it right, and do it periodically to ensure that you are “health cost averaging.” Flex your muscles and squeeze out the beneficial anti-inflammatory messengers that direct the rest of your body to be healthy.

I hope this article has convinced you to lift (or throw around) some weights, put on some muscle, and fight cancer. The added benefits are stronger bones, a better physique, and hopefully, a longer life.

It looks like muscles fight cancer, but to do so, they must be put to work.