Natural immunomodulators that can help regulate our immune system without side effects have been sought for centuries, and all the while they've been sitting in the produce aisle. Plants produce thousands of active compounds, many of which modulate our immune system, but we can't forget the fungi (see Boosting Immunity While Reducing Inflammation).

Mushrooms have been used for centuries as folk remedies, and for good reason. Some have been shown to boost immune function, so much so that a type of fiber found in shiitake mushrooms is approved for use as adjunct chemotherapy, injected intravenously to help treat a variety of cancers by rallying our immune defenses.

More than 6,000 papers have been published on these so-called beta glucans, but almost all of the data about preventing infections had come from petri dish or lab animal studies, until a few years ago when a series of experiments on athletes showed beneficial effects in marathon runners (see Preserving Immune Function in Athletes With Nutritional Yeast). What about the rest of us? We didn't know... until now.

As I explore in my video, Nutritional Yeast to Prevent the Common Cold, the beta glucan fiber found in baker's, brewer's, and nutritional yeast helps maintain our body's defenses against pathogens even in nonathletes, according to a double-blind, randomized, placebo-controlled trial. The recurrence of common cold infections was reduced by 25% in those who ate the equivalent of about a spoonful of nutritional yeast a day, and they had fewer cold-related sleep difficulties when they did get sick.

What about half a spoonful a day? Still worked! Subjects experienced a big drop in common cold incidence and a reduction in symptoms as well. Why is this? The study found that not only were upper respiratory infection symptoms diminished, but mood states appeared to improve, for example, a significant boost in feelings of "vigor." So the researchers suggest that the yeast fiber may be able to counteract the negative effects of stress on the immune system.

In terms of side effects, two folks reported stomachaches, but they were both in the placebo group.

Unlike antibiotics and antivirals, which are designed to kill the pathogen directly, these yeast compounds appear to work instead by stimulating our immune defenses, and as such don't share the same antibiotic side effects. They stimulate our immune defenses presumably because our body recognizes them as foreign. But if the fiber is treated like an invader, might it trigger an inflammatory response? It turns out these fiber compounds may actually have an anti-inflammatory effect, suggesting nutritional yeast may offer the best of both worlds: boosting the infection-fighting side of the immune system while suppressing inflammatory components.

Yeast is high in purines, so those with gout, uric acid kidney stones, or a recent organ transplant may want to keep their intake to less than a teaspoon a day. But is there any downside for everyone else? In California, some packages of nutritional yeast are slapped with Prop 65 warning stickers, suggesting there's something in them exceeding cancer or birth defect safety limits. I called around to the companies, and it turns out the problem is lead. California state law says a product cannot contain more than half a microgram of lead per daily serving, so I contacted the six brands I knew about and asked them how much lead was in their products.

KAL originally said "<5 ppm," but when we called back they said "<3 ppm." Even if it's 3, that translates into less than 45 micrograms per serving, nearly 100 times the California limit. But perhaps that's better than Bob's Red Mill or Frontier Co-op, who evidently don't test at all. At least they got back to me; Red Star failed to respond to multiple attempts to contact them. Now Foods said they test for lead and claim that at least their recent batches meet the less-than-half-a-microgram California standard. Unfortunately, despite repeated requests, they would not provide me with documentation to substantiate their numbers. My favorite response was from Bragg, who sent me the lab's certificate of analysis showing less than 0.01 ppm, which works out to at most less than half the California standard, which I believe is the most stringent in the world. To put the numbers in context, in determining how much lead manufacturers can put into candy likely to be frequently consumed by small children, the Food and Drug Administration would allow about 2 micrograms a day in the form of lollipops, but as far as I'm concerned, the less lead the better.
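For readers who want to check those figures, here is the arithmetic sketched out. The only assumption beyond the numbers above is the serving size: I use 15 grams (about two tablespoons), a typical nutritional yeast serving, though brands vary.

```python
# For a solid food, 1 ppm of lead = 1 microgram of lead per gram of product.
SERVING_G = 15        # assumed serving size: ~2 tablespoons of nutritional yeast
CA_LIMIT_UG = 0.5     # California Prop 65 daily limit for lead, in micrograms

kal_ppm = 3                              # KAL's revised figure: "<3 ppm"
ug_per_serving = kal_ppm * SERVING_G
print(ug_per_serving)                    # 45 micrograms per serving
print(ug_per_serving / CA_LIMIT_UG)      # 90 -> "nearly 100 times" the limit

bragg_ppm = 0.01                         # Bragg's certificate: "<0.01 ppm"
print(round(bragg_ppm * SERVING_G, 2))   # 0.15 micrograms -- under half of 0.5
```

With a smaller assumed serving the micrograms drop proportionally, but even a single tablespoon of the KAL figure would land well over the half-microgram limit.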

I was so frustrated by the lack of transparency that I decided to test them for lead myself. NutritionFacts.org hired an independent lab to conduct our own tests for lead and shipped out eight samples of nutritional yeast in their original packaging. The lab used standard practices for lead testing known as the Official Methods of Analysis set by AOAC International, and lab technicians evaluated the lead values against California Prop 65 standards. Here are the results from the brands we tested:

Whole Foods - The test report shows lead levels at 0.012 ppm. It would take six tablespoons a day to exceed the MADL.*

So what do all those numbers mean? None of the brands tested exceeded California Prop 65 standards. No matter the brand, consuming a typical serving (2 tablespoons) per day is still well within safe limits.

* The Maximum Allowable Dose Level (MADL) for lead as a developmental toxin is 0.5 micrograms a day. How are MADLs calculated? Basically, scientists figure out the "no observable effect level," the level at which no birth defects or reproductive toxicity can be found, and then introduce a 1,000-fold safety buffer. So, for example, let's say there's some chemical that causes birth defects if expectant moms are exposed to two drops of the chemical a day, but there's no evidence that one drop a day is harmful. Do they set the Maximum Allowable Dose Level at one drop? No, they set it at 1/1000th of a drop to account for scientific uncertainty and to err on the side of caution. So saying that six tablespoons a day of nutritional yeast may exceed the MADL is in effect saying that the level of lead found in 6,000 tablespoons of nutritional yeast may cause birth defects. Like mercury, though, as far as I'm concerned, the less lead exposure the better. I hope this will inspire companies to do further testing to see if the levels we found were just flukes.
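The footnote's math can be sketched the same way. Working backward from the Whole Foods result, the assumed tablespoon weight of 7.5 grams is mine (nutritional yeast flakes vary by brand); the MADL and safety factor come from the Prop 65 framework described above.

```python
MADL_UG = 0.5          # Prop 65 MADL for lead as a developmental toxin, ug/day
SAFETY_FACTOR = 1000   # buffer applied below the "no observable effect level"

lead_ppm = 0.012       # Whole Foods test result: micrograms of lead per gram
TBSP_G = 7.5           # assumed grams of nutritional yeast per tablespoon

grams_to_reach_madl = MADL_UG / lead_ppm          # ~41.7 grams per day
tbsp_to_reach_madl = grams_to_reach_madl / TBSP_G
print(round(tbsp_to_reach_madl, 1))               # 5.6 -> "six tablespoons"

# The MADL sits 1000-fold below the no-observable-effect level, so the intake
# actually associated with harm corresponds to roughly 1000x as much yeast:
print(round(tbsp_to_reach_madl * SAFETY_FACTOR))  # ~5600 -> "6,000 tablespoons"
```

In other words, "six tablespoons exceeds the MADL" is a statement about crossing a deliberately ultra-conservative threshold, not about crossing into demonstrated harm.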

Milk is touted to build strong bones, but a compilation of all the best studies found no association between milk consumption and hip fracture risk. So, drinking milk as an adult might not help our bones, but what about during adolescence? Harvard researchers decided to put it to the test.

Studies have shown that greater milk consumption during childhood and adolescence contributes to peak bone mass, and is therefore expected to help avoid osteoporosis and bone fractures in later life. But that's not what researchers have found (as you can see in my video Is Milk Good for Our Bones?). Milk consumption during teenage years was not associated with a lower risk of hip fracture, and if anything, milk consumption was associated with a borderline increase in fracture risk in men.

It appears that the extra boost in total body bone mineral density from getting extra calcium is lost within a few years, even if you keep the calcium supplementation up. This suggests a partial explanation for the long-standing enigma that hip fracture rates are highest in populations with the greatest milk consumption. But that may explain only why fracture rates aren't lower; why would they be higher?

This enigma irked a Swedish research team, puzzled because studies again and again had shown a tendency toward a higher risk of fracture with a higher intake of milk. Well, there is a rare birth defect called galactosemia, in which babies are born without the enzymes needed to detoxify the galactose found in milk, so they end up with elevated levels of galactose in their blood, which can cause bone loss even as kids. So maybe, the Swedish researchers figured, even in normal people who can detoxify the stuff, drinking it every day might not be good for the bones.

And galactose doesn't just hurt the bones. Galactose is what scientists use to cause premature aging in lab animals--it can shorten their lifespan and cause oxidative stress, inflammation, and brain degeneration--with the equivalent of just one to two glasses of milk's worth of galactose a day. We're not rats, though. But given the high amount of galactose in milk, recommendations to increase milk intake for the prevention of fractures could be a conceivable contradiction. So, the researchers put it to the test, looking at milk intake and mortality as well as fracture risk.

A hundred thousand men and women were followed for up to 20 years. Researchers found that milk-drinking women had higher rates of death, more heart disease, and significantly more cancer for each glass of milk. Three glasses a day was associated with nearly twice the risk of premature death, and they had significantly more bone and hip fractures. More milk, more fractures.

Men in a separate study also had a higher rate of death with higher milk consumption, though at least they didn't have higher fracture rates. So, the researchers found a dose-dependent higher rate of both mortality and fracture in women, and a higher rate of mortality in men, with milk intake, but the opposite for fermented dairy products like soured milk and yogurt, which would go along with the galactose theory, since bacteria can ferment away some of the lactose. To prove it, though, we'd need a randomized controlled trial to examine the effect of milk intake on mortality and fractures. As the accompanying editorial pointed out, we'd better find out soon, since milk consumption is on the rise around the world.

One sign of changing U.S. demographics is that salsa has replaced ketchup as America's #1 table condiment. One of the popular salsa ingredients is cilantro, described as one of the "most polarizing and divisive food ingredients known." Some people love it; some people hate it. What's interesting is that the lovers and the haters appear to experience the taste differently. Individuals who like cilantro may describe it as "fresh, fragrant or citrusy, whereas those who dislike cilantro report that it tastes like soap, mold, dirt, or bugs." I don't know how people know what bugs taste like, but rarely are polarizing opinions about flavors so extreme. Maybe it's genetic.

Different ethnic groups do seem to have different rates of cilantro dislike, with Ashkenazi Jews scoring highest on the cilantro hate-o-meter (see The Cilantro Gene). Another clue came from twin studies, which show that identical twins tend to share cilantro preferences, whereas fraternal twins do not have such a strong correlation. Our genetic code is so big, though, containing about three billion letters, that to find some cilantro gene you'd have to analyze the DNA of some 10,000 people, and obviously genetic researchers have better things to do...or maybe not.

Researchers performed a genome-wide association study among 14,000 participants who reported whether cilantro tasted soapy, with replication in a distinct set of 11,000 people who declared whether they liked cilantro or not. And lo and behold, they found a spot on chromosome 11 that seemed to be a match. What's there? A gene called OR6A2 that enables us to smell certain chemicals like (E)-2-decenal, a primary constituent of cilantro and also...the defensive secretions of stink bugs. So maybe cilantro does taste like bugs! Cilantro lovers, then, may be genetic mutants with an inability to smell the unpleasant compound.

That may actually be an advantage, though, since cilantro is healthy stuff. In fact, that's the justification to do these kinds of studies: to see why some people don't like the taste of healthy foods.

Are the cilantro haters really missing out on much, though? Mother nature has been described as the "oldest and most comprehensive pharmacy of all time," and cilantro--called coriander around most of the world--is one of nature's oldest herbal prescriptions, credited with anti-microbial, anti-oxidant, anti-diabetic, anti-anxiety, and anti-epilepsy properties. However, these are all from preclinical studies, meaning studies done on cells in a test tube or lab animals. Studies like the "Anti-Despair Activity of Cilantro..." in which researchers placed animals in a "despair apparatus" (you don't want to know).

Finally, though, there was a human study on the anti-arthritis potential of cilantro. An earlier study performed in Germany of a lotion made from cilantro seeds showed it could decrease the redness of a sunburn, demonstrating it had some anti-inflammatory effects (though not as much as hydrocortisone, an over-the-counter steroid, or a prescription-strength steroid cream). If the cilantro plant is anti-inflammatory, why not give it to people with osteoarthritis and see if it helps? Researchers gave patients about 20 sprigs of cilantro daily for two months and reported a significant drop in ESR--a nonspecific indicator of inflammation--in the cilantro group. How did the patients do clinically, though? The study didn't say, but it did report a rather remarkable 50% drop in uric acid levels, suggesting that huge amounts of cilantro may be useful for those suffering from gout.

The cilantro lovers/haters factoid reminds me of the video Pretty in Pee-nk about the phenomenon of "beeturia," pink urine after beet consumption seen in some people.

Hundreds of thousands of deaths in the United States every year are attributed to obesity, now overtaking smoking as perhaps the main preventable cause of illness and premature death. In particular, excess body fatness is an important cause of most cancers, according to a meta-analysis of studies done to date. For some cancers, about half of the cases may be attributable to just being overweight or obese.

What's the connection, though? Why do individuals who are obese have increased cancer risk? To answer this question, we must consider the biochemical consequences of obesity, like IGF-1. Insulin-like growth factor 1 is a cancer-promoting growth hormone associated with a variety of common cancers in adults as well as children. Kids who got cancer had about four times the level of IGF-1 circulating in their bloodstream, whereas people growing up with abnormally low levels of IGF-1 don't seem to get cancer at all.

So, of course, drug companies have come up with a variety of IGF-1-blocking chemo agents with cute names like figitumumab, but with not-so-cute side effects "such as early fatal toxicities." So perhaps it's better to lower IGF-1 the natural way, by eating a plant-based diet, as vegan women and men have lower IGF-1 levels. Maybe, though, it's just because they're so skinny. The only dietary group that came close to the recommended BMI of 21 to 23 were those eating strictly plant-based diets, so maybe it's the weight loss that did it. Maybe we can eat whatever we want as long as we stay skinny.

To put that to the test, we'd have to find a group of people who eat meat but are still as slim as vegans. And that's what researchers did: long-distance endurance runners who had run an average of 48 miles a week for 21 years were as slim as vegans. So if we run some 50,000 miles, we too can maintain the BMI of a raw vegan. So what did they find?

If we look at blood concentrations of cancer risk factors among the groups of study subjects, we see that only the vegans had significantly lower levels of IGF-1. That makes sense given the role animal protein plays in boosting IGF-1 levels.

But the vegan group didn't just eat less animal protein; they ate fewer calories. And in rodents at least, caloric restriction alone reduces IGF-1 levels. So maybe the low IGF-1 among vegans isn't due to their slim figures but is effectively due to unintentional calorie restriction. To find out, we'd have to compare vegans to people practicing severe calorie restriction.

To do this, the researchers recruited vegans from the St. Louis Vegetarian Society, and went to the Calorie Restriction Society to find folks practicing severe caloric restriction. What did they find?

Only the vegan group got a significant drop in IGF-1. These findings demonstrate that, unlike in rodents, long-term severe caloric restriction in humans does not reduce the level of this cancer-promoting hormone. It's not how many calories we eat but our protein intake that may be the key determinant of circulating IGF-1 levels in humans, and so reduced protein intake may become an important component of anti-cancer and anti-aging dietary interventions.

Researchers out of Norway described the amount of pesticide residue found in GMO soy as high compared to the maximum allowable residue levels. The legal limit for glyphosate in foods had been set at 0.1-0.2 mg/kg, so these residues exceeded the legal limits by an average of about 2,000%, whereas organic and conventional non-GMO soy both had none.

So what did Monsanto do? Did the industry ditch the whole GMO thing and go back to using less pesticide so that residue levels wouldn't be so high? Or it could just change the definition of high. What if it could get authorities to raise the maximum residue level from 0.1 or 0.2 mg/kg up to 20? Then the residue levels wouldn't look so high anymore. And that is exactly what happened: the acceptance level of glyphosate in food and animal feed has been increased by authorities in countries that use Roundup-Ready GM crops. Brazil went up to 10 mg/kg, and the U.S. and Europe now accept up to 20. In all of these cases, the maximum residue levels appear to have been adjusted not based on new evidence indicating glyphosate toxicity was less than previously understood, but pragmatically in response to actual observed increases in the residue content of GMO soybeans--otherwise it wouldn't be legal to sell the stuff.

What evidence do we have, though, that these kinds of residues are harmful? For 12 years we've heard that Roundup interferes with embryonic development, but that study was about sea urchin embryos. For 14 years we heard that Roundup may disrupt hormones, but that's in mouse testicles.

Blogs will dish about concerning new studies implicating Roundup in male infertility, but if we look at the study, it's about rat testicles. Some blogs cite studies with disturbing titles like "prepubertal exposure alters testosterone levels and testicular shape," but they're talking about puberty in rats--though that doesn't make as catchy a blog title.

Why not use human tissue? Women are having babies every day--why not just experiment on human placentas, which would otherwise just get thrown away? In 2005, researchers did just that. And despite all the negative effects in rodents, glyphosate, the active ingredient in Roundup, didn't seem to have much of a toxic effect on human cells even at high doses, or much effect on a hormone-regulating enzyme, leading Monsanto-funded reviewers to conclude that regardless of what hazards might be alleged based on animal studies, "glyphosate is not anticipated to produce adverse developmental and reproductive effects in humans."

But pure glyphosate isn't what's sprayed on crops; Roundup is, and it contains a variety of adjuvants and surfactants meant to help the glyphosate penetrate into tissues. And indeed, when the study was repeated with what's actually sprayed on GMO crops, there were toxic and hormonal effects even at doses smaller than the 1 to 2 percent concentration used out in the fields.

Similar results were found for other major pesticides. It took until 2014, but eight out of nine pesticide formulations tested proved up to one thousand times more toxic than their so-called active ingredients, so when we test only the isolated chemicals, we may not get the whole story. Roundup was found to be 100 times more toxic than glyphosate itself. In fact, Roundup turned out to be among the most toxic pesticides tested, even though it's commonly believed to be among the safest--an idea spread by its manufacturer, Monsanto. This inconsistency between scientific fact and industrial claim may be attributed to the huge economic interests involved.

It's the dose that makes the poison, though. Do we have evidence that the levels of Roundup chemicals not only found on crops, but also in our bodies after eating those crops actually have adverse effects? That's the subject of the video: GMO Soy and Breast Cancer.

Commercial interests can have a corrupting effect on the science of nutrition and hold sway over institutions that are supposed to operate in the public interest. See for example:

Recently, the prominent science journal Nature editorialized that we are now swimming in information about genetically modified crops, but that much of the information is wrong--on both sides of the debate. "But a lot of this incorrect information is sophisticated, backed by legitimate-sounding research and written with certitude," the editors added, noting that with GMOs, "a good gauge of a statement's fallacy is the conviction with which it is delivered."

To many in the scientific community, GMO concerns are dismissed as one big conspiracy theory. In fact, one item in a psychological test of belief in conspiracy theories asked people if they thought food companies would have the audacity to be dishonest about genetically modified food. The study concluded that many people were cynical and skeptical with regard to advertising tricks, as well as the tactics of organizations like banks and alcohol, drug, and tobacco companies. That doesn't sound like conspiracy theory to me; that sounds like business as usual.

We must remember there is a long legacy of scientific misconduct. Throw in a multi-billion dollar industry, and one can imagine how hard it is to get to the truth of the matter. There are social, environmental, economic, food security, and biodiversity arguments both pro and con about GMOs, but those are outside my area of expertise. I'm going to stick to food safety. And as a physician, I'm a very limited veterinarian--I only know one species (us!). So, I will skip the lab animal data and ask instead: What human data do we have about GMO safety?

One study "confirmed" that DNA from genetically modified crops can be transferred into humans who eat them, but that's not what the study found--just that plant DNA in general may be found in the human bloodstream, with no suggestion of harm (see Are GMOs Safe? The Case of Bt Corn).

Another study, however, did find a GMO crop protein in people. The "toxin" was detected in 93 percent of blood samples of pregnant women, 80 percent of umbilical cord blood samples, and 69 percent of samples from non-pregnant women. The toxin they're talking about is an insecticidal protein produced by Bt bacteria whose gene was inserted into the corn's DNA to create so-called Bt-corn, which has been incorporated into animal feed. If it's mainly in animal feed, how did it get into the bodies of women? They suggest it may be through exposure to contaminated meat.

Of course, why get GMOs second-hand when you can get them directly? The next great frontier is transgenic farm animals. A genetically modified salmon was the first to vie for a spot at the dinner table. Then, in 2010, transgenic cows, sheep, goats, and pigs were created, genetically modified for increased muscle mass based on the so-called mighty mouse model. Frankenfurters!

But back to the children of the corn and their mothers. When researchers call it a toxin, they mean a toxin to corn worms, not necessarily to people. In fact, I couldn't find any data linking Bt toxin to human harm, which is a good thing, since it's considered so non-toxic that it's one of the few pesticides allowed to be sprayed on organic fruits and vegetables.

For more on the public health implications of genetically modified crops, see:

Recently, there has been research examining the connection between poultry consumption and weight gain. One study out of the Netherlands, examining about 4,000 people, correlated chicken consumption with weight gain. Another study followed 89,000 people in four other countries and found that animal protein intake was associated with long-term weight gain, with poultry the worst offender, associated with 40 percent more weight gain than red meat or processed meat.

What makes poultry so bad? Yes, chickens are fatty these days because of the way we've genetically manipulated them--they have up to ten times more fat and calories than they used to--but one bizarre theory postulated that the weight gain might be due to an obesity-causing chicken virus. In one study, one in five obese humans tested positive for the chicken virus SMAM-1, and those exposed to the virus averaged 33 pounds heavier than those testing negative.

SMAM-1 was the first chicken virus to be associated with human obesity, but not the last. SMAM-1 was able to effectively transmit obesity from one chicken to another when the birds were caged together, similar to Ad-36, a human adenovirus that was first associated with obesity in chickens and mice. Ad-36 spreads quickly from one chicken to another via nasal, oral, or fecal excretion and contamination, causing obesity in each infected chicken. This, of course, raises serious concerns about Ad-36-induced adiposity in humans.

The easiest way to test this hypothesis would be to experimentally infect humans with the virus. However, ethical considerations preclude that, so the evidence will have to remain indirect: we must rely on population studies, similar to how researchers nailed down the link between smoking and lung cancer. About 15 percent of Americans are already infected with Ad-36, so we can follow them and see what happens. That's exactly what a research team out of Taiwan did (highlighted in my video Infectobesity: Adenovirus 36 and Childhood Obesity). They followed 1,400 Hispanic men and women for a decade and found that not only were those exposed to the virus fatter than those who were not, but over the ten years, those with a history of infection also gained a greater percentage of body fat.

Most studies done to date on adults have found a connection between exposure to Ad-36 and obesity, and all studies done so far on childhood obesity show a higher prevalence of infection in obese children than in non-obese children. We're now up to more than a thousand children studied, with similar findings. Obese children who tested positive for the virus weighed 35 pounds more than children who tested negative.

The virus appears both to increase the number of fat cells by mobilizing precursor stem cells and to increase the accumulation of fat within those cells. In liposuction samples, the fat-cell precursors from people who arrived at the clinic already infected turned into fat cells at about five times the rate of those from uninfected people. And fat taken from non-infected people that was then exposed to the virus started sucking up fat at a faster rate, potentially inducing obesity without any increase in food intake.

Just as Ad-36 can be transmitted horizontally from one infected chicken to another in the same cage, subsequently causing obesity in each chicken, this same virus is also easily transmitted among humans, raising the question of whether at least some cases of childhood obesity could be considered an infectious disease. Researchers publishing in the International Journal of Pediatric Obesity speculate that this animal adenovirus may have mutated into a human adenovirus capable of infecting humans and causing obesity.