Milk is touted to build strong bones, but a compilation of all the best studies found no association between milk consumption and hip fracture risk, so drinking milk as an adult might not help bones, but what about in adolescence? Harvard researchers decided to put it to the test.

Studies have shown that greater milk consumption during childhood and adolescence contributes to peak bone mass, and is therefore expected to help avoid osteoporosis and bone fractures in later life. But that's not what researchers have found (as you can see in my video Is Milk Good for Our Bones?). Milk consumption during teenage years was not associated with a lower risk of hip fracture, and if anything, milk consumption was associated with a borderline increase in fracture risk in men.

It appears that the extra boost in total body bone mineral density from getting extra calcium is lost within a few years, even if you keep the calcium supplementation up. This suggests a partial explanation for the long-standing enigma that hip fracture rates are highest in populations with the greatest milk consumption. This may be an explanation for why they're not lower, but why would they be higher?

This enigma irked a Swedish research team, puzzled because studies again and again had shown a tendency toward a higher risk of fracture with a higher intake of milk. Well, there is a rare birth defect called galactosemia, where babies are born without the enzymes needed to detoxify the galactose found in milk, so they end up with elevated levels of galactose in their blood, which can cause bone loss even as kids. So maybe, the Swedish researchers figured, even in normal people who can detoxify the stuff, it might not be good for the bones to be drinking it every day.

And galactose doesn't just hurt the bones. Galactose is what scientists use to cause premature aging in lab animals--it can shorten their lifespan and cause oxidative stress, inflammation, and brain degeneration--with the equivalent of only one to two glasses of milk's worth of galactose a day. We're not rats, though. But given the high amount of galactose in milk, recommendations to increase milk intake for prevention of fractures could be a conceivable contradiction. So, the researchers decided to put their theory to the test, looking at both milk intake and mortality as well as fracture risk.

A hundred thousand men and women were followed for up to 20 years. Researchers found that milk-drinking women had higher rates of death, more heart disease, and significantly more cancer with each glass of milk. Three glasses a day was associated with nearly twice the risk of premature death, and these women had significantly more bone and hip fractures. More milk, more fractures.

Men in a separate study also had a higher rate of death with higher milk consumption, but at least they didn't have higher fracture rates. So, the researchers found a dose-dependent higher rate of both mortality and fracture in women, and a higher rate of mortality in men, with milk intake, but the opposite for other dairy products like soured milk and yogurt, which would go along with the galactose theory, since bacteria can ferment away some of the lactose. To prove it, though, we need a randomized controlled trial to examine the effect of milk intake on mortality and fractures. As the accompanying editorial pointed out, we had better find this out soon, since milk consumption is on the rise around the world.

The food industry, like the tobacco companies and other drug lords, has been able to come up with products that tap into the same dopamine reward system that keeps people smoking cigarettes, using marijuana, and eating candy bars (See Are Sugary Foods Addictive?). New research, highlighted in my video Are Fatty Foods Addictive? suggests that fat may have similar effects on the brain. If people are fed yogurt packed with butter fat, within 30 minutes they exhibit the same brain activity as those who just drank sugar water.

People who regularly eat ice cream (sugar and fat) have a deadened dopamine response in their brains in response to drinking a milkshake. It's similar to when drug abusers have to use more and more to get the same high. Frequent ice cream consumption "is related to a reduction in reward-region (pleasure center) responsivity in humans, paralleling the tolerance observed in drug addiction." Once we've so dulled our dopamine response, we may subsequently overeat in an effort to achieve the degree of satisfaction experienced previously, contributing to unhealthy weight gain.

What do fatty and sugary foods have in common? They are energy-dense. It may be less about the number of calories than their concentration. Consumption of a calorie-dilute diet doesn't lead to deadened dopamine responsivity, but a calorie-dense diet with the same number of calories does. It's like the difference between cocaine and crack: same stuff chemically, but by smoking crack cocaine we can deliver a higher dose quicker to our brain.

As an aside, I found it interesting that the control drink in these milkshake studies wasn't just water. They can't use water because our brain actually tastes water on the tongue (who knew!). So instead the researchers had people drink a solution "designed to mimic the natural taste of saliva." Ew!

Anyway, with this new understanding of the neural correlates of food addiction, there have been calls to include obesity as an official mental disorder. After all, both obesity and addiction share the inability to restrain behavior in spite of an awareness of detrimental health consequences, one of the defining criteria of substance abuse. We keep putting crap in our bodies despite the knowledge that we have a problem that is likely caused by the crap, yet we can't stop (a phenomenon called the "pleasure trap").

Redefining obesity as an addiction, a psychiatric disease, would be a boon to the drug companies that are already working on a whole bunch of drugs to muck with our brain chemistry. For example, subjects given an opiate blocker (like what's done for people with heroin overdoses to block the effects of the drug) eat significantly less cheese -- it just doesn't do as much for them anymore when their opiate receptors are blocked.

Rather than taking drugs, though, we can prevent the deadening of our pleasure center in the first place by sticking to foods that are naturally calorically dilute, like whole plant foods. This can help bring back our dopamine sensitivity such that we can again derive the same pleasure from the simplest of foods (see Changing Our Taste Buds). And this is not just for people who are obese. When we regularly eat calorie dense animal and junk foods like ice cream, we can blunt our pleasure so that we may overeat to compensate. When our brain down-regulates dopamine receptors to deal with all these jolts of fat and sugar, we may experience less enjoyment from other activities as well.

That's why cocaine addicts may have an impaired neurological capacity to enjoy sex, and why smokers have an impaired ability to respond to positive stimuli. Since these all involve the same dopamine pathways, what we put into our body--what we eat--can affect how we experience all of life's pleasures.

So to live life to the fullest, what should we do? The food industry, according to some addiction specialists, "should be given incentives to develop low calorie foods that are more attractive, palatable and affordable so that people can adhere to diet programs for a long time." No need! Mother Nature beat them to it--that's what the produce aisle is for.

By starting to eat healthfully, we can actually change how things taste. Healthiest means whole plant foods, which tend to be naturally dilute given their water and fiber content. Not only is fiber calorie-free, but one might think of it as having "negative" calories, given the fermentation of fiber in our bowel into anti-obesity compounds (as well as anti-inflammatory and anti-cancer compounds). For this reason, those eating plant-based diets take in hundreds fewer calories without even trying. (See my video Nutrient-Dense Approach to Weight Management).

How can we overcome our built-in hunger drives for salt, sugar, and fat? We now have evidence showing that if we go a few weeks cutting down on junk food and animal products, our tastes start to change. We may actually be able to taste fat--just like we taste sweet, sour, and salty--and people on low fat diets start liking low fat foods more and high fat foods less.

Our tongues appear to become more sensitive to fat if we eat less of it. And the more sensitive our tongues become, the less butter, meat, dairy, and eggs study subjects ate. We also get a blunted taste for fat if we eat too much. This diminished fat sensitivity has been linked to eating more calories; more fat; more dairy, meat, and eggs; and becoming fatter ourselves. And this change in sensation, this numbing of our ability to taste fat, can happen within just a few weeks.

In my video Changing Our Taste Buds, you can see that when researchers put people on a low-salt diet, over the ensuing weeks the study subjects liked the taste of salt-free soup more and more, and the taste of salty soup less and less. Our tastes physically change. If we let them salt their own soup to taste, they add less and less the longer they're on the diet. By the end, soup tastes just as salty with half the salt. For those who've been on sodium-restricted diets, regularly salted foods taste too salty, and they actually prefer less salty food. That's why it's important for doctors to explain to patients that a low-salt diet will gradually become more palatable as their taste for salt diminishes. The longer we eat healthier foods, the better they taste.

That's why I've always encouraged my patients to think of healthy eating as an experiment. I ask them to give it three weeks. The hope is by then they feel so much better (not only physically, but in the knowledge that they don't have to be on medications for chronic diseases the rest of their lives after all!--see Say No to Drugs by Saying Yes to More Plants) and their taste sensitivity has been boosted such that whole foods-as-grown regain their natural deliciousness.

To see how a healthy diet can make you feel, check out the Physicians Committee for Responsible Medicine's 21-Day Kickstart program at http://www.21daykickstart.org/.

Could exercise be creating harmful free radicals? Oxidizing glucose to produce energy for our bodies is messy, creating free radicals the way cars burning their fuel produce combustion by-products out the exhaust. This happens even if we’re just idling, living our day-to-day lives. What if we rev our bodies up and start exercising and really start burning fuel? Then we create more free radicals, more oxidative stress, and so, need to eat even more antioxidant-rich foods.

Why do we care about oxidative stress? Well, it’s “implicated in virtually every known human disease and there is an increasing body of evidence linking free radical production to the process of aging.” Why? Because free radicals can damage DNA, our very genetic code. Well, if free radicals damage DNA, and exercise creates free radicals, does exercise damage our DNA if we don’t have enough antioxidants in our system to douse the radicals? Yes, in fact, ultra-marathoners show evidence of DNA damage in about 10% of their cells tested during a race, damage that may persist for up to two weeks after a marathon. But what about just short bouts of exercise? We didn’t know until recently.

After just five minutes of moderate or intense cycling we can get an uptick in DNA damage. We think it’s the oxidative stress, but “regardless of the mechanism of exercise-induced DNA damage” the fact that a very short bout of high-intensity exercise can cause an increase in damage to DNA is a cause for concern. But we can block oxidative damage with antioxidant-rich foods.

Of course, when drug and supplement companies hear “antioxidant-rich foods” they think, pills! We can’t make billions on broccoli, so “Pharmacological antioxidant vitamins have been investigated for a prophylactic effect against exercise-induced oxidative stress.” However, large doses are often required and in pill form may ironically lead to a state of pro-oxidation and even more oxidative damage. For example, guys doing arm curls taking 500 mg of vitamin C appeared to have more muscle damage, inflammation, and oxidative stress.

So, instead of vitamin supplementation, how about supplementation with watercress, the badass of the broccoli family? What if, two hours before exercise, we eat a serving of raw watercress, then get thrown on a treadmill whose slope gets cranked up until we basically collapse? Athletes who didn't preload with watercress before working out developed a certain amount of free radicals in their blood stream at rest and after exhaustive exercise (See Preventing Exercise-Induced Oxidative Stress with Watercress), which is what we’d expect. If we eat a super-healthy antioxidant-packed plant food like watercress before we exercise can we blunt this effect? We actually end up better than we started! At rest, after the watercress, we may start out with fewer free radicals, but only when we stress our body to exhaustion can we see the watercress really flex its antioxidant muscle.

What happens to DNA damage? Well, in a test tube, if we take some human blood cells bathed in free radicals, we can reduce the DNA damage the radicals cause by 70% within minutes of dripping some watercress on them. But does that happen within the human body if we just eat it?

If we exercise without watercress in our system, DNA damage shoots up, but if we’ve been eating a single serving a day for two months, our body’s so juiced up on green leafy goodness that we get no significant damage after punishing ourselves on the treadmill. So with a healthy diet, can we get all the benefits of strenuous exercise without the potential risks?

Regular physical exercise is a key component of a healthy lifestyle, but it can elicit oxidative stress. To reduce that stress, some have suggested pills to improve one’s antioxidant defense system, but “those eating more plant-based diets may naturally have an enhanced antioxidant defense system” without eating pills to counter exercise-induced oxidative stress. Plant foods average 64 times more antioxidants than meat, fish, eggs, and dairy (See Antioxidant Power of Plant Foods Versus Animal Foods). And on top of that, the animal protein itself can have pro-oxidant effects. Anyone eating sufficient quantities of whole healthy plant foods could plausibly reach an antioxidant status similar to vegetarians. It’s not just about what we’re eating less of – saturated fat and cholesterol – but what we're eating more of, the phytonutrients. Whether it’s about training longer or living longer, we’ve got to eat more plants.

The lactic acid that makes yogurt tangy is the same lactic acid that builds up in our muscles when we exercise strenuously. Instead of bacteria fermenting the sugar in milk to make energy for themselves, our muscles ferment sugar in our diet to produce energy to contract. If, like when we’re sprinting, lactic acid builds up in our muscles faster than it can be removed we can end up with a burning sensation in our muscles, forcing us to stop.

Now if we train, we can increase the number of blood vessels in our muscles and clear out the lactate faster. For example, when researchers took some overweight sedentary women and started them on an aerobic training program of running and walking, at the end of three months, their lactate levels during exercise dropped 17%. But those on the same program who drank two cups of orange juice a day dropped their levels 27%. They did the same exercise program, but the citrus group experienced a significant decrease in blood lactate concentration, indicating an improvement in physical performance with less muscle fatigue.

I don’t recommend drinking juice, though, because we’re losing all that wonderful fiber that slows the rate of fruit sugar absorption into our system. In my video, Reducing Muscle Fatigue with Citrus, we can see the blood sugar spike one might expect after drinking Coca-Cola. Compare that to the spike we see with orange juice: no difference. However, if we eat the same quantity of sugar in the form of orange slices, we experience a significantly smaller spike in blood sugar.

So the whole fruit is nearly always better than fruit juice. Now this is not to say OJ isn’t better than coke. OJ has those citrus phytonutrients like hesperidin, which may be why the women’s triglycerides didn’t go up even though they were drinking two cups of fruit juice every day. Hesperidin may actually help lower our digestion of fats, but once we get up to three cups a day, we really can start bumping our triglycerides.

The burning sensation during strenuous exercise may be related to the build-up of lactic acid in our muscles, but that’s different than the delayed onset muscle soreness that occurs in the days following a bout of extreme physical activity. That’s thought to be due to inflammation caused by muscle cell damage, little micro-tears in the muscle. If it’s an inflammatory reaction then might anti-inflammatory phytonutrients help? Find out in my video Reducing Muscle Soreness with Berries.

Six hundred years ago, people living along the coast of Carragheen County Ireland started using a red algae, which came to be known as Irish moss, to make a jellied dessert. This moss is now the source of carrageenan, a fat substitute (perhaps most famously used in the failed McLean Deluxe) and a food additive used as a thickener in dairy and nondairy products.

In 2008 I raised a concern about carrageenan. We had known for decades that it had harmful effects on laboratory animals, but in 2008 the first study on human cells to “suggest that carrageenan exposure may have a role in development of human intestinal pathology” was conducted. This was all five years ago, though. What’s the update? (See Is Carrageenan Safe?)

After the activation of inflammatory pathways was demonstrated in actual human colon tissue samples, Europe pulled it from infant formula, concerned that infants might be getting too much at such a vulnerable age. The latest suggests carrageenan consumption could possibly lead to a leaky gut by disrupting the integrity of the tight junctions that form around the cells lining our intestine—the barrier between our bloodstream and the outside world. This was just an in vitro study, though, done in a Petri dish. We still don’t know what effects, if any, occur in whole human beings. Some researchers advise consumers to select food products without carrageenan, accusing the FDA of “ignoring [its] harmful potential.”

Personally, after having reviewed the available evidence, I continue to view carrageenan the way I view acrylamide, another potential, but unproven hazard. Acrylamide is a chemical formed by cooking carbohydrates at high temperatures. So should we avoid eating such foods, like the EPA suggests? Well, “Food safety concerns must also be considered [in the context of dietary] consequences.” Where’s it found the most? Foods that are already unhealthy.

So sure, we can use our concern about the probable carcinogen, acrylamide, as yet another reason to avoid potato chips and French fries, but until we know more I wouldn’t cut out healthful foods like whole grain bread. (For more on acrylamide, see my video Acrylamide in French Fries).

Similarly, I’d use potential concerns about carrageenan as additional motivation to avoid unhealthy foods like cream cheese, but I wouldn’t cut out healthful foods until we know more. I would, however, suggest that those with inflammatory bowel disease or other gastrointestinal problems try cutting out carrageenan, at least temporarily, to see if symptoms improve.

Trans fats are basically found in only one place in nature: animal fat. The food industry, however, found a way to synthetically create these toxic fats by hardening vegetable oil in a process called hydrogenation, which rearranges their atoms to make them behave more like animal fats.

Although most of America’s trans fat intake has traditionally come from processed foods containing partially-hydrogenated oils, a fifth of the trans fats in the American diet used to come from animal products—1.2 grams out of the 5.8 total consumed daily. Now that trans fat labeling has been mandated, however, and places like New York City have banned the use of partially hydrogenated oils, the intake of industrially-produced trans fat is down to about 1.3 grams, so about 50 percent of America’s trans fats now come from animal products.
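For those who want to check the arithmetic behind those two percentages, here is a quick sketch (the gram figures are the ones cited above; the variable names are just for illustration):

```python
# Trans fat intake figures cited above, in grams per day.
animal_g = 1.2          # from animal products (roughly unchanged)
total_historical_g = 5.8  # total daily intake before labeling/bans
industrial_now_g = 1.3  # industrial trans fat after labeling and local bans

# Historical share from animal products: 1.2 / 5.8 ≈ 0.21, i.e., "a fifth"
historical_share = animal_g / total_historical_g

# Current share from animal products: 1.2 / (1.2 + 1.3) = 0.48, i.e., "about 50 percent"
current_share = animal_g / (animal_g + industrial_now_g)

print(f"{historical_share:.0%}")  # 21%
print(f"{current_share:.0%}")     # 48%
```

So "a fifth" and "about 50 percent" both follow directly from the gram figures.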

Which foods naturally have significant amounts of trans fat? According to the official USDA nutrient database, cheese, milk, yogurt, burgers, chicken fat, turkey meat, bologna, and hot dogs contain about 1 to 5 percent trans fats (see the USDA chart in Trans Fat In Meat And Dairy). There are also tiny amounts of trans fats in non-hydrogenated vegetable oils due to steam deodorization or stripping during the refining process.

Is getting a few percent trans fats a problem, though? The most prestigious scientific body in the United States, the National Academies of Science (NAS), concluded that the only safe intake of trans fats is zero. In their report condemning trans fats, they couldn’t even assign a Tolerable Upper Daily Limit of intake because “any incremental increase in trans fatty acid intake increases coronary heart disease risk.” There may also be no safe intake of dietary cholesterol, which underscores the importance of reducing animal product consumption. See my video Trans Fat, Saturated Fat, and Cholesterol: Tolerable Upper Intake of Zero.

There’s been controversy, though, as to whether the trans fats naturally found in animal products are as bad as the synthetic fats in partially hydrogenated junk food. The latest study supports the notion that trans fat intake, irrespective of source—animal or industrial—increases cardiovascular disease risk, especially, it appears, in women.

“Because trans fats are unavoidable on ordinary, non-vegan diets, getting down to zero percent trans fats would require significant changes in patterns of dietary intake,” reads the NAS report. One of the authors, the Director of Harvard’s Cardiovascular Epidemiology Program, famously explained why—despite this—they didn’t recommend a vegan diet: “We can’t tell people to stop eating all meat and all dairy products,” he said. “Well, we could tell people to become vegetarians,” he added. “If we were truly basing this only on science, we would, but it is a bit extreme.”

Wouldn’t want scientists basing anything on science now would we?

“Nevertheless,” the report concludes, “it is recommended that trans fatty acid consumption be as low as possible while consuming a nutritionally adequate diet.”

Even if you eat vegan, though, there’s a loophole in labeling regulations that allows foods with a trans fat content of less than 0.5 grams per serving to be listed as having—you guessed it—zero grams of trans fat. This labeling is misleading the public by allowing foods to be labeled as “trans fat free” when they are, in fact, not. So to avoid all trans fats, avoid meat and dairy, refined oils, and anything that says partially hydrogenated in the ingredients list, regardless of what it says on the Nutrition Facts label.

While unrefined oils such as extra virgin olive oil should not contain trans fats, to boost the absorption of carotenoids in your salad, why not add olives themselves, or whole food sources of fat such as nuts or seeds?

Babies delivered via caesarean section appear to be at increased risk for various allergic diseases. The thought is that vaginal delivery leads to the first colonization of the baby’s gut with maternal vaginal bacteria. C-section babies are deprived of this natural exposure and have been found to exhibit a different gut flora. This concept is supported by research noting that a disturbance in maternal vaginal flora during pregnancy may be associated with early asthma in their children. This all suggests our natural gut flora can affect the development of our immune system (for better or for worse).

In adulthood, two studies published back in 2001 suggested that probiotics could have systemic immunity-enhancing effects. Subjects given a probiotic regimen saw a significant boost in the ability of their white blood cells to chomp down on potential invaders. (You can watch a video of white blood cells doing their thing in my video Clinical Studies on Acai Berries, a must-see for biology geeks!) And even after the probiotics were stopped, there was still enhanced immune function a few weeks later compared to baseline (check out my 4-min video Preventing the Common Cold with Probiotics? to see the graph). A similar boost was found in the ability of their natural killer cells to kill cancer cells.

Improving immune cell function in a petri dish is nice, but does this actually translate into people having fewer infections? For that, we had to wait another 10 years, but now we have randomized double-blind placebo controlled studies showing that those taking probiotics may have significantly fewer colds, fewer sick days, and fewer symptoms. The latest review of the best studies to date found that probiotics, such as those in yogurt, soy yogurt, or supplements, may indeed reduce one’s risk of upper respiratory tract infection, but the totality of evidence is still considered weak, so it’s probably too early to make a blanket recommendation.

Unless one has suffered a major disruption of gut flora by antibiotics or an intestinal infection—in other words unless one is symptomatic with diarrhea or bloating—I would suggest focusing on feeding the good bacteria we already have, by eating so-called prebiotics, such as fiber. After all, as I noted in Preventing and Treating Diarrhea with Probiotics, who knows what you’re getting when you buy probiotics. They may not even be alive by the time we buy them. Then they have to survive the journey down to the large intestine (Should Probiotics Be Taken Before, During, or After Meals?). Altogether, this suggests that the advantages of prebiotics—found in plant foods—outweigh those of probiotics. And by eating raw fruits and vegetables we may be getting both! Fruits and vegetables are covered with millions of lactic acid bacteria, some of which are the same type used as probiotics. So when studies show eating more fruits and vegetables boosts immunity, prebiotics and probiotics may be playing a role.

When researchers at the Emerging Pathogens Institute ranked foodborne pathogens last year to figure out which was the worst, Salmonella was number one on their list. Salmonella was ranked the food poisoning bacteria with the greatest public health burden on our country, the leading cause of food poisoning hospitalization, and the number one cause of food-related death. Where do you get it from?

In my video Total Recall I talked about the threat of eggs. According to the FDA, 142,000 Americans are sickened every year by eggs contaminated with Salmonella. That’s an egg-borne epidemic every year. But Salmonella in eggs was only ranked the tenth worst pathogen-food combination. Salmonella in poultry ranks even worse, the fourth worst contaminated food in the United States in terms of both cost and quality-adjusted years of life lost. In terms of getting Salmonella poisoning from various U.S. foods, eating chicken may be eight times riskier than eating eggs.

Due to a strengthening of food safety regulations under the Clinton administration, the number of Americans poisoned by chicken each year dropped from about 390,000 to 200,000. This was rightly hailed as a significant accomplishment. So now eating chicken only sickens 200,000 people in the U.S. every year. Isn’t that a bit like some toy company boasting that they’ve reduced the amount of lead in their toys and they’re now poisoning 40 percent fewer kids? Hundreds of thousands sickened isn’t exactly something to boast about, and the numbers have since rebounded upwards.

Since the late ’90s, human Salmonella cases have increased by 44 percent. The rebound in incidence of Salmonella infection is likely a result of several factors, but one important risk factor singled out is eating chicken, since the proportion of chicken carrying infection has increased. When people think manure in meat, they typically think ground beef, but when you look at E. coli levels, there’s fecal matter in about 65 percent of American beef, yet in more than 80 percent of poultry (chicken and turkey).

Why have we seen a decrease in the Jack-in-the-Box E. coli O157 but not chicken-borne Salmonella? In the last decade or so, E. coli infection from beef has dropped by about 30 percent. Salmonella, on the other hand, has actually increased over the last 15 years. One reason for the difference is that O157:H7 was declared an “adulterant,” defined as any poisonous or deleterious substance that may render meat injurious to health. So selling E. coli laden beef is illegal.

Why is beef laced with E. coli contaminated fecal matter considered adulterated, but chicken laced with Salmonella contaminated fecal matter okay? Salmonella certainly kills more people than the banned E.coli. It all goes back to a famous case I detail in my video Salmonella in Chicken & Turkey: Deadly But Not Illegal, when the American Public Health Association sued the USDA for putting its stamp of approval on meat contaminated with Salmonella.

What could the USDA possibly say in meat’s defense? They pointed out that there have been Salmonella outbreaks linked to dairy and eggs, too, arguing that since “there are numerous sources of contamination which might contribute to the overall problem,” it would be “unjustified to single out the meat industry and ask that the Department require it to identify its raw products as being hazardous to health.” That’s like the tuna industry arguing there’s no need to label cans of tuna with mercury levels because you can also get exposed eating a thermometer.

The DC Circuit Court of Appeals upheld the meat industry position, arguing you can allow potentially deadly Salmonella in meat because “American housewives…normally are not ignorant or stupid and their methods of preparing and cooking of food do not ordinarily result in salmonellosis.” What?! That’s like saying, oh, minivans don’t need seatbelts because soccer moms don’t ordinarily crash into things.