Tuesday, December 16, 2014

For the tl;dr crowd: if you're not already familiar with Nina Teicholz's The Big Fat Surprise, I just came across this review of her book in The BMJ (formerly the British Medical Journal) that I thought you'd all find of interest:

"From low fat to Atkins and beyond, diets that are based on poor nutrition science are a type of global, uncontrolled experiment that may lead to bad outcomes, concludes Richard Smith"

Note that in the summary they include Atkins in the list of "diets that are based on poor nutrition science", despite the review author's conclusion to the contrary.

"By far the best of the books I’ve read to write this article is Nina Teicholz’s The Big Fat Surprise, whose subtitle is “Why butter, meat, and cheese belong in a healthy diet.”3 The title, the subtitle, and the cover of the book are all demeaning, but the forensic demolition of the hypothesis that saturated fat is the cause of cardiovascular disease is impressive. Indeed, the book is deeply disturbing in showing how overenthusiastic scientists, poor science, massive conflicts of interest, and politically driven policy makers can make deeply damaging mistakes. Over 40 years I’ve come to recognise what I might have known from the beginning that science is a human activity with the error, self deception, grandiosity, bias, self interest, cruelty, fraud, and theft that is inherent in all human activities (together with some saintliness), but this book shook me...."

"Reading these books and consulting some of the original studies has been a sobering experience. The successful attempt to reduce fat in the diet of Americans and others around the world has been a global, uncontrolled experiment, which like all experiments may well have led to bad outcomes. What’s more, it has initiated a further set of uncontrolled global experiments that are continuing. Teicholz has done a remarkable job in analysing how weak science, strong personalities, vested interests, and political expediency have initiated this series of experiments.... It’s surely time for better science and for humility among experts."

Thursday, November 20, 2014

In this study, to our best knowledge, we report for the first time that 13-S-HODE, a linoleic acid metabolite, causes mitochondrial dysfunction and bronchial epithelial injury. Although much is known about leukotrienes in asthma12, much less attention has been given to other lipid metabolites. We studied 13-S-HODE because of increasing evidence of the role of mitochondrial dysfunction in asthma7, 8, 9 and high concentrations of 13-S-HODE are found in reticulocytes during degeneration of mitochondria of reticulocytes16. On the other hand, mitochondrial dysfunction seems to be crucial in the genesis of epithelial injury and asthma pathogenesis in mice7, 8, 9. Similarly dysfunctional mitochondria have been found in human asthmatic bronchial epithelia30. Transfer of mitochondria from stem cells to alveolar epithelial cells reversed acute lung injury in sepsis, indicating the crucial role that mitochondrial health [plays] in lung diseases17. In this context, understanding the effects of 13-S-HODE on airway epithelium is essential because we found its levels to be high in the airway secretions and extracellular fluids. Also it is practically difficult to reduce the levels of 13-S-hydroxyoctadecadienoic acid (13-S-HODE) as there are many sources for its synthesis13, 14, 15.

So a metabolite damages your mitochondria, causing asthma, among other things. I'd not be surprised.

"The fatty acid composition of erythrocyte and liver mitochondrial lipids was readily and drastically altered by varying the fatty acid content of the diet."

That's in rats, but the same thing has been found in humans, in all the tissues I've read reports about. To quote from the original post: "...the risk for uncontrolled asthma increased with a higher n-6:n-3 PUFA ratio."

So that suggests that a low-omega-6 diet would be useful in inhibiting or preventing asthma.
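The n-6:n-3 PUFA ratio cited above is just total omega-6 intake divided by total omega-3 intake. A minimal sketch of that arithmetic, using entirely hypothetical per-day gram amounts for illustration:

```python
# Hypothetical daily intakes in grams; the fatty acids and amounts below
# are illustrative assumptions, not measured values from any study.
N6 = {"linoleic acid": 12.0, "arachidonic acid": 0.2}
N3 = {"ALA": 1.2, "EPA": 0.3, "DHA": 0.5}

def n6_n3_ratio(n6, n3):
    """Return total omega-6 grams divided by total omega-3 grams."""
    return sum(n6.values()) / sum(n3.values())

print(round(n6_n3_ratio(N6, N3), 1))  # 12.2 / 2.0 = 6.1
```

Lowering the linoleic-acid number (the dominant omega-6 source in modern diets) is what moves this ratio the most, which is the point of a low-omega-6 diet.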

I'd always assumed it was wheat, as wheat has been shown to be involved with exercise-induced asthma, along with other varieties of asthma. Perhaps they work in concert?

Friday, October 24, 2014

"I didn’t take any supplemental salt or electrolytes during the run. Further, the foods I consumed were generally low in sodium. None of my teammates took salt or electrolyte supplements.

While the sports-drink industry markets the importance of electrolytes, Tim Noakes has argued that the body contains enough salt and electrolytes to last for weeks. Further, he points out that in a hot environment, the salt content of sweat and urine drops by 90-plus percent. People see white, crusty sweat on their clothes after running because their bodies are ridding themselves of the excessive salt contained in the typical western diet (Noakes 2012). Ironically, people see the salt crusts and, worrying that they are running low on electrolytes, eat more salt—when they actually may have more than they need.

Monday, October 6, 2014

"iatrogenic /iat·ro·gen·ic/ (i-ă´tro-jen´ik) resulting from the activity of physicians; said of any adverse condition in a patient resulting from treatment by a physician or surgeon."

I was surprised to read this:

"In the end, you discovered that the Belgian nuns had unwittingly spread the virus. How did that happen?

"In their hospital they regularly gave pregnant women vitamin injections using unsterilised needles. By doing so, they infected many young women in Yambuku with the virus. We told the nuns about the terrible mistake they had made, but looking back I would say that we were much too careful in our choice of words. Clinics that failed to observe this and other rules of hygiene functioned as catalysts in all additional Ebola outbreaks. They drastically sped up the spread of the virus or made the spread possible in the first place. Even in the current Ebola outbreak in west Africa, hospitals unfortunately played this ignominious role in the beginning."

Thursday, September 11, 2014

"For the last decade, the cereal business has been declining... And the drop-off has accelerated lately, especially among those finicky millennials who tend to graze on healthy options...
"...Cereal sales have long been subject to dips brought on by food fads like the Atkins diet or bagel mania. And many cereals are neither gluten-free nor protein-rich, so they fail to resonate with the growing number of consumers who are gluten-intolerant or adherents of the so-called paleo diet.

"...“Additionally, there’s a small but very active and influential group of millennials who are focused on health and don’t like processed food. Guess what, cereal companies? They want to kill you.”"

Hey, it's nothing personal, I'm sure.

"...It also found a way to capitalize on Chex, which had produced consistent sales but little growth since General Mills acquired it in the 1990s. “We had tried everything to move the needle: new advertising, new flavors — and then we marketed it as gluten-free, and it took off,” Mr. Murphy said.

"...MOM, which is owned by the descendants of its founder, John Campbell, also has had success with a relatively new brand, Mom’s Best cereals, “because of an absence of negatives,” Mr. Reppenhagen said. “No hydrogenated oils, no preservatives, no artificial flavors and colors — we even use vegetable dyes in the packaging.”"

Adding protein is also big, apparently. Ick: I wonder what kind of protein they're adding?

Monday, September 8, 2014

"In a recent paper in the Proceedings of the National Academy of Sciences USA, Qing-Yuan Sun of the Chinese Academy of Sciences and colleagues used a drug to make male mice prediabetic. And unexpectedly, their offspring also became prediabetic as adults.

Was this due to the fathers' behavior around their offspring? No—the males were there solely for mating. Instead, becoming prediabetic caused epigenetic silencing of some genes in the pancreas of these males (an organ centrally involved in diabetes). And the same epigenetic changes occurred in their sperm as well, also affecting their offspring's pancreases.

This applies to behavior too, as reported in a recent paper in Nature Neuroscience by Isabelle Mansuy of the University of Zurich and colleagues. Prior work showed that if you stressed young male mice, as adults, they differed from control mice in how readily they explored a new environment and how quickly they gave up trying to cope with a challenging task (findings pertinent to understanding anxiety and depression).

Critically, the offspring of those males showed the same behaviors. Again, Dad wasn't doing any parenting. Instead, the stressful upbringing caused epigenetic changes (due to those micro-RNAs) in sperm. In a tour de force, the authors injected micro-RNAs from sperm of stressed males into fertilized eggs—passing on the behavioral trait. Thus, early life stress changed adult behavior of male mice, who passed it on to their offspring via epigenetic changes in their sperm."

Summing up the prior state of knowledge about sperm and epigenetics, the author states: "Naturally, this is turning out to be wrong."

Naturally. The most remarkable thing about Science, I think, is how it lays bare the arrogance of man.

"While it has been known that certain injuries were directly related to an inadequate nutrition of the mother during the formative period of the child, my investigations are revealing evidence that the problem goes back still further to defects in the germ plasms as contributed by the two parents. These injuries, therefore, are related directly to the physical condition of one or of both of these individuals prior to the time that conception took place.

"A very important phase of my investigations has been the obtaining of information from these various primitive racial groups indicating that they were conscious that such injuries would occur if the parents were not in excellent physical condition and nourishment.

"...In the light of these data important new emphasis is placed on the quality of the germ cells of the two parents as well as on the environment provided by the mother. The new evidence indicates that the paternal contribution may be an injured product and that the responsibility for defective germ cells may have to be about equally divided between the father and mother.

"...We are apparently dealing here with a factor which, while it may be related to the germ plasm and to the prenatal growth period, clearly involves other forces than those that are at work in the case of hereditary defectives. Since these changes have to do directly with disturbances in growth of the head, particularly of the face and of the dental arches, we are concerned with such evidence as may be available as to the nature of the forces that readily affect the anatomy of the skull.

"The general architecture of the body is apparently determined primarily by the health of the two germ cells at the time of their union. This architectural design may not be completely fulfilled due to interference with nutritive processes both before and after birth."

Nice to see Price confirmed.

And if there's a way to turn these triggers on, there's a way to turn them off.

"Because germline mutations are the source of all evolutionary adaptations and heritable diseases, characterizing their properties and the rate at which they arise across individuals is of fundamental importance for human genetics.

After decades during which estimates were based on indirect approaches, notably on inferences from evolutionary patterns, it is now feasible to count de novo mutations in transmissions from parents to offspring.

Surprisingly, this direct approach yields a mutation rate that is twofold lower than previous estimates, calling into question our understanding of the chronology of human evolution and raising the possibility that mutation rates have evolved relatively rapidly...."

Surprisingly? Direct measurement is best for a reason; any carpenter can tell you that!

But at any rate, I suspect that the epigenetic effect will prove to be much more important than currently thought, and will explain how a low rate of mutation can produce high rates of change: if you can turn genes on and off in response to the environment, you can adapt without needing to change genes, and in a more predictable fashion.

"#1: Paleo does not equal low-carb, and very low-carb/ketogenic diets are not our “default” nutritional state, as some have claimed."

I'm just going to follow up on this one point. And not get into the rest of his post, or Laura's post. I have issues with both of them, but all those issues stem from this one fundamental misconception. So I'm going to cover just one topic: was the Paleolithic Diet low-carb?

I'm not going to cover what people say is a paleolithic diet today, or what modern hunter-gatherers eat (there is one fundamental problem with that, but that's for later), just that one issue, because it's pretty easy to cover.

The Smithsonian Institution recently announced a massive study of Kennewick Man, an anomalous skeleton that appeared to pre-date the Indians and Eskimos who populated the Americas when my European ancestors first arrived. Initially they thought it was a modern skeleton, as it didn't have the physical attributes they associate with Indians, but one item distinguished it:

"The skull, while clearly old, did not look Native American. At first glance, Chatters thought it might belong to an early pioneer or trapper. But the teeth were cavity-free (signaling a diet low in sugar and starch) and worn down to the roots—a combination characteristic of prehistoric teeth.

(My emphasis) Yes, it turns out that being free of cavities is a hallmark of people who ate the paleolithic diet. That's zero cavities, btw, not "not many". I was told when visiting Harvard that the human relics we have from the paleolithic contain "zero" cavities, a statement which I found astounding. (More on that later.)

In fact, whether or not cavities are present (they're known as "caries" to anthropologists) is how one knows what the individual ate.

"The tooth decay is significant because it shows how starchy foods and the agriculture that created them were a part of Ötzi's regular diet. The team attributes his cavities to eating more breads and cereals."

Unlike Kennewick man, Ötzi's soft tissue was preserved, so they were able to do a DNA analysis on the contents of his gut, and, sure enough:

"According to the DNA reconstruction, the man's last meal was composed of red deer (Cervus elaphus) meat, and, possibly, cereals; this meal had been preceded by another one based on ibex (Capra ibex), different species of dicots, and cereals."

That, in a nutshell, is the difference between a paleolithic diet and a neolithic diet: cavities, and starchy foods. Starchy foods are tough to track through the archaeological records, but happily for us, cavities are easy to track.

"When hunter-gatherers started adding grains and starches to their diet, it brought about the "age of cavities." At least that's what a lot of people thought. But it turns out that even before agriculture, what hunter-gatherers ate could rot their teeth.
"The evidence comes from a cave in Morocco — the Cave of the Pigeons, it's called — where ancient people lived and died between 12,000 and 15,000 years ago. These were hunters and gatherers; they didn't grow stuff. And what was astonishing to scientists who've studied the cave people was the condition of their teeth.

""Basically, nearly everybody in the population had caries," or tooth decay, says Louise Humphrey, a paleo-anthropologist with the Natural History Museum in London.

"Humphrey says 94 percent of the more than 50 people from the cave she studied had serious tooth decay. "I was quite surprised by that," says Humphrey. "I haven't seen that extent of caries in other ancient populations.""

That's unique, in other words. So what did they eat?

"But apparently, these ancient people had a thing for acorns.

""Acorns," says Humphrey, "are high in carbohydrates. They also have quite a sticky texture. So they would have adhered easily to the teeth."

"Yes, these people did eat meat. And snails, apparently, whose shells littered the cave. But they also ate a lot of acorns, judging by the debris they left behind.

So these are your teeth on a non-low-carb Paleo diet. (See picture above.) Not a pretty sight.

One other researcher has the following insight on what caries prevalence tells us about ancient diet:

"...it is supposed that H. erectus, a hunter-gatherer, obtained approximately 50% of its calories from carbohydrates (Wrangham, 2009) and under the hypothesis of cooking (that obviously included meat and vegetables), caries should have been present much earlier in the fossil record. However, caries appears clearly much later. So, the data on oral [pathology] does not support the idea of a cariogenic diet based on cooked [tuberous] vegetables from the earliest periods. Maybe, in the beginning, fire was employed only for cooking meat."

Most likely, of course, it's because they weren't eating much in the way of carbohydrates:

"Caselitz (1998) analyzed the historical evolution of caries in 518 human populations of Europe, Asia and America in a wide timeline from the Paleolithic to the present, confirming that during Paleolithic and Mesolithic periods, the hunter-gatherers had less caries and lesions progressed more slowly. Caries indices have increased gradually from Neolithic times, until they reach the high rates observed at the present.... This phenomenon, observed in North Africa, Near East, China and Europe has been attributed to the drastic change in the diet that means the introduction and spread of cereals in the entire antique world..."

Alright, but all this is some random blogger putting together a bunch of posts and drawing a conclusion that reinforces his prejudices. What good is that?

(I'll put my biases on display: I had massive cavities when I was a kid, I don't know how many. I had eight teeth pulled, including my wisdom teeth; what got me interested in the Paleo diet was when the afore-mentioned Stephan Guyenet convinced me that my tooth issues were solely the result of diet. I'd already gone largely sugar-free and had been cavity-free for about 20 years. So I was receptive to the idea that my tooth issues were the result of diet, and not genetics.)

In this book, he discusses a number of mismatch diseases, which are caused by a discrepancy between our environment and the environment we evolved to live in. One of the first mismatch diseases he discusses is cavities.

(Full disclosure: While Dr. Lieberman does not share all my opinions on the Paleo Diet, I need to note the following, from the Acknowledgements in that book: "Special thanks go to David Pilbeam, Carole Hooven, Alan Garber, and Tucker Goodrich, who read multiple chapters." I'm biased toward this book.)

"“Notice anything?” Dr. Lieberman asked. “Look at the teeth. They’re straight. And no cavities. His wisdom teeth came in just fine. Humans, like all animals, have evolved teeth that are well suited to their natural diet. An infected tooth can easily kill you, and there were no dentists in the Paleolithic.”

"Nearly one hundred thousand years before dentists and orthodontists, this hunter-gatherer had a strong, straight set of chompers. Skhul V challenged much of what I’d been taught about the history of human health.

"“Now, look,” Dr. Lieberman continued, “hunter-gatherers didn’t have perfect teeth. This guy has well-worn teeth, and he’s actually missing one due to an abscess. So don’t stop going to the dentist. But wait until I show you the skulls of early farmers—a lot of them would need to get fitted for dentures.”

"“So what’s the secret? Eat less sugar?” I asked.

"“Well, yes, but healthy teeth depend on a variety of factors,” Dr. Lieberman explained. “First, yes, it matters what you eat. Amylase in your saliva breaks down carbohydrate into sugar in your mouth. Bacteria feed on the sugar and produce acid that wears away the enamel on teeth, giving you cavities. We’ll see what happened to the early farmers who started eating a starchier diet.”

"To figure out what humans used to eat, teeth are a good place to start. Not only do teeth fossilize well, but they’re the first point of contact between the food we eat and our body, the first part of our internal digestive tract. And if our teeth aren’t well adapted to a particular food, it’s unlikely the rest of our digestive tract is. But whatever Skhul V was eating, his teeth seemed to be up to the challenge."

Professor Lieberman in his book is pretty succinct in his coverage of the topic of cavities:

"Unfortunately, humans have little natural defense against cavity-causing microbes other than saliva, presumably because we did not evolve to eat copious quantities of starchy, sugary foods. Cavities occur at low frequencies in apes, they are rare among hunter-gatherers, they started to become rampant following the origin of agriculture, and they spiked in the nineteenth and twentieth centuries. Today cavities afflict nearly 2.5 billion people worldwide."

"...if we really wanted to prevent them, we would have to reduce our consumption of sugar and starch drastically. However, ever since farming, most of the world’s population has been dependent on cereals and grains for most of their calories, making a truly cavity-preventing diet impossible for all but a few. In effect, cavities are the price we pay for cheap calories. Like most parents, I let my daughter eat cavity-causing foods, encouraged her to brush her teeth, and sent her to the dentist, knowing full well that she’d probably get a few cavities. I hope she forgives me."

To me, this seems to be a pretty clear indication that the Paleolithic diet as actually eaten in the Paleolithic period did not include a large amount of carbohydrates; or, if it did, they were eaten infrequently enough so that the population of bacteria required to rot your teeth did not have the opportunity to arise in your mouth.

Keep in mind, other than the acorn-eaters, the Paleolithic cavity count was around zero.

I'd be very interested to hear another explanation for this particular phenomenon, but for me, it's pretty cut-and-dried. Our Paleolithic ancestors ate fewer carbohydrates than any other group currently existing, or there was some other unknown phenomenon that protected their teeth.

So what's the right amount of carbohydrates to eat on a Paleo diet? The amount that means you won't need to brush your teeth, ever. If you don't get any cavities, then you're eating the right amount. If you're getting cavities, you're eating too much. Based on the evidence above, I'm going to guess that's a low number.
So why not err on the safe side, as there's no good evidence not to?

We got by for quite a long time on that diet, and became human while eating it.

"On the other hand, there are some foods that inhibit the formation of caries. Diets rich in meat lead to low caries frequencies due to the fatty acids' antibacterial power and their capacity to reduce the adherence of plaque on dental surfaces. The intake of dairy products and fish (foods rich in calcium and casein that can increase urea concentration) modifies pH values and the quantity of salivary production, inhibiting the formation of dental plaque. Finally, a food rich in polyphenols (such as cacao, coffee and tea) inhibits the bacterial metabolism and stimulates the salivary secretion representing, thus, another mechanism of caries prevention (Bowen, 1994; Touger-Decker & Loveren, 2003)."

(From the Caries Overview.)

P.S. I see from the comments below that I never did get around to covering this point:

" I'm not going to cover what people say is a paleolithic diet today, or what modern hunter-gatherers eat (there is one fundamental problem with that, but that's for later)..."

The problem with using modern hunter-gatherers to infer what our Paleolithic ancestors ate is that their environments were totally different. Paleolithic man was the apex predator on the planet, dominating the premier hunting grounds and sending many species into extinction through over-hunting. He traveled the world in search of game.

Modern hunter-gatherers live in fringe, largely less-productive environments. They're getting by on what's left, in other words. The primary environments for ruminants, our preferred food, have largely been given over to agriculture. See the decline of the bison in the United States. Modern hunter-gatherers eat a lot of tubers but mostly talk about eating meat, because meat is what they really want but often can't get. Tubers are a "fall-back" food: what you eat when your primary foodstuff is not available.

Thursday, July 31, 2014

"...From this database, the researchers chose the records of 55,137 healthy men and women ages 18 to 100 who had visited the clinic at least 15 years before the start of the study. Of this group, 24 percent identified themselves as runners, although their typical mileage and pace varied widely.

Running towards a longer life.

"The researchers then checked death records for these adults. In the intervening 15 or so years, almost 3,500 had died, many from heart disease.
"But the runners were much less susceptible than the nonrunners. The runners’ risk of dying from any cause was 30 percent lower than that for the nonrunners, and their risk of dying from heart disease was 45 percent lower than for nonrunners, even when the researchers adjusted for being overweight or for smoking (although not many of the runners smoked). And even overweight smokers who ran were less likely to die prematurely than people who did not run, whatever their weight or smoking habits.

"As a group, runners gained about three extra years of life compared with those adults who never ran...." (Top link is to the NY Times account, here's the actual study.)

This is consistent with the Stanford Running Study, which I blogged about several years ago. They found that even people who ran for a period and then stopped enjoyed the same benefits, for the rest of their lives:

"Remarkably, these benefits were about the same no matter how much or little people ran. Those who hit the paths for 150 minutes or more a week, or who were particularly speedy, clipping off six-minute miles or better, lived longer than those who didn’t run. But they didn’t live significantly longer than those who ran the least, including people running as little as five or 10 minutes a day at a leisurely pace of 10 minutes a mile or slower."

"The study did not directly examine how and why running affected the risk of premature death, he said, or whether running was the only exercise that provided such benefits. The researchers did find that in general, runners had less risk of dying than people who engaged in more moderate activities such as walking.

"But “there’s not necessarily something magical about running, per se,” Dr. Church said. Instead, it’s likely that exercise intensity is the key to improving longevity, he said, adding, “Running just happens to be the most convenient way for most people to exercise intensely.”

"Anyone who has never run in the past or has health issues should, of course, consult a doctor before starting a running program, Dr. Church said. And if, after trying for a solid five minutes, you’re just not enjoying running, switch activities, he added. Jump rope. Vigorously pedal a stationary bike. Or choose any other strenuous activity. Five minutes of taxing effort might add years to your life."

"Differences between runners and controls for all outcomes continued to diverge after 21 years of follow-up. Interestingly, in our analysis of 21 years of data, aerobic exercise was no longer a statistically significant independent predictor of mortality. Sixty percent of deaths occurred during the 8-year period between our last report and the present analysis, and it is possible that with this additional mortality data, vigorous exercise has become more collinear with running and no longer is identified as an independent predictor of death. Further observation of this cohort, as the remaining 440 participants reach the biological limits of life expectancy, may be required to further clarify the independent role of nonrunning, vigorous exercise."

I'll keep running. And I'll keep doing other stuff, for the fun of it.

Wednesday, July 23, 2014

"Common sense says that talking on a cellphone while driving is not a particularly safe thing to do. But recent studies have found banning cellphone use while behind the wheel is not leading to a decrease in accidents."

"The most recent study, published this summer in the journal Transportation Research Part A: Policy and Practice, analyzed California's cellphone ban for drivers in 2008. Researchers found the number of accidents only dropped from 66.7 per day to 65.2 per day statewide, a statistically minor decline. The results were mirrored in many of the state's major cities, including San Francisco, though Los Angeles did experience a slight decrease in accidents. Researchers looked at the six-month periods before and after the ban went into place on July 1, 2008.

"We went in there expecting to see something," Daniel Kaffine, one of the study's authors, told Autoblog. "[But] it was pretty clear to us that there was no compelling evidence of a decrease in accidents."

Hardly surprising.

"Though offsetting for safety advocates, Kaffine's research is in line with other findings. The Highway Loss Data Institute, the research arm of the Insurance Institute for Highway Safety, studied insurance claims rates in 2009 and 2010 studies, and found no link that bans helped decrease crashes.

Thursday, July 10, 2014

I should just farm this series out to Derek Lowe, who is after all a practicing scientist.
But here we go, from his blog:

"If you go back and look at the instances where observational effects in nutritional studies have been tested by randomized, controlled trials, the track record is not good. In fact, it's so horrendous that the authors state baldly that 'There is now enough evidence to say what many have long thought: that any claim coming from an observational study is most likely to be wrong.'"

Science has spoken: we can now stop paying attention to these nonsensical studies.
But do read the whole thing, as there is also a recommendation on how to fix it. I think it's unlikely to ever be fixed, however, as the real "product" of this process is employment for academics, not actionable science.
If they employed real process control and measurement, someone might figure out that the whole thing is a waste of time and money. And then those academics would be out of work.

Friday, June 27, 2014

"EVOLUTION IN ACTION, “Convicted criminal offenders had more children than individuals never convicted of a criminal offense. Criminal offenders also had more reproductive partners, were less often married, more likely to get remarried if ever married, and had more often contracted a sexually transmitted disease than non-offenders. Importantly, the increased reproductive success of criminals was explained by a fertility increase from having children with several different partners. We conclude that criminality appears to be adaptive in a contemporary industrialized country, and that this association can be explained by antisocial behavior being part of an adaptive alternative reproductive strategy.”"

This is hardly new, however. As Napoleon Chagnon notes in his book Noble Savages, about a primitive Amazonian tribe:

"The more interesting finding has to do with the comparative reproductive success of unokais and non-unokais. It should be intuitively clear that if unokais are more successful at acquiring wives, they are likely to have more children as well....

...Unokais have three times as many offspring as non-unokais."

Unokais are "killers of men". Chagnon observes:

"My article also raised the possibility that human violence had something to do with the evolved nature of Homo sapiens."

Wednesday, June 25, 2014

"Barefoot running shoes and shoes with extra cushioning seek to protect runners—but despite all the new technology, running injuries are no less common than they were 30 years ago."

Most runners are still running in the same sort of shoes that they've been wearing for the last 30 years. It shouldn't be surprising that we're seeing the same outcome, if we assume that running shoes can affect injury rates.

"In the past 30 years running has changed from something done by trained runners who competed for sport, to an activity that is enjoyed by the masses."

"The Clary Grove boys, the Island Grove boys, the Sangamon River boys and the Sand Ridge boys, each designated by the part of the country from which they came, would gather there to indulge in horse racing, foot racing, wrestling, jumping, ball playing and shooting at a mark."

So it's a typically weak article on the topic, but he at least gets points for quoting Jay Dicharry and Dan Lieberman, and mentioning Born to Run.

"Confronted with the baffling array of running shoes, the prevailing wisdom seems to be to pick a shoe that fits your running style, not to hope a shoe will change you."

"...Over the next 40 years, we have seen the height as well as the cushion gradually increase. These developments inadvertently made runners adopt a “heel to toe” gait or “heel strike” when running. Bowerman and W.E. Harris authored a primer entitled Jogging: A [Medically Approved] Physical Fitness Program for All Ages in 1967. In this very popular book, they noted the most efficient way to run should be landing or striking on the heel first. The authors specifically stated that forefoot striking is incorrect and not the proper way to land.

"Bowerman and Harris had no scientific basis for this explanation. Several years later, they went on to create a running shoe that contained a cushioned heel...."

Friday, June 13, 2014

First we had Born to Run, then Eat & Run, and now we have Run or Die. Kilian is not only a famed runner, but a famed skier. This review is by another famed skier, but the book is all about running. Sounds interesting.

Tuesday, June 10, 2014

Makes sense. We're uniquely evolved to throw, and I guess that includes punches.

"...The researchers found that bones that suffer the highest rates of fractures in fights are the same parts of the skull that exhibited the greatest increase in sturdiness during the evolution of our early human relatives. These bones are also the parts of the skull that show the greatest difference between males and females in both australopiths and humans today.
""In other words," Carrier said, "male and female faces are different because the parts of the skull that break in fights are bigger in males. Importantly, these facial features appear in the fossil record at approximately the same time that our ancestors evolved hand proportions that allow the formation of a fist."..."

One of the scientists involved, David Carrier, did the original research that led to the Born to Run hypothesis.

Monday, May 19, 2014

"So does Tamiflu work? From the Cochrane analysis – fully public – Tamiflu does not reduce the number of hospitalisations. There wasn't enough data to see if it reduces the number of deaths. It does reduce the number of self-reported, unverified cases of pneumonia, but when you look at the five trials with a detailed diagnostic form for pneumonia, there is no significant benefit. It might help prevent flu symptoms, but not asymptomatic spread, and the evidence here is mixed. It will take a few hours off the duration of your flu symptoms."

The difference in duration could well be just a statistical artifact. Hardly worth the effort, at any rate.

Wednesday, May 14, 2014

So I got sick. For the first time in so long that I have trouble remembering the last time... That's been one of the biggest benefits of the Paleo diet, as the flu used to visit like clockwork.

This one was a stomach bug—it's apparently going around, from the symptoms I'd say it's viral.

The immediate sign of it was that the mere thought of food was enough to make me nauseous and dizzy. At which point I decided that instead of getting lunch I would leave the office and go home.

Later, in the evening, my daughter broke out some ice cream. That was appealing!

After eating a few bites I came down with a fever. Literally, in the middle of eating the bowl of ice cream I started getting chills. 99.5, not huge, but I hadn't had a fever prior to that moment.

One of the ways your body disposes of excess carbs is by increasing the metabolic rate. Which is basically what a fever is. And the fact that your body can anticipate the effect of glucose on the metabolism is, as they say, well-established.

So was my two-hour fever after eating some ice cream my body's way of getting rid of the glucose, which it recognized was counter-productive?

I didn't eat anything else until the following evening, and nothing that was high-carb, and the fever didn't come back. But I also didn't try eating more ice cream (which had lost its appeal after my body reacted to it; I wanted to get better, not conduct an experiment).

Of course the doctors will tell you that you should eat when you have a fever, because your body needs calories. Oy vey... I'm going to stick with grandma's advice: you've got plenty of calories, losing a few pounds (I lost 5) won't kill you, and you should listen to your body. Nausea's about the clearest sign you're going to get...

OK, well that's a nice hypothesis, but it's based on an anecdote. Any support in the scientific literature?

"Specifically, they could suppress viral infection of cells by dismantling the V-ATPase through the lowering of glucose levels. In addition, they could inhibit infection by treating cells with chemical inhibitors of glycolysis, the initial pathway of glucose catabolism. Conversely, influenza viral infection of cells could be increased by giving cells more glucose than normal, the researchers report in the journal Virology.

"The ease with which the researchers could dial viral infection down by controlling glucose levels and thus V-ATPase activity suggested a new strategy for throttling influenza viral infection. "Taken together, we propose that altering glucose metabolism may be a potential new approach to inhibit influenza viral infection," say Adamson and Kohio."

A "new approach". Yeah, listen to grandma, that's radical. Now it's entirely possible that different viruses work in different ways, but it does lend some support to the "starve a fever" argument, as starvation (aka ketogenesis) is the easiest way to suppress glucose metabolism!

I drank caffeine, despite this: "Caffeine enhances dehydration." That advice is just not correct, unless you're not used to drinking caffeine. I drank a little vodka last night to settle my stomach, despite this: "Avoid caffeine and alcohol." I have no idea why that worked, but it certainly didn't make things worse, and it helped me sleep. Of course an alcoholic hot toddy is another old grandmother's cure. (I'd make mine minus the sugar, of course.)

The flu did not progress to a more serious case. After a few years of checking doctors' recommendations on many topics, I've learned you're better off ignoring them on a lot of matters, unless you have a traumatic injury. Especially when they've not read the literature, which happens more often than you would expect.

P.S. Apparently the relationship between glucose metabolism and influenza has been known since the 1950s:

"The relatively low toxicity in both experimental animals2 and humans3 of the potent glucose antimetabolite 2-deoxy-D-glucose suggested the feasibility of employing this compound in in vivo influenza virus infection. The present studies, undertaken in the intact chick embryo, demonstrate that the synthesis of influenza virus is markedly inhibited by 2-deoxy-D-glucose. Ancillary studies with in vitro systems show this inhibition to be reversible with glucose and therefore not related to permanent host cell damage."

"The extent of viral replication in the lungs was proportional to blood glucose levels in the mice at the time of infection, and the enhanced susceptibility of diabetic mice was reversed with insulin...."

Although (as I cautioned about above) it doesn't seem to work for all strains:

"...Growth of A/HKx31 (H3N2) virus was also enhanced in diabetic mice, whereas the highly virulent strain A/PR/8/34 (H1N1) showed no difference in virus yields in diabetic and nondiabetic mice..."

So it's worth a shot, as it seems the worst thing that's going to happen is nothing. There don't seem to be any studies done on a ketogenic diet and influenza, however. That would be interesting.

Monday, May 12, 2014

"Since 1978 when Dr. Gabe Mirkin coined the term RICE, Rest, Ice, Compression, and Elevation have been the gold standard for treating athletic injuries. But now the ice age is melting, and a series of studies that show that injury treatment with cold therapy and total rest may actually delay healing has even Dr. Mirkin changing his mind.

"...A study published in The Journal of Strength and Conditioning Research examined the influence of icing on muscle damage. Data from the study did show that icing delays recovery and should not be the first choice of treatment for injuries. After icing there was an immediate increase in swelling. Indicators of muscle damage increased after application of ice.

"...And research published in The American Journal of Sports Medicine in June, 2013, said that although icing an injury relieved swelling it did not make recovery from muscle damage quicker. If the treatment reduces inflammation it delays healing. This includes the use of anti-inflammatory pain relievers like ibuprofen."

I said in my prior post:

"Now, you might find this hard to believe, but devising a treatment and then never bothering to test it in a scientific fashion is par for the course in the medical profession, as is continuing to use the treatment after it's been shown to have no supporting science behind it and to be of questionable efficacy. People would rather "do something", than do nothing, even if doing nothing is the correct course of action."

Dr. Mirkin obliges by confirming that part of my post as well:

"Mirkin says it is okay to apply ice for pain relief immediately after the injury occurs, but for short periods only. He suggests icing for 10 minutes, removing the ice for 20 minutes, and repeating the process once or twice, but stresses that there is no reason to continue icing more than six hours after injury...."

So it worsens the injury, but you should do it for up to six hours? Ridiculous.

P.S. Dr. Mirkin has a post on his blog that has enough in common with the news article linked to above that I suspect the article is simply a re-purposing of his post. (If you wonder where reporters get their news ideas...) The first comment on that news story is, "Well written and well done. Gabe Mirkin"

But at any rate, he has a nice additional comment that's not included in the news article, and with which I fully agree:

Wednesday, April 30, 2014

"We found that exposure of mice and rats to male but not female experimenters produces pain inhibition. Male-related stimuli induced a robust physiological stress response that results in stress-induced analgesia. This effect could be replicated with T-shirts worn by men, bedding material from gonadally intact and unfamiliar male mammals, and presentation of compounds secreted from the human axilla. Experimenter sex can thus affect apparent baseline responses in behavioral testing."

In my twenties I got really sick; lying in bed for 5 days, bleeding from the lower part of my digestive tract: not pretty. . . Delirious days later and ten pounds lighter and I was recovered, except for one problem: I had diarrhea for the subsequent 14 years. . . . Two years ago [2008] I passed out on the toilet on a ski weekend. The emergency room at Bennington Hospital [Vermont] told me it was a stomach flu.
Four weeks later I got cramps at work. I had to lie on the floor until it passed. Then I drove to my doctor’s office, and he told me that I had diverticulitis, and I had to go to the emergency room. I drove myself, and barely made it. I was in agony; I nearly passed out again while they were interviewing me to see if it was “serious”. . . . I had a perforated colon. . . . I spent the next four days in the pre-operative ward, so if it got worse they could cut me open immediately. I lost 10 pounds. Then I started bleeding, and I realized these were all the same symptoms that I had had 14 years before. My blood pressure got so low that the automated blood-pressure machine wouldn’t work . . .
I mentioned to all three of the doctors I saw that I had had constant diarrhea for the last 14 years, since the first attack, and they shrugged. They told me to eat more fiber, and whole wheat, even though that was what I had been eating for the last 20 years. So I avoided surgery, started eating salad with salad dressing (containing industrial seed oils) and lots of whole wheat. . . . But the more salad and whole wheat I ate, the worse it got. I couldn’t understand why. Finally had to have eight inches of my colon removed. The diarrhea continued, so obviously the cause remained.

Then something happened that, before blogging, wasn’t possible:

Someone sent me a post that Stephen Guyenet did about how dental problems were pretty much all due to diet, not genetics, as I’d been told. As someone who’d had a ton of cavities, and 8 teeth pulled, and was determined to spare his daughters the same fate, I found this of interest.
I started reading the blog. 6 months later, I decided to stop eating seed oils, which eliminated my carb cravings, hence no wheat. Two days later, [unexpectedly] my diarrhea stopped. A good bit of trial and error, some accidental, ensued. [I learned that both] wheat and seed oils cause distress, but different types. The two combined can cause me to pass out. If I eat wheat by accident, then eating saturated animal fats (like cream) causes things to settle down.

After 16 years my symptoms are now completely under my control. . . . I read the ingredients on everything. I make a big mistake once every 6-9 months. [Other benefits:] I’m much more resistant to sunburn, for instance, and my vision improved a bit.

So his problems were due to (a) wheat and (b) too much omega-6. His doctors had no idea.
The Mayo Clinic recommends a “diverticulitis diet” of clear liquids and low-fiber foods. According to the Mayo Clinic, “mild cases of diverticulitis can be treated with rest, changes in your diet and antibiotics. But serious cases of diverticulitis may require surgery.” The Mayo Clinic, it appears, has no idea what causes diverticulitis.
Tuck added:

It really pisses me off when people dismiss this, because it really makes a difference. I had a colleague who was in the hospital for a colon resection for diverticulitis. When he heard my story, he had the hospital put him on a gluten-free diet. Four days later, instead of having surgery as scheduled, he checked out: cured. He’s symptom-free on a gluten-free diet to this day.

In this post a contract artist who calls himself Wolverine gives a long list of life-threatening medical errors that happened to him. I hope that he will eventually add dates so that the rate of error becomes clearer [more: all the errors happened within a 14-month period] but even without them the stories suggest that life-threatening errors are common. (As does the effectiveness of surgical checklists.) Medicine is a job where if you make a mistake only the customer suffers not you. Surely this is why the error rate is so high. Wolverine was operated on by a surgeon who, because of a fatal error, had lost his license to practice in California. He changed states, was hired again, and made the same error on Wolverine.
I learned about this from Tucker Goodrich, who has been corresponding with the author and told me something remarkable:

He’s eating a paleo with raw milk diet. The other transplant patients he knows are all eating the modern American diet and dying of infections; he’s been infection-free for two years.

"Hello, this is Seth’s sister, Amy, with the sad news that Seth died on Saturday, April 26, 2014. He collapsed while hiking near his home in Berkeley, CA. He had asked that any memorial gifts be made to Amnesty International. Thank you to all for following and sharing Seth’s work."

I've followed Seth's blog for quite a while, and was lucky enough to have met him at the Ancestral Health Symposium in 2012. He will be missed.

"These results made me stop believing saturated fat is bad. The results were very clear. Sleep is central to health. Something that improves your sleep is likely to improve your health. My experiments were far from perfect but I found them more convincing than any of the evidence used to claim saturated fat is bad. Almost all that evidence came from surveys that compared people who ate more saturated fat with people who ate less. The two groups always differed in other ways (e.g., income) besides fat intake."

Seth was the archetypal scientist: unafraid to follow the evidence to its logical conclusion.

Monday, April 7, 2014

One regularly hears that the initial weight loss on a low-carb diet is merely water: glycogen is said to be stored with water, so as the glycogen is depleted, the water stored with it is lost as well.

"Glycogen losses or gains are reported (2) to be associated with an additional three to four parts water, so that as much as 5 kg weight change might not be associated with any fat loss."
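The arithmetic behind that "5 kg" figure is easy to sketch. The store sizes below are textbook ballpark numbers I'm assuming (roughly 400-500 g of glycogen in muscle and ~100 g in the liver for an adult), not figures from the quoted paper:

```python
# Back-of-the-envelope check of the "as much as 5 kg" claim.
# Assumed store sizes (textbook ballpark, not from the quoted study):
MUSCLE_GLYCOGEN_G = 500   # upper-end muscle glycogen store
LIVER_GLYCOGEN_G = 100    # liver glycogen store
WATER_RATIOS = (3, 4)     # "three to four parts water" per gram of glycogen

glycogen_kg = (MUSCLE_GLYCOGEN_G + LIVER_GLYCOGEN_G) / 1000  # 0.6 kg

for ratio in WATER_RATIOS:
    water_kg = glycogen_kg * ratio
    total_kg = glycogen_kg + water_kg
    print(f"ratio 1:{ratio} -> {glycogen_kg:.1f} kg glycogen "
          f"+ {water_kg:.1f} kg water = {total_kg:.1f} kg total")
```

At 1:4, depleting a full 600 g store would only shift about 3 kg; getting to 5 kg requires carb-loading stores well above baseline first, which is presumably the scenario the quote has in mind.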

"Muscle water content expressed as mumol H2O lost/g wet tissue weight or made relative to protein content showed no consistent relationship to the glycogen content. These data, therefore, do not support the commonly accepted muscle glycogen-to-water ratio of 1.0:2.7 (g:g). Further work is necessary to quantify the exact amount of water that is actually associated with the glycogen complex."
"Muscle glycogen storage and its relationship with water."

I can't find any evidence that anyone's tried to confirm this assumption in man.

An assumption does not equal knowledge. It's a guess, however educated. And a guess is not "science".

I do think, however, that body weight can go up with glycogen increase. But the water's not in the muscles, it's in the gut, buffering the sugar that is generally used to replenish muscle glycogen:

"What he discovered was that if he drank sugary water, his body reacted to it by flooding his stomach with water.

"...Two lights went on over my head on hearing this. The first is that this explains diabetics’ frequent thirst and urination: they’re thirsty because diabetics consume a lot of sugar, which requires their bodies to dump water into their guts. They need to replenish this water in their systems, and perhaps drinking the water helps in diluting the sugar in their guts."

And going on a low-carb diet allows the body to lose the water that's in the gut (doing you no good from a running performance perspective), not the muscles.

"...Several attempts have been made to estimate the amount of water stored in muscle in association with glycogen, but this is not easily quantified. Early data suggested a value of about 3 g of water for each gram of glycogen (Olsson & Saltin, 1970). However, a subsequent study by Sherman et al. (1982), in which the muscle glycogen content of rats was manipulated by exercise and diet, suggested that there was no consistent association between the amount of glycogen stored in a muscle and the muscle water content. Richter, Hansen, and Hansen (1988) found a decrease in muscle water concentration after muscle glycogen loading. Nygren, Karlsson, Norman, and Kaijser (2001) have also suggested that glycogen loading may alter the disposition of water molecules within the muscle. These apparently conflicting data may result from different amounts of water storage with different structural forms of glycogen and changes in the proportions of these different forms as the total amount of glycogen changes.

"Notwithstanding the uncertainties, there is good evidence of gross changes in body mass as a consequence of diet and exercise manipulations designed to induce alterations in glycogen storage in humans. Any major change in the amount of glycogen stored in muscle will result in a change in body mass, with the major part of the mass change being a consequence of the storage of the associated water (Olsson & Saltin, 1970)...."

The studies they cite are the studies linked to above. They're not particularly logical with that last statement, however, as there's still no evidence for the proposition that glycogen is stored with water.

Thursday, March 27, 2014

Fruits and veggies, fermented or otherwise, aren’t the only source of prebiotics in your diet. Eat a whole sardine and some of the ligaments, tendons, bones, and cartilage will surely escape digestion to reach the distal intestine where they will be fermented by the resident microbes.

Read the whole thing, but this explains why populations that don't eat much or any plant fiber, like the Maasai warriors or Eskimos of yore, do perfectly fine.

Monday, March 24, 2014

"A certain section of medical opinion, in late years, has succumbed to the messianic delusion. Its spokesmen are not content to deal with the patients who come to them for advice; they conceive it to be their duty to force their advice upon everyone, including especially those who don't want it. That duty is purely imaginary. It is born of vanity, not of public spirit. The impulse behind it is not altruism, but a mere yearning to run things." — H.L. Mencken

In their role in society, medical doctors have replaced priests to a large extent.

Tuesday, March 4, 2014

...here's an outspoken interview with Sydney Brenner, who's never been the sort of person to keep his opinions bottled up inside him. Here, for example, are his views on graduate school in the US:

Today the Americans have developed a new culture in science based on the slavery of graduate students. Now graduate students of American institutions are afraid. He just performs. He’s got to perform. The post-doc is an indentured labourer. We now have labs that don’t work in the same way as the early labs where people were independent, where they could have their own ideas and could pursue them.

The most important thing today is for young people to take responsibility, to actually know how to formulate an idea and how to work on it. Not to buy into the so-called apprenticeship. I think you can only foster that by having sort of deviant studies. That is, you go on and do something really different. Then I think you will be able to foster it.

But today there is no way to do this without money. That’s the difficulty. In order to do science you have to have it supported. The supporters now, the bureaucrats of science, do not wish to take any risks. So in order to get it supported, they want to know from the start that it will work. This means you have to have preliminary information, which means that you are bound to follow the straight and narrow.

And:

Here are Brenner's mild, temperate views on the peer-review system and its intersection with academic publishing:

. . .I don’t believe in peer review because I think it’s very distorted and as I’ve said, it’s simply a regression to the mean.

I think peer review is hindering science. In fact, I think it has become a completely corrupt system. It’s corrupt in many ways, in that scientists and academics have handed over to the editors of these journals the ability to make judgment on science and scientists. There are universities in America, and I’ve heard from many committees, that we won’t consider people’s publications in low impact factor journals.

Now I mean, people are trying to do something, but I think it’s not publish or perish, it’s publish in the okay places [or perish]. And this has assembled a most ridiculous group of people. I wrote a column for many years in the nineties, in a journal called Current Biology. In one article, “Hard Cases”, I campaigned against this [culture] because I think it is not only bad, it’s corrupt. In other words it puts the judgment in the hands of people who really have no reason to exercise judgment at all. And that’s all been done in the aid of commerce, because they are now giant organisations making money out of it.

I don't find a lot to disagree with there, either.

The point of academic Science is to provide jobs for academics. Any "science" that occurs is a happy coincidence, in large part.

Thursday, January 30, 2014

"...Instead of resorting to running in “high heels” as he called them, [Golden] Harper “zero-dropped” his shoes by cutting the heel elevation out of them. “It was effortless,” Harper said. “I had shoes on but I ran naturally like I was barefoot! I wanted everyone to feel that feeling of being able to run natural.”

"Harper and some of the employees at the store started cutting shoes flat for people to try at their running store, and within months, more than 1,000 people had paid the local shoemaker $20-$60 to cut the heel elevation out of their shoes.

"Harper knew he had something great and set out to create and market what he had named Altra Zero Drop, stemming from the innovation of “AL-tered shoes” and his love for ul-TRA marathons.

"The shoes had a foot-shaped toe box for more room in the front, as well as a cushioned, zero-drop sole.

"However, after proposing the concept to several companies that showed no interest, he spoke to his cousin, Jeremy Howlett. “Let's just do it ourselves," Howlett told him...."

Saturday, January 18, 2014

My title is more appropriate than the one the New York Times offers: An Epidemic of Attention Deficit Disorder, since the point of their editorial is that the "epidemic" they're discussing doesn't actually exist, but has largely been manufactured by predatory drug companies and compliant or ignorant doctors:

"...A two-decade campaign by pharmaceutical companies promoting the pills to doctors, educators and parents was described by Alan Schwarz in The Times on Sunday. The tactics were brazen, often misleading and sometimes deceitful. Shire, an Irish company that makes Adderall and other A.D.H.D. medications, recently subsidized 50,000 copies of a comic book in which superheroes tell children that “Medicines may make it easier to pay attention and control your behavior!” Advertising on television and in popular magazines has sought to persuade mothers that Adderall can not only unleash a child’s innate intelligence but make the child more amenable to chores like taking out the garbage.

"The potential dangers should not be ignored. The drugs can lead to addiction, and, in rare cases, psychosis, suicidal thoughts and hallucinations, as well as anxiety, difficulty sleeping and loss of appetite. On Tuesday, the Food and Drug Administration warned that some A.D.H.D. medications, including Ritalin, Concerta, and Strattera, may, in rare instances, cause prolonged and sometimes painful erections known as priapism in males of any age, including children, teens and adults.

"So many medical professionals benefit from overprescribing that it is difficult to find a neutral source of information. Prominent doctors get paid by drug companies to deliver upbeat messages to their colleagues at forums where they typically exaggerate the effectiveness of the drugs and downplay their side effects. Organizations that advocate on behalf of patients often do so with money supplied by drug companies, including the makers of A.D.H.D. stimulants. Medical researchers paid by drug companies have published studies on the benefits of the drugs, and medical journals in a position to question their findings profit greatly from advertising of A.D.H.D. drugs.

"The F.D.A. has cited every major A.D.H.D. drug, including the stimulants Adderall, Concerta, Focalin and Vyvanse, for false and misleading advertising since 2000, some of them multiple times. The companies, when challenged, typically stop those misleading claims, but the overall impact appears marginal. The number of prescriptions for A.D.H.D. drugs for adults ages 20 to 39 nearly tripled between 2007 and 2012, and sales of stimulant medications in 2012 were more than five times higher than a decade earlier.

"Curbing the upsurge in diagnoses and unwarranted drug treatments will require more aggressive action by the F.D.A. and the Federal Trade Commission, which share duties in this area. It will also require that doctors and patients recognize that the pills have downsides and should not be prescribed or used routinely to alleviate every case of carelessness, poor grades in school or impulsive behavior."

Millions of children, mostly boys, are on these drugs for no good reason. The difference between this and the old snake-oil salesmen is that snake oil was ineffective, but had no side effects. It was a simple fraud. This is malpractice on an industrial scale.

Patients need to realize that First, do no harm is no longer a directive for the medical profession, and perjury, the breaking of an oath, is no longer considered a crime.

"I will apply, for the benefit of the sick, all measures [that] are required, avoiding those twin traps of overtreatment and therapeutic nihilism."

Thursday, January 16, 2014

"...Hoping to better understand what happens to an ultra-endurance athlete’s body, researchers at Stanford University and the University of California, Davis recently contacted more than 1,200 experienced ultra-marathon runners and asked them probing and almost impolite questions about the past and current states of their bones, hearts, blood pressures, prostates, breasts, skin, lungs, moods, bowels, eyes, waistlines, livers and many other body parts and systems. They also asked about their race histories, times, training regimens and any recent injuries and illnesses....

"...And there can be substantial, accruing benefits to covering those miles, says Dr. Eswar Krishnan, an assistant professor at Stanford and co-author of the new study. Over all, the ultra-runners in the study were absent from work less often than other American adults because of illness or injury, he said, and rarely felt compelled to see a physician, with almost half visiting a doctor only once in the past year, usually because of a running injury.

"Of course, the ultra-competitors may have “developed stoicism” from their many hours of training, Dr. Krishnan said, and ignored niggling ills that would keep the rest of us from work or send us hurrying to the doctor. But they also displayed a substantially reduced risk of developing many of the common diseases of modern life...."

"The present work provides an analysis of medical issues in a large cohort of ultramarathon runners. As expected, the work demonstrates that, with the exception of asthma and allergies, ultramarathon runners have fewer chronic medical conditions than the general population, tend to miss little time from work or school due to illness or injury, and make limited use of the medical care system."

So go run. I'll note that most of the injuries they describe could be reduced with a barefoot-style running form...

Thursday, January 2, 2014

"... When your bread and butter is randomized intensity, performed at near max or to exhaustion, you can’t just simply push beyond exhaustion to the next level. Once fitness gains flat line, no amount of pushing will create a new stimulus. You’re maxing out the intensity, and because you don’t believe in progressive, controlled, low-moderate and high intensity mixes, you’ve got nowhere to go. There’s no way to progressively overload and create new stimuli and adaptation....

"...But what about those “joggers” who don’t look fit, whom you may see out at the park? You know, the ones who may have gained a few pounds, yet can still crank out their 5 mile daily run. I often get pointed towards these people as evidence that running somehow makes you fat…

"The reality is that jogging syndrome describes the opposite side of the coin. It’s the person who gets caught running their same volume at the same moderate intensity every single day. When we do this never-ending cycle of same run or same workout each and every day, we get really efficient at doing that same run or workout. It’s no longer pushing us outside of homeostasis. We’ve nailed it and it’s a walk in the park.

"Of course not every run needs to challenge us, as we need to recover and then cement adaptations, but if we never challenge our norm, we will not adapt. So we get really efficient at running our 5 mile run at 8 minute pace every day, for example. Our body homes in on the most efficient way to run at that pace and distance, and that’s about it. Our “fitness” won’t progress, and those extra 400 calories burned a day might not be enough to stave off gaining a layer of fat. So to me, this presentation of the jogger as unfit is simply not true. He’s prepared for what he continuously does, jog 5 miles, but not much else. If that’s all he cares about, then that’s fine I guess, but he probably should include some variation....

"...The problem is we’ve got this polarized argument of long slow VS. super intense. When in reality, that is an argument no one is having, or should be having...."

When I started running seriously 12 years ago, I started by running a 5.1 mile loop in a local park. I did that same loop religiously, in the same direction, for the next year. (Yes, I can be OCD when I want to be. :)

The first time I did it, it took me 2.5 hours. It was agony, a total suffer-fest; I couldn't run up any of the hills. A year later, by the time I stopped doing that run regularly, I was doing it in 49 minutes.
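For the curious, that improvement works out to dropping from roughly a 29:25/mile pace to about 9:36/mile. A quick sketch of the arithmetic (the loop distance and times are from above; the helper function itself is just for illustration):

```python
# Convert total run time over the 5.1-mile loop into a per-mile pace.
loop_miles = 5.1

def pace_min_per_mile(total_minutes, miles):
    """Return (minutes, seconds) per mile for a run."""
    per_mile = total_minutes / miles
    minutes = int(per_mile)
    seconds = round((per_mile - minutes) * 60)
    return minutes, seconds

first_run = pace_min_per_mile(2.5 * 60, loop_miles)  # 2.5 hours -> (29, 25)
last_run = pace_min_per_mile(49, loop_miles)         # 49 minutes -> (9, 36)
print(first_run, last_run)
```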

But I always did that run the same way: I ran as hard as I could for the whole run. In the early runs, this was typically interrupted by an exercise-induced asthma attack at the start of the first hill. (Oddly, that was always it for the asthma for the rest of the run...)

What I was not getting was the training effect at longer distances that one needs to run longer races easily, just as Steve would have predicted. I also stopped improving: my 5k and half-marathon PRs were stuck at the times I set in my first races.