02 December 2012

Why you can all stop saying meat eating fueled evolution of larger brains right now

Hadza returning from hunt in Tanzania. Credit Andy Lederer.

In William Shakespeare's comedy Twelfth Night, Sir Andrew, who was worried that a joke may have been made at his expense, reasons out loud that maybe his diet had something to do with his lack of intelligence, saying, "But I am a great eater of beef, and I believe that does harm to my wit" (Act I, Scene III). Dialogue like that was how Shakespeare famously poked fun at what he considered "foolery" in his time; it was a common belief of the Elizabethan Age that eating too much meat made you a meat-head. Now, it appears the tables have turned. Vegetarians are getting a taste of similar medicine from comedians of our time.

On the November 15th episode of The Colbert Report, Stephen Colbert interviewed one of the world's foremost paleoanthropologists, Chris Stringer of the Natural History Museum, about his newly published book. During their conversation, Stringer summed up nicely why meat eating may have been the primary force behind a brain-gut tradeoff, in which a shrinking gut allowed more energy to go to the brain. Here is Stringer's explanation at about minute 18:30 in the episode:

Chris Stringer: "There's a thing called 'expensive tissue hypothesis'. And this says we evolved our large brains by changing our diets. Our ancestors had great big guts because they were vegetarian. They never had enough spare energy because their guts were using 20 percent of their energy; they never had enough spare energy to evolve a large brain. When we started eating meat, a much more concentrated sort of food, it freed up energy and we could start to run a bigger brain."

Stephen Colbert: "That's why vegetarianism seems so stupid to me."

There's little use challenging the evidence that meat, whether eaten raw or cooked, was "brain food" for our ancestors. The fossil record shows a clear correlation between the appearance of hominin meat eating some two million years ago -- as evidenced by stone tools, cutmarked bones, and changes in hominin tooth structure -- and the drastic increase in cranial size that continued until the agricultural revolution. Studies in genetics and in primatology add further support that meat helped our ancestors obtain the nutrients required for the development of all our extra neurons. In my last post, for example, I discuss Greg Wray's research on how changes in fat metabolism and neuronal signaling may have developed as a result of more meat in the diet and probably helped produce our fatter brains.

However, while meat eating may have played a role, the arguments often espoused by writers, "paleo" and Atkins enthusiasts, and meat-lovers of all kinds are frequently flawed from a nutritional standpoint. The problem is, it just doesn't make a lot of sense that meat eating could have fueled the evolution of our larger brains. It's more likely that carbs did.

Neurons run on glucose, not meat

Neurons, which use twice the energy of any other cell type in the body, run almost exclusively on glucose. They don't run on protein and fat.* Moreover, because neurons can't store glucose as glycogen the way other cells in the body do, they must receive glucose in constant supply. That's glucose that must be delivered from the bloodstream 24 hours a day, seven days a week, even while you're asleep. That's glucose for some 86 billion neurons, more than any other primate has; by comparison, gorillas have about 33 billion neurons and chimpanzees only 28 billion. That's glucose in amounts that could not possibly be supplied by any abundance of meat eating.
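To make the scale of that demand concrete, here is a back-of-envelope sketch. The ~20% brain share and the 2,000 kcal daily budget are illustrative assumptions for a resting adult, not figures taken from the studies discussed here:

```python
# Rough estimate of daily brain glucose demand (illustrative assumptions:
# brain uses ~20% of a ~2,000 kcal/day budget; glucose yields ~4 kcal/g).
daily_kcal = 2000
brain_share = 0.20
kcal_per_g_glucose = 4

brain_kcal = daily_kcal * brain_share        # energy the brain burns per day
glucose_g = brain_kcal / kcal_per_g_glucose  # grams of glucose to supply it

print(f"Brain energy budget: ~{brain_kcal:.0f} kcal/day")
print(f"Glucose required:    ~{glucose_g:.0f} g/day")
```

With these numbers, the brain alone needs on the order of 100 g of glucose every day, which gives a sense of why a constant supply matters.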

A human brain is ~3x larger than a chimp's.

To get an expert's standpoint on the topic, I wrote to Peter Ungar, distinguished professor and chair of anthropology at the University of Arkansas, who was recently named a fellow of the American Association for the Advancement of Science, to ask his take on the role of meat eating versus carbs in human brain evolution. He wrote back,

Even the staunchest meat advocates recognize that protein and fat cannot power the brain – and we lose much of our gluconeogenesis capabilities at weaning. The argument is that meat eating provided the calories needed to power other parts of the body, freeing available carbohydrates to focus on the brain… Even in that case, it’s carbs, not meat that powers the brain (even though meat facilitates the process).

So, let me repeat in my own way: why was meat important for the evolution of larger brains? Although meat does provide some valuable micronutrients and essential fats, there may not be anything incredibly special about meat nutritionally except that it freed up carbohydrate calories for feeding brains roughly three times larger than chimps' without resorting to gluconeogenesis (the synthesis of glucose).

According to Wray's evidence from regulatory genes, humans also developed double the glucose transporters in the brain compared to chimps. Chimps, on the other hand, have double the glucose transporters in their muscles. That translates to about four times more glucose going to our brains versus our muscles in comparison to chimps. When I asked Ungar what he thought regarding the role of other types of foods (e.g. tubers), as well as preparation techniques (e.g. cooking), in also freeing carbs for the brain, he responded,

I believe the hominin lifestyle is more about a broadened niche than meat per se… Lots of people live in lots of places because they can find something to nourish themselves in whatever environment they find themselves in. Western Australian aboriginals did quite well without lots of meat.

On Wray's research, Ungar also added,

I’m not an expert on fueling the brain, but it makes sense to me that we’re more efficient at getting glucose across (we need it). One could argue that this allows LESS carb intake for a given brain size… Since our brains are roughly 4x the size of Chimps, does that mean we need about the same number of carbs to power them?

As for evidence of increased starchy food intake – I think the amylase gene copies didn’t take off until relatively recently. Still, evidence of a broadened subsistence base in Homo could certainly have included starchy foods, meat, or anything else they could get their hands on.

I’m pretty conservative in terms of single-cause, magic bullet explanations without solid evidence from the fossils themselves… Maybe it’s my time chasing broad-diet primates in the rainforest, but I’m hesitant to invoke one food type to explain brain expansion.

Lastly, he wrote,

...the problem now is coming up with ways of testing those hypotheses (meat, cooking, flexibility, fish, etc). In the end, I woudn’t be surprised if there were no magic bullet… but after all the new and exciting findings coming from isotopes, microwear, etc. not much would surprise me re: the evolution of human diet.

So there you have it -- it's more appropriate to say that a "broadened niche" is what helped fuel the evolution of larger brains. Meat provided micronutrients, but more of it more often probably did nothing more than supply extra calories so that carbs could be spared to fuel our brains. It's certainly possible that cooking, shore-based foods, and other resources played the same role, too.

Cooking Was More Important Than Meat for Brains

In a recent live chat with science writer Ann Gibbons over at ScienceLive, Harvard biological anthropologist Richard Wrangham and comparative neuroanatomist Suzana Herculano-Houzel had a few interesting things to say on this topic while discussing their recent study published in Proceedings of the National Academy of Sciences. Their study showed that humans would have to spend more than nine hours a day eating raw foods to feed our hungry brains (at 2,000 kcal a day), whether or not raw meat was included.
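The shape of that argument can be sketched as a toy calculation: if raw food can only be ingested and chewed at some maximum rate, daily feeding hours scale with total energy need. The per-hour intake rate below is an illustrative assumption chosen to land near the study's figure, not a parameter from the PNAS paper itself:

```python
# Toy version of the raw-food feeding-time argument.
# Assumption (illustrative): ~215 kcal net gained per hour of raw-food feeding.
daily_kcal_needed = 2000
raw_kcal_per_hour = 215

feeding_hours = daily_kcal_needed / raw_kcal_per_hour
print(f"Feeding time on raw food: ~{feeding_hours:.1f} hours/day")
```

Under these assumptions, meeting a 2,000 kcal budget on raw food alone takes more than nine hours of feeding a day, which is the intuition behind the study's result.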

Cooking increases the net energy gained from food from 2 main routes. (1) It increases digestibility (the proportion of food that gets digested and absorbed). For instance cooking is estimated to increase the digestibility of starch in grains by around 30%. It also increases the digestibility of meat, by denaturing the protein. (2) It reduces the physiological costs of digesting our food - because cooking softens food, so it is easier to digest.
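Those two routes can be combined into a simple net-energy sketch. Only the ~30% digestibility boost comes from the passage above; the baseline digestibility and digestion-cost numbers are made-up illustrative values:

```python
# Net energy = gross energy * digestibility - cost of digestion.
# Illustrative comparison of raw vs cooked starch, per 100 kcal gross.
gross_kcal = 100
raw_digestibility = 0.60                          # assumed baseline (raw starch)
cooked_digestibility = raw_digestibility * 1.30   # ~30% boost from cooking
digestion_cost_raw = 10     # assumed kcal spent digesting (raw, tougher food)
digestion_cost_cooked = 6   # assumed kcal spent digesting (cooked, softer food)

net_raw = gross_kcal * raw_digestibility - digestion_cost_raw
net_cooked = gross_kcal * cooked_digestibility - digestion_cost_cooked

print(f"Net from raw:    {net_raw:.0f} kcal")
print(f"Net from cooked: {net_cooked:.0f} kcal")
```

Both routes push in the same direction: cooking raises the fraction absorbed and lowers the cost of digestion, so the net energy gained per mouthful goes up.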

A commenter, Fred, questioned the role of meat eating versus cooking in leading to larger brains, to which Wrangham responded,

The meat vs. veg question is totally unresolved! Recent hunter-gatherers ate least meat near the equator, about 35% of diet, rising to ~100% near the poles. But recent hunter-gatherers are not necessarily good models for our ancestors. They have technology (bows and arrows, and poisons) that made hunting easier; but there again, there may have been more animals in the past. Also, the truly great places for hunter-gatherers to live in may have been those where farmers took over in masses, such as the Nile delta, with terrific plant supplies. So the 'paleo-diet' idea that our ancestors were all heavy meat-eaters doesn't acknowledge the likely large variation in % meat / % plant. Anyway, nowadays vegetarians have about the same body mass index as meat-eaters. So meat doesn't seem to do much to affect energy gain. (But raw-dieters, whether vegetarian or meat-eaters, are much thinner on average than cookivores, whether vegetarian or meat-eaters.) Bottom line: meat is less important than cooking when it comes to energy gain.

Is Wrangham suggesting redemption for raw dieting and vegetarianism as healthy dietary approaches in a modern world? Some people might think so. Although, as a nutritionist, I wouldn't recommend either a raw or a vegetarian diet. After all, animal protein is valuable for maintaining or building muscle because of its higher-quality amino acid profile; specific animal fats (DHA and EPA) are valuable for cardiovascular and brain health; and animal foods generally provide necessary micronutrients, such as zinc, that are important for brain health.

Yet, the point here is that there was probably "no magic bullet" that led to the evolution of human brains, and that meat is not necessarily a brain food. Whatever the combination of factors -- cooking, a broadened niche, even food sharing -- that led to larger brains, carbs are what really fuel your neurons. That's not to say you should overconsume carbs, either; most of us already do too much of that. So there's no sense in using the evolution of larger brains as an argument for gorging on steak. Too much beef (and too little glucose), as The Bard would've believed, really might do "harm to your wit."

Update: I'm thrilled to share that this blog post was blogged about by Barbara J. King of NPR over at 13.7 -- check it out here. Enjoy!

*12-7 note: Many of you have pointed out that my statement "They don't run on protein and fat" is not entirely accurate. Amino acids can be used to synthesize glucose, although this is a metabolically costly process in terms of ATP. As Ungar said, gluconeogenesis is limited in humans. Also, it is possible for the brain to use products of fat catabolism called ketones as fuel (to spare muscle amino acids; in fact, muscle will use ketones first); however, ketones are only used by the brain during periods of prolonged starvation (or on a diet extremely low in calories with little to no carbohydrates) and may lead to adverse health consequences.

24 comments:

I have a question. You point out that carbs are more important than protein for running a brain, but is that also true for building a brain? What is most important for a developing fetus or small child?

"Glucose is the dominant oxidative fuel for brain, but studies have indicated that fatty acids are used by brain as well." In this paper, about 20% of brain energy is from lipids: http://www.jneurosci.org/content/23/13/5928.short

Lactate from muscles seems to be the first brain fuel for newborns before suckling: http://www.ncbi.nlm.nih.gov/pubmed/2998491

Ketones are used to make brain lipids: http://www.ncbi.nlm.nih.gov/pubmed/6487643

Another way in which primate brains secured more glucose is by no longer converting it to ascorbate. Glutamine and aspartate are amino acids that are easily converted to glucose. Much of the energy for the growing brain comes from dietary fat, which is why milk is full of SFAs: http://sphotos-d.ak.fbcdn.net/hphotos-ak-snc7/417019_513098665375563_1634613470_n.jpg

Galactose also contributes.

In humans, the first milk (colostrum), which is available for the first 5-7 days, is relatively high in protein: 2.3 g, compared with 0.9 g in mature milk.

Fats as a whole go from 2.9 g to 4.2 g, and lactose from 5.3 g to 7.0 g, as we switch from colostrum to mature milk.

(numbers in literature vary partly because human milk varies from woman to woman, within a feed & with time of day - but these are from current lit)

So big protein push at beginning of extrauterine life, then we switch to being fuelled more by carbs.

There has been some interesting research about protein deprivation in pregnancy causing negative effect on brain development but I think the link is with alterations to maternal lipid metabolism. LC PUFA's are critical for the developing brain.

regards, from an International Board Certified Lactation Consultant (& vegetarian)

Man, you got me all excited about maybe some new research or something, but this is nothing new.

I've read this "we need glucose to fuel the brain" argument before, and it's pretty easily refuted by the fact that, after a relatively short adaptation period during which the individual consumes little to no carbohydrate (e.g. during a period in which hunting is going well but there are no starches or fruits to be found), the brain reduces its need for glucose by quite a bit and replaces that requirement with ketones.

Under those keto-adaptive conditions, the brain is kept well fueled by the small amount that can be easily provided by gluconeogenesis, and the cells of the body become far more efficient over time at utilizing ketones.

I've read, or read of, a paper where brain glucose uptake actually increases in ketosis -- the fat-burning shift in muscles etc. means more of the blood glucose is available to the brain. Which would match a higher-fat diet fueling brain growth. However, other primates also eat high-fat once fibre is fermented -- but they need the bigger gut to do this. A combination of increasing cooking of carbs and sparing the gut by eating meat, fat, and nutrient-rich organs -- the two needn't be contradictory.

First of all, you say that neurons run exclusively on glucose, but according to Volek and Phinney (http://www.amazon.com/The-Art-Science-Carbohydrate-Living/dp/0983490708/ref=pd_bxgy_b_img_y), the brain uses glucose for only about 1/3 of its fuel in the keto-adapted state.

But the bigger mistake is inferring from the fact that the brain depends on glucose to the belief that the organism depends on eating carbohydrates. That doesn't follow, because gluconeogenesis is more than sufficient to provide the necessary blood glucose. You mention gluconeogenesis, but I didn't understand why you are discounting it as a mechanism for fueling the brain.

As far as I can tell, I can continue saying that meat-eating (and/or fire) probably fueled the evolution of larger brains.

There have been several comments suggesting that I've left out the role of gluconeogenesis from amino acids and use of ketones in the brain for energy. Yes, they can assist in brain function -- no one is arguing otherwise.

The argument made here is that meat eating and gluconeogenesis were probably not enough to power the evolution of larger brains (except in that meat freed up glucose for use in the brain), as argued by top experts on the evolution of the human diet: Peter Ungar, Richard Wrangham, and Suzana Herculano-Houzel.

Partly, that's probably due to limitations of gluconeogenesis in humans and limitations of use of ketones in some 86 billion neurons 24x7. So, a broadened niche of some kind was needed and that probably resulted from cooking.

I'm sensing some confusion along these lines, so perhaps I'll need to revisit in a future post.

Question: I tend to believe your arguments that proteins, and esp raw meat, are not a good source of energy. But I don't get the connection to brain size. The primary relevant factor for brain growth is survival. For that you need sufficient energy intake, alright, but how to achieve that is influenced by many other factors. It might well enough be that proteins play a relevant role for other physiological changes that were beneficial for survival, which is correlated with brain growth without being a direct causation. Does that seem possible?

> The argument made here is meat eating and gluconeogenesis were probably not enough to power evolution of larger brains (except in that it freed up glucose for use in the brain)

Hi David. There are two problems packed into that sentence. ☺ First, the argument that carnivory was insufficient to power the evolution of larger brains is undermined by the fact that carnivory is sufficient to power the operation of the modern human brain! Now, there could be some reason why some other food source was necessary, or at least helpful, for the evolution of large brains even though it is not necessary for the modern maintenance of same, but that would be a more complicated argument. The simple argument that you seem to be making is:

1. Eating meat alone is insufficient to power the human brain.
2. Therefore the evolution of large brains must have been caused by something other than eating meat.

That simple argument is incorrect because its first premise (1) is incorrect.

Note that it is entirely possible that the scientists that you interviewed (Ungar, Wrangham, Herculano-Houzel) are confused about this. It is an ill-understood area right now and it is common for people to incorrectly state that meat-eating is insufficient to power the human brain.

The second problem in your sentence above is "(except in that it freed up glucose for use in the brain)". Whether that mechanism was or was not part of the process by which meat-eating caused the evolution of larger brains is irrelevant to the question of whether or not meat-eating caused the evolution of larger brains!

If meat-eating caused the evolution of larger brains, and if it did so by freeing up glucose for use in the brain, then so be it. That doesn't make it any less the cause of the change, does it?

Barbara King from NPR blogged about this blog post http://www.npr.org/blogs/13.7/2012/12/02/166360654/carbs-not-meat-fueled-evolution-of-the-enlarged-human-brain?guid=1354552380257

I have responded to a couple of the comments there, including one from Peter who once again pointed out that ketones and gluconeogenesis could contribute energy to the brain. I'm republishing my response here:

Peter,

You are not the only person who's pointed out that I didn't address ketones. Perhaps, more also needs to be said about gluconeogenesis. I've addressed this question on my own blog comments and may even need to write another blog post.

Of course, I'm well aware that the brain is capable of using ketones and that gluconeogenesis can supply extra glucose, but I'll point out that these are inefficient processes with limitations and their own metabolic costs in terms of ATP. In short, the argument here is mainly that, because of 86 billion neurons (3x more than chimps) that prefer glucose, even top experts in evolution of the human diet -- anthropologist Peter Ungar, biological anthropologist Richard Wrangham, and comparative neuroanatomist Suzana Herculano-Houzel don't necessarily buy into the theory that meat eating was the primary driver that fueled evolution of the human brain. The way meat did play a role was in freeing up glucose calories. And it's more likely, per the experts, that a "broadened niche" and cooking probably had a lot more to do with our ability to fuel increases in brain size than did meat.

Of course, I'm well aware that the brain is capable of using ketones and that gluconeogenesis can supply extra glucose, but I'll point out that these are inefficient processes with limitations and their own metabolic costs in terms of ATP. In short, the argument here is mainly that, because of 86 billion neurons (3x more than chimps) that prefer glucose, even top experts in the evolution of the human diet -- anthropologist Peter Ungar, biological anthropologist Richard Wrangham, and comparative neuroanatomist Suzana Herculano-Houzel -- don't necessarily buy into the theory that meat eating was the primary driver that fueled the evolution of the human brain. The way meat did play a role was in freeing up glucose calories. And it's more likely, per the experts, that a "broadened niche" and cooking had a lot more to do with our ability to fuel increases in brain size than did meat.

We also have four times more capacity for getting glucose into the brain in comparison to chimps, per Duke biologist Greg Wray. That certainly should be evidence of the role of glucose in fueling our brains despite limited gluconeogenesis and an inefficient ability to use ketones.

Lastly, I'll refer you to a relatively recent research article by Suzana Herculano-Houzel, whom I mention in my article but didn't quote. In her article, she discusses the metabolic costs of our 86 billion neurons, and how the evolution of the human brain may only have been possible through cooking freeing up calories from meat and starches alike. http://www.plosone.org/article/info%3Adoi%2F10.1371%2Fjournal.pone.0017514

"We also have four times more capability at getting glucose across into the brain in comparison to chimps, as per Duke biologist Greg Wray. That certainly should be evidence to the role of glucose in fueling our brains despite limited gluconeogenesis..."

That glucose played a *role* in human evolution seems much closer to the truth here.

Maybe the reason we're four times more efficient than chimps at processing glucose across the blood-brain barrier is because glucose was a rare and precious nutrient.

"... and inefficient ability to use ketones."

I think Drs Volek and Phinney would debate you on that.

Again, there's this concept of a keto-adaptation period that a lot of researchers and theorists seem to be ignorant or dismissive of, so it's not surprising (as Zooko alluded) that the argument would be that fats couldn't possibly fuel the brain "efficiently".

For my readers' sakes, I want to explain a little more about ketone bodies. Ketones are products of fat catabolism. That's to say, when your cells stop burning glucose, fat is mobilized from adipose tissue and broken down, and ketones are used as fuel by several tissues. For example, muscles will often use ketone bodies to spare their amino acids from being used to synthesize glucose. But the brain itself is different -- ketones will replace glucose only after long periods of starvation. That's a nice adaptation when food is scarce, and I'm not discounting it, but it's not really ideal long-term (also since ketone accumulation can lead to ketoacidosis). I wouldn't consider that "efficient". Here's the thing: I certainly understand that a ketogenic diet might have some uses (such as in epilepsy) or even for weight loss.

Yet, in regards to the evolution of larger brains, I'll point out again that the experts suggest there are certainly reasons for four times more glucose entering the brain (beyond glucose's scarcity), and that is more likely what helped fuel larger brains. Consider, for example, children with Glut1 deficiency syndrome (a deficiency in glucose transporters in the brain). They certainly show cognitive deficits. As Wray explains in my earlier post, "the brain literally starves" in a child with this syndrome.

In his talk, Wray suggests meat eating may have played a role too, because of the extra fatty acids it provided for brain development. Other experts suspect meat played a role as well. Although, as pointed out here, Wrangham says cooking was most important, and Herculano-Houzel (a neuroanatomist who's written heavily on the topic of brain metabolism!) agrees.

Bottom line: it's probably more appropriate to say that a "broadened diet" or "cooking" played the greater role after 2 mya (ancestral humans were already hunting by around 2 mya) in the increases in cranial size, and it was likely due to more glucose.

I think we're rounding off to nearly the same conclusion, that it was the discovery of a method of exposure of more nutrients from food (i.e. cooking and maybe other tool use) that allowed the brain to continue expanding its volume in our evolution.

So the following is just nitpicking at this point:

"But the brain itself is different -- ketones will replace glucose only after long periods of starvation. That's a nice adaptation when food is scarce, and I'm not discounting it, but it's not really ideal long-term (also since ketone accumulation can lead to ketoacidosis). I wouldn't consider that "efficient". Here's the thing: I certainly understand that a ketogenic diet might have some uses (such as in epilepsy) or even for weight loss."

That's definitely what the biochemistry and nutrition textbooks will tell you.

But it's also a "Just So Story" about the reasons why this mechanism exists.

What's interesting is that the biochemical milieu of what's called "starvation" is almost identical to that of "nutritional ketosis", in which the body really is not starving. An individual can be getting ample calories from fat and protein and be in this so-called "starvation" state, from a hormonal profile perspective.

We only see this state as an aberration because nowadays, a person generally has to choose to go into a ketogenic state (e.g. following an Atkins style diet), whereas -- if some paleolithic theorists are to be believed -- periods of obligate ketogenesis were endemic to the Paleolithic condition due to fluctuating food availabilities.

And again, ketogenesis really isn't inefficient. Emily Deans, MD, argues that it appears that the use of ketones to power the brain is *more* efficient, since there are fewer byproducts of metabolism, and there are no noticeable downsides to fueling the brain in this way.

You're taking the perspective that ketogenesis is inefficient because in our modern food economy, it's a rare state to be in, and thus our adult bodies will take a few days or weeks to adapt fully to the use of ketones.

But it's useful to note that newborns come into this world in a state of ketogenesis, and while they are breastfeeding they fluctuate in and out of ketogenesis pretty effortlessly.

So it seems disingenuous to call ketogenesis a state of "starvation", to imply that it's an aberrant condition and assign a negative connotation to it.

It makes more sense (to me) to say that we are born with a metabolic flexibility to survive equally well on sugar and on fat as our primary fuel source, and that the ubiquity of glucose in our everyday lives removes us from that flexibility.

It bears disambiguating the terms "ketosis" and "ketogenesis", because the concept of being in a ketogenic state leading to ketoacidosis is a misunderstanding.

Ketosis is the state of generating excess ketones, such that you can observe the excess being eliminated through the urine.

In a state of ketogenesis, an individual only produces as many ketone bodies as necessary, and very few, if any, ketone bodies will be observed in urine.

Thus the general progression of a person on Atkins is that they go into ketosis because their body needs to adapt and become more efficient at this type of metabolism, and over the course of days or weeks or sometimes months (depending on the individual), you see the ketone bodies decrease in urine.

The standard line in traditional medical literature about ketoacidosis being a danger of being in ketosis is technically correct.

But it's only people in a diseased state (such as diabetes) who end up in constant ketotic overflow, because failing to "keto-adapt" (as Drs Volek and Phinney would put it) is a symptom of metabolic dysfunction -- not an inevitability.

I have one complaint to raise in regards to Wrangham and the other study mentioned wrt comparisons between modern primates' diets and the energy needs of modern humans, typically concluding that we would have to spend the majority of our waking lives eating if not for cooking.

Chewing raw meat is not nearly so slow and difficult if basic stone tools are used for some pre-processing, and human cranial morphology has been shown to be more suited to shearing than crushing (the opposite of extant apes), which makes a chimp chewing all day on a monkey carcass a poor parallel.

Perhaps more significantly, little consideration is made for humans' greater capacity to access high-calorie, fat-rich tissues such as brain and marrow using stone tools.

The presence of now-extinct megafauna (typically with greater fat stores) and involvement of shore foods also seems to escape consideration. It makes little sense to assume an identical food environment.

To be clear, I agree that cooking likely played a vital role in increasing nutrient availability for our ancestors but the methodology used in cases such as that recent study or arguments put forth in "Catching Fire" have the potential to greatly exaggerate the case.

Observations of modern raw foodists also tend to make a poor argument, as so much of that movement is driven by preexisting health issues, eating disorders, and psychological issues. There do seem to be at least some raw foodists with good health and body mass, who typically incorporate lots of fatty tissue and fermentation techniques; it is unclear whether our ancestors could have successfully sourced such diets, but as there is no real study on such subjects as of yet, I would suggest that the subject as a whole is not a good source of evidence wrt cooking vs. non-cooking.

OK, let's take animal products out of the equation and restrict early humans to only plants and cooking. Do early humans evolve into us....? Perhaps, but it seems like the inclusion of animal products became the catalyst of our large brains. So this would support your assertion that broadening our diet was key, but it was the expansion into animal products that was critical.