Food for thought: Cooking in human evolution

Did cooking make us human by providing the foundation for the rapid growth of the human brain during evolution? If so, what does this tell us about the diet that we should be eating, and can we turn back the culinary clock to an evolutionarily ideal diet? A number of provocations over the last couple of weeks have me thinking about evolution and diet, especially what our teeth and guts tell us about how our ancestors got their food.

This piece is going to ramble a bit, as it will also include some thoughts on the subject of diet and brain evolution sparked by multiple conversations: with Prof. Marlene Zuk (of the University of California Riverside), with Paul Mason (about the Terrence Deacon article that Paul and Daniel wrote about), and following my annual lecture on human brain evolution as well as conversations today with a documentary crew from SBS. So let’s begin the meander with Dr. Heribert Watzke’s TED talk, and his opening argument for why humans should be classified as ‘coctivors,’ that is, animals that eat cooked food, rather than ‘omnivores.’

Although I generally liked the talk, I was struck by some things that didn’t ring quite right, including Dr. Watzke’s opening bit about teeth (from the online transcript):

So everyone of you turns to their neighbor please. Turn and face your neighbors. Please, also on the balcony. Smile. Smile. Open the mouths. Smile, friendly. (Laughter) Do you — Do you see any Canine teeth? (Laughter) Count Dracula teeth in the mouths of your neighbors? Of course not. Because our dental anatomy is actually made, not for tearing down raw meat from bones or chewing fibrous leaves for hours. It is made for a diet which is soft, mushy, which is reduced in fibers, which is very easily chewable and digestible. Sounds like fast food, doesn’t it.

Okay, let’s not be pedantic about it, because we know that humans, in fact, do have canines. Watzke’s point is that we don’t have extended canines, long fangs that we find in most carnivorous mammals or in our primate relatives like chimps or gorillas.

The problem is that the absence of projecting canines in humans is a bit more interesting than just, ‘eat plants=less canine development.’ In fact, gorillas are completely vegetarian, and the males, especially, have massive canines; chimpanzees eat a very small amount of animal protein (something like 2% of their caloric intake), and they too have formidable canines. Our cousins don’t have extended canines because they need them for eating – rather, all evidence suggests that they need big fangs for fighting, especially intraspecies brawling among the males in order to reproduce.

The case of chimpanzee canines is especially intriguing because the remains of Ardipithecus ramidus, a species potentially close to the last common ancestor of humans and chimps, are now more extensively discussed, and they tell us that very old hominids didn’t have pronounced canines. If the remains are indicative of our common ancestor with chimpanzees (and there’s no guarantee of that), then it’s not so much human canine shrinkage alone that’s the recent evolutionary development but also the re-development of chimpanzee canines, probably due to sexual competition.

Even with all the possible points of disagreement, the basic point is that human teeth are quite small, likely due both to shifts in our patterns of reproduction and sexual selection and to changes in our diet. Over the last few million years, our ancestors seem to have gotten more and more of their calories from meat, one argument goes, at the same time that our ancestors’ teeth were getting less and less capable of processing food of all sorts (or, for that matter, of being effectively used as a weapon).

Hungrier and hungrier, with weaker jaws and smaller teeth

As I always remind my students in my lecture on human brain evolution, if big brains are so great, why doesn’t every animal have one? The answer is that big brains also pose certain challenges for an organism (or, if you prefer, ‘mo’ neurons, mo’ problums’).

The first and most obvious is that brains are hungry organs, devouring energy fast and relentlessly, especially as they grow. The statistic that we frequently throw around is that the brain constitutes 2% of human body mass but consumes 25% of the energy used by the body; or, to put it another way, brain tissue consumes nine times as many calories as muscle at rest. So, if evolution is going to grow the brain, an organism is going to have to come up with a lot of energy – a smaller brain means that an animal can both eat less and be more likely to survive a calorie drought.
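Those percentages make for a striking back-of-envelope calculation. The sketch below is purely illustrative: the 70 kg body mass and 2,000 kcal daily intake are my own assumed round numbers, not figures from the literature; only the 2% and 25% come from the statistic quoted above.

```python
# Back-of-envelope brain energy budget, using the 2%-of-mass and
# 25%-of-energy figures quoted in the text. Body mass and daily
# intake are illustrative round-number assumptions.
body_mass_kg = 70.0
daily_intake_kcal = 2000.0

brain_mass_kg = 0.02 * body_mass_kg            # brain is ~2% of body mass
brain_energy_kcal = 0.25 * daily_intake_kcal   # brain uses ~25% of energy

# Compare per-kilogram energy use: brain versus the rest of the body
brain_per_kg = brain_energy_kcal / brain_mass_kg
rest_per_kg = (daily_intake_kcal - brain_energy_kcal) / (body_mass_kg - brain_mass_kg)

print(f"brain mass: {brain_mass_kg:.1f} kg")              # 1.4 kg
print(f"brain budget: {brain_energy_kcal:.0f} kcal/day")  # 500 kcal/day
print(f"brain vs rest, per kg: {brain_per_kg / rest_per_kg:.1f}x")
```

On these assumptions, a 1.4 kg brain claims 500 kcal a day, burning energy at over sixteen times the per-kilogram rate of the rest of the body. Whatever the exact numbers, the point stands: growing the brain meant finding a lot more energy.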

But hominin brain growth also presents a few other problems, which sometimes get underestimated in accounts of our species’ distinctiveness. For example, natural selection had to solve a problem of excess heat, especially if big-brained hominids were going to do things that their big brains should have told them were ill advised, like running around in the hot sun. As your brain chews up energy, it generates heat, and the brain can overheat, the serious danger in sunstroke. The good news is that somewhere along the line our hominin ancestors picked up a number of adaptations that made them very good at shedding heat, from relatively furless skin and the ability to produce copious sweat to a system of veins that runs from the brain, shunting away heat (for a much more extensive discussion, see Sharma, ed. 2007, or the work of anthropologist Dean Falk, including her 1990 article in BBS laying out the ‘radiator theory’).

Not only is our brain hungry and hot; our enlarged cranium also poses some distinctive challenges for mothers, especially as bipedalism has narrowed the birth canal by slowly making the pelvis more and more basket-shaped (bringing the hips under our centre of gravity). The ‘obstetrical dilemma,’ the narrowing of the birth canal at the same time that the human brain was enlarging, led to a bit of a brain-birth canal logjam, if you’ll pardon the groan-worthy pun (see Rosenberg and Trevathan 1995).

Although frequently presented as a significant constraint on brain growth (and I’m sure all mothers would agree that they wouldn’t want those brains getting too much larger), the obstetrical dilemma likely also led to hominin infants being born more and more premature, relative to other primates. Humans possess only 23% of their mature brain size at birth; other mammals are born with much more of theirs. Macaques, for example, attain about 65% of their adult brain size by birth. Some comparative research suggests human infants are basically extra-uterine fetuses, relative to other mammals, until about 21 months of development (a year after birth).

The early birth eases labor, of course, but it’s also great for making humans teachable, extending the rapid rate of immature growth much longer. Humans start out at birth with roughly the brain-to-body weight ratio normal for great apes, but end up with brains approximately 3.5 times larger, relative to our weight, than those of our cousins. But like so many evolutionary changes, the extraordinary helplessness of the infant for so long brings with it enormous logistical challenges. I like to joke with students that getting the sperm and egg together is the easy part of human reproduction, childbirth only a bit worse (that passes for humour in my lectures); the real trial is keeping this helpless little thing alive long enough to become viable on its own. Currently, in Australia, this appears to be approaching three decades after birth (… I will refrain from making any jokes about Generation ‘Why?’…).

But the real kicker, for a discussion of diet, is that the whole biological apparatus for getting energy (teeth, jaw muscles, stomach, guts) appears to be inversely related to the enlarging brain over the last few million years. As the hominin body routed more energy to the brain, one of the systems that lost out is the gut; this brain-gut trade-off was one of the basic points argued by Aiello and Wheeler (1995) in their original explanation of the ‘expensive tissue’ hypothesis. In fact, Aiello and Wheeler (1995: 205) calculate that the energy savings gained from the liver and gastro-intestinal tract being smaller than predicted for a human-sized mammal exactly offset the unusually high energy demands of the disproportionately large human brain.

Cooking up some brain evolution

So how, given the decreasing efficiency of jaw and gut, did our ancestors get enough energy? According to Heribert Watzke’s TED talk, our ancestors figured out fire and cooking, and thereby used these techniques to extract more energy with less effort from a range of foods. Watzke doesn’t explicitly cite his work, but the leading advocate of the evolutionary significance of cooking is Richard Wrangham, and I’ve been thinking a lot about his work because I’ve just lectured on the topic, and because of a conversation I had with Prof. Marlene Zuk a couple of weeks ago.

Prof. Marlene Zuk (UC Riverside)

Prof. Zuk is a specialist in the study of sexual behaviour, sex selection, and parasites, and, although she’s done a lot of interesting work, I find some of her research (and her lab’s) on the possibility of conflict between natural and sexual selection to be especially intriguing, likely because in my own life, the sexual drive and survival instincts seem to be so at odds (but perhaps this is just too much personal information for our readers…).

In that piece, I pointed out intersections between Prof. Zuk’s thoughts on ‘paleofantasies’ in diet and my own work on sports (the quote is from my earlier piece):

Zuk draws on Leslie Aiello’s concept of ‘paleofantasies,’ stories about our past spun from thin evidence, to label the nostalgia some people seem to express for prehistoric conditions that they see as somehow healthier. In my research on sports and masculinity, I frequently see paleofantasies come up around fight sports, the idea that, before civilization hemmed us in and blunted our instincts, we would just punch each other if we got angry, and somehow this was healthier, freer and more natural (the problems with this view being so many that I refuse to even begin to enumerate them). It’s an odd inversion on the usual Myth of Progress, the idea that things always get better and better; instead, paleofantasies are a kind of long range projection of Grumpy Old Man Syndrome (‘Things were so much better in MY day…’), spinning fantasies of ‘life before’ everything we have built up around us. (From Paleofantasies of the perfect diet – Marlene Zuk in NYTimes)

In that piece, I tried to point out that you can’t recreate a ‘Paleolithic diet’ by simply rolling your shopping trolley down the aisles of the supermarket, picking out nuts and fruit and lean meat – our ancestors ate on the move, seasonally, from hundreds of food sources, including stuff we likely wouldn’t have laid a hand on, let alone eaten raw. Think the Road Kill Café with a side of grubs and nuts, and then remember that even our meat is domesticated, all soft and flabby and buttery-easy to eat. Not the kind of dinner that was a fair fight.

I won’t re-write the first essay, but I do want to reflect on one thing that we talked about: the role of cooking in human brain evolution. Prof. Zuk and I specifically discussed, in our lunch meeting over cold finger sandwiches and fruit slices, Richard Wrangham’s thoughts on the subject, and I tried to describe why I felt uncomfortable with his specific theory, although I generally thought he was right about the overall pattern.

Richard Wrangham on cooking

Richard Wrangham, Harvard (Photo: Jim Harrison)

Richard Wrangham is Ruth Moore Professor of Biological Anthropology at Harvard University, with a strong background in the study of chimpanzees. Wrangham argues that hominins began cooking their food 1.8 million years ago (mya), even though the earliest evidence of controlled fire is, at best, half that old. He points to the diet of chimpanzees:

Richard Wrangham has tasted chimp food, and he doesn’t like it. “The typical fruit is very unpleasant,” the Harvard University biological anthropologist says of the hard, strangely shaped fruits endemic to the chimp diet, some of which look like cherries, others like cocktail sausages. “Fibrous, quite bitter. Not a tremendous amount of sugar. Some make your stomach heave.” (from Cooking Up Bigger Brains)

Wrangham has elaborated the argument in a number of his academic papers (for an excellent review, see Wrangham and Conklin-Brittain 2003). He’s also argued that the kind of raw bush meat that chimpanzees eat – usually smaller monkeys, and even then only the mature males get it – is tough and unpleasant, with large portions of skin and fur that don’t go down easily, by any stretch.

Wrangham’s theory is controversial in anthropology, and I don’t fully agree with him, but he does put his finger on the complexity of the brain-jaw trade-off in human evolution. Our ancestors were steadily growing larger brains, energy-hungry organs, while the on-board apparatus that they used to get energy out of food (teeth, jaws, guts) was diminishing in effectiveness. Our ancestors had to come up with some sort of better solution, either better food or stronger food processors.

Wrangham suggests that the marked decrease in tooth size and gut length with the advent of Homo erectus, especially given that H. erectus had a large brain and a tall, lanky body, indicates that a profound change in diet must have occurred. The usual explanation, the one Wrangham reacted against, is that our hominin ancestors were steadily shifting to a diet heavier and heavier in animal protein as first scavenging and then hunting techniques improved. In this account, stone tools were the key technological innovation, first because they could become a kind of prosthetic teeth, allowing us to ‘pre-chew’ food by pounding, cutting, grinding and butchering it with our tools.

Wrangham, however, thinks that only cooking could have unlocked the calories from food efficiently enough to make the growing hominin brain viable; chimpanzee food – including the meat that chimpanzees get – is hard to chew and not sufficiently energy dense. I’m trying to remember which of his publications it appears in, but I recall reading an estimate that I think Wrangham made: if we ate chimpanzee food (unpleasant as that would be), we would need five kilograms of it a day to feed ourselves. This bulk of dry fruit would require something like six hours of chewing and demand that our bodies process outrageous amounts of fibre.
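Just to put that estimate in perspective, a quick bit of arithmetic. The 16 waking hours is my own illustrative assumption; the five kilograms and six hours are the figures recalled above.

```python
# Rough arithmetic on the quoted chimp-diet estimate: ~5 kg of food
# requiring ~6 hours of chewing per day. The 16 waking hours is an
# illustrative assumption, not a figure from Wrangham.
food_grams_per_day = 5000
chewing_hours = 6
waking_hours = 16

grams_per_minute = food_grams_per_day / (chewing_hours * 60)
fraction_of_waking_day = chewing_hours / waking_hours

print(f"{grams_per_minute:.1f} g chewed per minute")                 # ~13.9 g/min
print(f"{fraction_of_waking_day:.0%} of waking hours spent chewing")  # ~38%
```

Nearly four of every ten waking hours spent grinding away at fibrous fruit is time a hominin cannot spend foraging, socializing or, for that matter, inventing cooking.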

An epicurean revolution?

I think Wrangham is only partially right, just as the older theory of increasing carnivorousness in the human diet is probably accurate, to some degree, as well. To me, the evolutionary growth curve of the hominin brain doesn’t look like a single revolution, but a steady, accelerating growth that is more likely the result of multiple changes over time rather than a single innovation.

The steady, accelerating pattern of brain growth was likely supported by shifts in diet as new food-procuring and preparation techniques steadily lifted the energy constraint on the brain’s development. That is, when we look at the upward turn of brain growth over hominin evolution, I think we’re seeing the effects of a set of constraints that decreased: our ancestors, through biological, technological and social change, overcame a number of selective constraints on brain growth, so the organ steadily increased in size.

The lifted-constraint model of selective shift suggests that, rather than being contained by stabilizing selection, the brain could then be pushed by directional selection toward greater size, selecting for any number of possible greater abilities: social intelligence, strategic foresight, problem solving, or, later, language ability. In other words, brain growth could be driven both by adaptive benefit and by decreasing biological ‘cost’; changing the diet could decrease the cost of the bigger brain just as having a more-and-more premature infant could decrease the obstetrical constraint on encephalization.

Of course, you’d still need a genetic mechanism that could generate variation toward greater brain size and selective advantages that would make the larger brain beneficial. In our case, a significant shift in the maturation pattern to extend the immature growth curve, probably from regulation of gene expression, could provide the mechanism, and any number of advantages have been offered to explain why a bigger brain would be a groovy thing to have. But the removal of constraint would upset a pattern of stabilizing selection, shifting our ancestors into a pattern of directional selection toward greater brain size.

What I like about Wrangham’s approach, then, is that he focuses on relaxed constraint, not simply on adaptive advantage (the idea of relaxed constraint has been on my mind especially since Paul and Daniel’s piece on Terrence Deacon). Just because a big brain is nice to have doesn’t mean a species gets one – the species has to develop ways of dealing with the downsides and limitations, lifting the selective pressures that suppress greater brain growth. In humans, these constraints would include, among others, energy demands, heat dissipation, childbirth, and anatomical remodeling, as I outlined.

I don’t think it was just cooking that overcame the energy constraint, although I think Wrangham is right (and Watzke, too) that using fire to process and concentrate the energy in food would have been a major breakthrough. But unlike Wrangham, I think cooking was part of a pattern of innovation in finding, exploiting and processing high-energy caches of food. The option wasn’t just a forced choice between six hours of chewing on a chimpanzee-like diet or Master Chef. Humans would have found numerous ways to exploit more animal protein, including a lot of invertebrates and aquatic sources, improved their ability to follow seasonal fluctuations in high energy food, pounded fibrous foods, and a number of other techniques.

For example, Watzke highlights how fire could have been used to process unfamiliar foods as hominin range expanded. While this is true, there are other techniques as well that can be just as effective, or even more effective for dealing with some foods, such as soaking, drying, or fermenting. Some of the cognitive traits necessary for effectively transforming foods through non-cooking methods also likely contributed to the ability to use fire effectively: strategic thinking, restraint, planning.

Not all contemporary societies (even our own) rely completely on fire for food preparation, finding other ways to soften, concentrate and prepare raw food so that we don’t wind up chewing all day. Meat can be prepared with acid, such as fruit juice, or cut into strips or small pieces to ease eating; many societies eat raw fish, shellfish, eggs or other foods; tough roots get pounded, ground, and soaked; the sun can be used to dry and preserve meat; liquids and even proteins can be fermented, letting decomposition partially process food; and other animals can be made to process food for us, as we intercept their milk, their honey, or other edible products, often intended for their own consumption. Cooking then was probably one of the most important innovations for concentrating and softening food, but it was not the only one.

I agree with Wrangham that cooking is extraordinarily important, that you can’t grow a human brain on a chimpanzee diet. But, ultimately, fire is not the only thing that makes the human diet different from a chimp diet; our pre-fire hominin ancestors were likely already becoming much more versatile, discerning, wide-ranging eaters, with a whole bag of food preparation tricks and a growing ability to find high-return energy-dense food sources.

One of the many interesting wrinkles in the evolutionary story of our species is that hominins went from being largely, perhaps almost entirely, herbivorous, to being staggeringly versatile eaters. Our hunting and foraging ancestors (and our contemporaries) found their calories all over the place, from sources that far exceed the variety that we can find today in a well-stocked grocery store. As I remind my undergraduate students, with their often chicken-beef-pork urban diet of constant repetition, our ancestors were getting their animal protein from such a variety of species that we can scarcely imagine the selection.

John Durant tries to explain the ‘Caveman Diet’ to Stephen Colbert — thanks to the Wednesday Roundup (and Daniel) for the link! He offers some good advice on exercise and diet, but thankfully doesn’t attempt to get too deep into the paleoanthropological evidence. Don’t eat heavily processed simple carbohydrates and mix up your fitness regimens; that, I can agree with!

About gregdowney

Greg Downey is Associate Professor of Anthropology at Macquarie University in Sydney. A cultural anthropologist by training, neuroanthropologist by passion, Greg studies sport and dance as methods for perceptual, neurological, motor and phenotypic change. You can follow Greg @GregDowney1

17 Responses to Food for thought: Cooking in human evolution

Thanks for your article which portrays the issues nicely, but I have to protest at this: “But unlike Wrangham, I think cooking was part of a pattern of innovation in finding, exploiting and processing high-energy caches of food. The option wasn’t just a forced choice between six hours of chewing on a chimpanzee-like diet or Master Chef. Humans would have found numerous ways to exploit more animal protein, including a lot of invertebrates and aquatic sources, improved their ability to follow seasonal fluctuations in high energy food, pounded fibrous foods, and a number of other techniques.” I protest because you make it seem that I think of cooking as a one-time advance; and elsewhere you imply that I see cooking as an alternative, rather than a complementary, explanation to meat-eating for significant changes in human evolution. I think you will find that my 2009 book ‘Catching Fire: How Cooking Made Us Human’ corrects both impressions. There I discuss how increased meat-eating and cooking worked together; and how advances in cooking can be expected to have occurred throughout the Pleistocene, and might contribute to the steady and accelerating rise in brain size.
Cook on!

Alas, the rhetorical excesses of writing for the web often miss the subtleties of academic writing, it’s true. I tried hard to capture the differences in emphasis in Prof. Wrangham’s model of evolution, but I have to admit that I’ve only read the papers, not the entire book, so I obviously (from Prof. Wrangham’s comment) didn’t really get my tone right.

I think my objection, as is so often the case with science-writing versions of academic research, is to other people’s similarly skewed accounts of the original work. (Yes, I realize that I’m pleading guilty to what I also accuse others of doing.) Prof. Wrangham’s research is often described as an argument that cooking is an alternative explanation for the increase in brain size. So I’m taking issue more with the popular science version of Prof. Wrangham’s argument than with his book. For that, I apologize.

But what I hope is more important from the article is my admiration for Prof. Wrangham’s shift of the argument about brain evolution, away from thinking of only the adaptive advantages of a big brain toward the lifting of constraints on brain growth. This shift in explanatory emphases, to me, is a crucial re-figuring of how we think about brain evolution, one that Prof. Wrangham shares with people like Terrence Deacon, to highlight lifted constraints and not just adaptive advantage.

My thanks to Prof. Wrangham; those who really want to follow this argument should consider picking up his books!
