Tag Archives: Forager

Ritual has come to be thought of in popular discourse as a kind of action that is ineffective, superficial, and/or purely formal, and this view is the unexamined premise behind much of ritual studies. This attitude explains why …we “know it when we see it” – and what we know to be rituals when we see them are acts that are apparently non rational, in which the means do not seem proportionate to the ends, the intended objects of human action are non empirical beings, or the theories of efficacy that ostensibly explain the ritual acts are inconsistent with modern, scientific paradigms. This reaction is similar to what an archaeologist does when he discovers a structure whose purpose is unclear – he calls it a temple. … The notion that ritual is ineffective is false. … Shamanic rituals heal, legal rituals ratify, political rituals unify, and religious rituals sanctify. Rituals transform sick persons into healthy ones, public spaces into prohibited sanctuary, citizens into presidents, princesses into queens … One of our most important tasks as scholars is to explain how rituals accomplish these things. (pp.6-7)

I bought and read that book, and have also been reading The Creation of Inequality, which emphasizes the centrality of rituals to foragers:

Cosmology, religion, and the arts were crucial to hunters and gatherers. … The lessons of myth were passed on audio visually. Performances combining art, music, and dance fixed in memory the myth and its moral lessons. … We doubt that art, music, and dance arose independently. More than likely they evolved as a package that committed sacred lore to memory more effectively than any lecture. … The archeological data suggest … that the use of the arts increased as larger social units appeared, because each moiety, clan, section, or subsection had its own body of sacred lore to commit to memory. … Dancing, drinking, and singing for days, as some tribes did, opened a window in the spirit world and thereby confirmed its existence. (pp. 62-63)

Also, reading a BBS article on “ritual behavior”, I came across this comment by Bjorn Merker:

Literal duplication … lies at the very heart of ritual. The need to remember and reproduce essentially arbitrary details on an obligatory basis burdens behavior with a handicap, and the ability to sustain that burden is proof of capacity, and hence tends to impress. … There is reason to believe that humans by nature are carriers of ritual culture in the sense just defined. We, in contrast to chimpanzees – indeed, alone among all the primates but like many songbirds and the humpback whale – are in possession of a neural mechanism that allows us to duplicate with our voice that which we have heard with our ears. Most mammals, who excel at learning in other respects, are incapable of doing so; yet we humans do so with every song we know how to sing and with every word we know how to pronounce. … Such a perspective helps us understand why ritual form marks human culture not only in domains touched by precautionary concerns, but in well nigh every area of human pursuit. (p.624)

While much about ritual remains puzzling, one thing seems clear: the essential human difference, the one that has let us conquer the Earth, was an ability to accumulate useful innovations via culture. And this primarily required a good ritual sense, i.e., a good way to watch and copy procedures like starting a fire, shaping a knife, etc. Having language and big neighbor-friendly social groups helped, but were less essential.

The fact that language requires good vocal ritual skills suggests better ritual abilities appeared before full recursive language. Since they were very social, pre-language but post-ritual humans probably filled much of their social lives with complex rituals, showing off their abilities to precisely execute complex procedures, and showing loyalty via doing their group’s procedures. Music, dance, and other such ritual habits continued after full language.

Language let us better express and enforce complex social norms, and helped us gain a more conscious and flexible understanding and control of our procedures. But we still have poor introspective access to our pre-language systems, and so still don’t know a lot about why we do which procedures when, and why they make us feel good or bad.

When humans believe in hidden spirits who take an interest in whether they follow social norms, hard-to-understand rituals offer a natural place to locate their connection to spirits. Humans can believe that spirits watch their rituals, and then respond by making them feel good or bad. This helps us understand why religions emphasize rituals.

Added: This poll has a majority favoring culture coming before language.

Looking for insight into farmer-era world views, I just read the 1931 novel The Good Earth, about Chinese farmers. It is of course more a morality tale than a documentary, and the main character soon gets rich, and is then no longer a representative farmer. But the story illustrates differences between farmer vs. forager style morality.

Foragers live in close egalitarian bands, with behavior well adapted to their environment. So forager morality issues are mostly about well-adapted personal behavior in conflict with group interests. Foragers sin by bragging, not sharing, being violent against associates, etc.

Farmer morality, in contrast, is much more about conflicts within people than within groups. Farmers sin by being lazy, wanting overly fancy foods, taking drugs, having sex with prostitutes, wanting status markers that cost too much in the long run, etc. Farmers need to resist internal temptations to do things that might make sense for foragers, but which can ruin farmers. These can also ruin one’s family and friends, so farmer sins also have shades of selfishness.

Of course farmers also care about bragging, violence, etc. In some sense farmers have more morality – more and stronger rules, to fight against stronger natural inclinations. So farming culture introduced religion and stronger social pressures to enforce their rules, to keep farmers from relapsing into foragers.

This helps me make sense of Jonathan Haidt’s observations that liberals, who I’ve called forager-like, rely on fewer moral principles than conservatives, who I’ve called farmer-like:

The current American culture war, we have found, can be seen as arising from the fact that liberals try to create a morality relying primarily on the Care/harm foundation, with additional support from the Fairness/cheating and Liberty/oppression foundations. Conservatives, especially religious conservatives, use all six foundations, including Loyalty/betrayal, Authority/subversion, and Sanctity/degradation. (more)

I’ve suggested that as we’ve become richer, we’ve become more forager-like. If our descendants get poor again, they’ll probably need stronger social norms again, to get them to resist temptations to act like foragers and do what is functional in their world. Their morality would probably rely on a wider more-conservative-like range of moral feelings.

In the em scenario I’ve been discussing here, sex would be unimportant except as a possible way to waste too much time. So em morality would be pretty liberal on sex. But money, work, and reputation would be important – ems would probably have pretty conservative attitudes on keeping their word, doing their job, obeying their boss, and not stealing. When mind theft or virus corruption are big risks, they’d also probably have strong purity feelings about avoiding acts that could risk such harms. And they’d probably feel strong clan loyalty, even beyond what farmers feel, to the clan of copies of the same original human.

Foragers distinguish between camp and the wild. In camp, things are safe and comfortable, and people should be pleasant. The wild, in contrast, is dangerous and uncontrolled. In camp, some of us must watch out for intrusions from the wild, such as storms, wild animals, or hostile tribes.

Some of us must also periodically venture into the wild, to bring back food and other useful materials. But it is important that whatever we bring back be tamed before it gets here. Don’t bring back live dangerous animals, don’t leave poison berries around camp where people might think they are safe, and leave violent aggressive hunt habits out there in the wild. What happens in the wild should stay in the wild.

Ideas and concepts can be dangerous and disruptive. Ideas influence the status and attractiveness of people and activities, and who is blamed and credited for what outcomes. For a society vulnerable to social disruption, ideas can be wild.

Today, most of the ideas and concepts that we come across have been tamed. They have long been integrated into our ways of thinking, and we have worked out attitudes and opinions to help us avoid being cut by their sharp edges.

But today we must also deal with a steady stream of new untamed ideas. Some of these are the side effect of ordinary people doing ordinary things. Others come from intellectual explorers, who purposely venture into the wild in search of new ideas. How do we tame such ideas?

We celebrate our intellectual explorers, both those who come back with useful ideas, and those whose useless ideas show off their impressive explorer abilities. But we are also wary of their trophies, just as foragers would be wary of a hunter bringing a strange live animal into camp. We want people we trust and respect to tame those ideas before we let them flow free in our camp of easily discussed ideas. Wild explorers, who may have “gone native”, can be useful in expeditions, but must remain under the control of more civilized explorers.

I think this helps us understand why universities, some of the most conservative institutions we have, are home to our most celebrated intellectuals. Academic institutions such as universities, academic journals, peer review, etc. seem far from ideal ways to encourage innovative ideas. But they seem like better ways to assure outsiders that ideas have been safely tamed. The new ideas that academics endorse can be safely quoted and applied with minimal risk of wild uncontrolled disruption. So when ideas originate among wild untamed academic-outsiders, we prefer to attribute them to the safe academic insiders who tame them.

When we are willing to risk being exposed to wild untamed ideas, we turn less to academics, and more to startup companies, passionate writers, activists, etc. And in our youth, many of us are eager for such exposure, to show that we are no longer children who must stay safely in camp – we are strong and brave enough to venture into the wild.

But when we have children of our own, and feel less a need to show off our derring-do, we prefer tamed idea sources. We prefer to hire kids who got their ideas from universities, not startups or activists. And most prefer their news to come from similarly tamed journalists. We applaud wild ideas, but prefer them tamed.

I have suggested that long run growth can be described as a sequence of exponential growth modes, from primates to foragers to farmers to industry, where mode transitions are similar in their degree of suddenness and growth rate change factors. This model will be tested in the future – it suggests that within a century or so we’ll see a change within five years to a new mode where the economy doubles every month or faster.
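As a rough numerical illustration of the growth-modes claim, a short sketch can compare how much growth speeds up at each transition. The doubling times below are assumed round numbers for illustration only; the text itself specifies only that the next mode would double the economy every month or faster:

```python
# Illustrative sketch of the "sequence of exponential growth modes" idea.
# All doubling times here are assumed placeholder values, not figures
# from the text.
doubling_times_years = {
    "forager": 250_000,   # assumed
    "farming": 1_000,     # assumed
    "industry": 15,       # assumed
    "next": 1 / 12,       # ~ one month, per the text's prediction
}

modes = list(doubling_times_years)
for prev, nxt in zip(modes, modes[1:]):
    # Ratio of old doubling time to new doubling time = growth speed-up factor.
    factor = doubling_times_years[prev] / doubling_times_years[nxt]
    print(f"{prev} -> {nxt}: growth speeds up by roughly {factor:.0f}x")
```

Under these placeholder numbers, each transition multiplies the growth rate by a factor of the same broad order of magnitude (tens to hundreds), which is what "similar growth rate change factors" means here.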

But my model can also be tested against the past. Our data on the animal, forager, and early farming eras is pretty poor. My hypothesis suggests that the forager era was one big growth mode similar to the farming or industry eras, with a relatively smooth rate of growth in capacity (even if rare disasters temporarily disrupted the use of that capacity), and that the forager to farming transition has a level of smoothness similar to that of the farming to industry transition.

Contrary to my model, many have suggested there was an important comparable revolution in human behavior around 50,000 years ago. My model predicts that growth accelerated smoothly from around 100,000 years ago to the near full speed farming world of about 5000 years ago, similar to the way growth accelerated from 1600 to 1900.

The latest results seem to support my model:

Back in 2000, a now famous scientific paper called “The Revolution That Wasn’t” argued that the then-conventional wisdom that modern human behavior had erupted in a “creative explosion” about 50,000 years ago in Europe was wrong. Rather, anthropologists Sally McBrearty and Alison Brooks contended that modern behavior, including creativity, has deep and ancient roots, going back some 300,000 years ago in Africa (Science, 15 February 2002, p. 1219).

At a meeting here last month, researchers heard new evidence that human evolution took a gradual, rather than revolutionary, course during two other key junctures in prehistory. A study of ancient stone tools from South Africa concludes that hunters manufactured spears with stone points—a sign of complex behavior—200,000 years earlier than had previously been thought. And new excavations at a 20,000-year-old settlement in Jordan, laden with artifacts typical of much later sites, suggest that the dramatic rise of farming villages in the Near East also had early and deep roots. … Many archaeologists now think that apparent “revolutions” are due to gaps in the record or to behavioral shifts triggered by changing conditions, rather than sudden advances in cognition. What appear to be precociously sophisticated behaviors are really reflections of what prehistoric humans were capable of all along. (more)

The pubic louse evolved around 3.3 million years ago, … and it could not have done so until ancestral humans lost their body fur, creating its niche. What’s more, [we have] dated the evolution of body lice, which live in clothing, to around 70,000 years ago. So it looks like our ancestors wandered around stark naked for a very long time. (more)

Theories of why we stayed naked for so long vary:

When our ancestors moved to more open ground, natural selection would have favoured individuals with very fine hair to help cooling air circulate around their sweaty bodies. But sweating requires a large fluid intake, which means living near rivers or streams, whose banks tend to be wooded and shady – thus reducing the need to sweat. What’s more, the Pleistocene ice age set in around 1.6 million years ago and even in Africa the nights would have been chilly. … Other animals on the savannah have hung on to their fur. …

[Some argue] that we did not shed our pelts until we were smart enough to deal with the consequences, which was probably after modern humans evolved, about 200,000 years ago. “We can make things to compensate for fur loss such as clothing, shelter and fire.” … [Some argue] natural selection favoured less hairy individuals because fur harbours parasites that spread disease. Later, sexual selection lent a hand, as people with clear, unblemished skin advertising their good health became the most desirable sexual partners and passed on more genes.

Suresh Naidu pointed me to a fascinating 1930 essay (excerpts below) by the famous economist John Maynard Keynes on the long term future. Consideration of the far future put Keynes into a very far mode, where he upheld far ideals against near practical constraints. While Keynes accepted that farmer ideals of work, property, and saving for the future were needed to maintain economic growth, he detested such ideals, and looked to a future roughly a century hence when, humanity’s absolute material needs being satisfied, we embraced forager ideals for sharing material goods, living in and enjoying the moment, and just doing what feels right.

Now Keynes did note that humans also seek relative status, but he seemed to assume that humans would coordinate to suppress such urges, and to keep just enough farmer habits to preserve material wealth. In his far idealistic mode, he didn’t even seem to consider the possibility that nations would still compete for relative status, and promote farmer norms for that purpose, or that individuals would still work full time seeking personal relative status.

We are now only eighteen years shy of Keynes’ 2030 forecast date. While there has certainly been a weakening of farmer ideals, especially on fertility, we are far from embracing forager ideals overall. We still work hard for material wealth. Our lower classes have moved furthest, rejecting marriage, religion, and full-time work more, while relying heavily on the sharing of others, and this is considered a big problem. Give us another century of similar economic growth, and this lower class malaise might well infect most everyone. But it is far from clear that this would settle at a stable rich no-growth equilibrium, rather than economic and population collapse.

In any case, even if we preserve farmer norms enough to support continued growth, material wealth per person will only be high until we find new techs to increase population faster than we increase wealth. While in theory-overconfident far mode, Keynes is tempted to see his forager-value future as lasting indefinitely, it would in fact only be temporary. The em transition that I envision within a century or two should quickly return most people (i.e., most ems) to near subsistence income, and put a huge premium on reviving farmer-like norms and ideals. And even if that doesn’t happen, growth must slow in the very long run.

In my culture, most stories are not about work life, and the few stories that are focus on a narrow set of unusual jobs like soldier, detective, politician, artist, doctor, lawyer, or teacher. Why?

One explanation is that work is usually boring. But this seems weak to me. I’m often fascinated to read business-book stories about work teams and firms competing (I’m enjoying The Innovator’s Solution), and Horatio Alger type stories were once more popular in my culture. Furthermore, a recent New Yorker article (quotes below) says similar stories are now very popular in China.

The author of that article seemed displeased by this trend, and what it says about Chinese culture. She talks of “get-rich” “Darwinian” “combat”, “manipulation and deceit”, and a loss of “morals”. And this seems to me a clue about why we don’t tell such stories – they push realism on topics where we’d rather stay idealistic.

Consider that we avoid telling young kids stories about corrupt police and teachers taking advantage of their power, since we are trying to get kids to respect and trust such authorities. Similarly, we avoid telling kids stories about selfishness and betrayal in romantic and sexual relations, as we push idealized accounts of marriage, love, etc. Similarly, we may as adults avoid stories that threaten other ideals.

Stories need conflict. For stories about soldiers, detectives, politicians, artists, doctors, lawyers, and teachers, we know of socially acceptable types of conflict, which do not challenge key ideals. But stories about conflicts in ordinary jobs more easily violate key ideals, and trigger moral outrage.

We don’t mind stories about independent professionals competing to please customers. But the forager inside us hates hearing about team members who don’t work entirely for the good of the team, and especially about bosses insisting that things be done their way. Foragers are ok with being “led” covertly, by someone who has gained their respect and agreement. But taking orders just to get material goods, that seems immoral. The moral priority of war, or of medicine, may make it ok to take orders there. But otherwise, no!

We sometimes have stories about heroic employees resisting an evil boss. But overt moralizing gets boring fast, especially when we realize these employees could just quit their jobs. Worse, we know that most of us don’t resist bosses – we obey them, mainly because we like getting paid. We don’t like admitting that while we are returning to forager ways in our leisure time, we have become hyper-farmers in our work life. And so in our story worlds, we mostly try to pretend that work doesn’t exist. Props to the Chinese, for facing reality more.

Firms often have big obvious misallocations of resources, where … many highest status folks in the firm resist … changes. … If a prestigious outside consulting firm weighs in, that can turn the status tide. Coalitions can often successfully block a CEO initiative, and yet not resist the further support of a prestigious outside consultant. … Good-looking kids from our most prestigious schools … are the cheapest folks you can buy with our most prestigious affiliations.

What is status? One theory is that status is a commonly-seen summary of one’s value as an ally. In places where physical strength is more useful, strength counts more for status. In places where knowing the king is more useful, knowing the king counts more. And so on. But the consulting tale I tell above seems at odds with this theory.

Imagine that status in a firm was a proxy for one’s usefulness as an ally within that firm, summarizing the threats one could credibly make, the people one could fire, the favors one could plausibly call in, etc. And imagine that the current equilibrium was that opponents of change together held more of these useful resources – they successfully blocked change.

Now imagine that the CEO hires an outside consultant who writes a report recommending change. It should be clear to everyone that this outside firm has no direct power within the firm. It cannot fire anyone, go slow on a project, etc. So if status was just a proxy for relevant local abilities, then this consultant should have little status. Thus if a consultant actually does help the CEO by lending status to the CEO’s side, status must be something else.

So I’m led to consider a sticky-feature concept of status. Long ago coalition politics was important, and foragers had to estimate how useful each person would be if they joined a coalition. So our distant ancestors considered a standard set of features, such as strength, intelligence, charisma, etc., that tended then to indicate that someone would be a useful ally. Humans evolved specialized mental modules for making such estimates, and for estimating common perceptions of such estimates.

Today we have inherited such mental modules, and often use them to estimate which side will win a contest of coalitions. And even though relevant abilities have changed somewhat, our inherited expectations about who will win a coalition contest are somewhat self-reinforcing. For example, if we expect that coalitions of taller people tend to win, then we will be reluctant to cross such a coalition, which will tend to make them win. This can be a self-reinforcing focal equilibrium of the coordination game that is coalition politics.
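A minimal game-theory sketch may make the "self-reinforcing focal equilibrium" point concrete. Everything here — the coalition names, the payoff numbers, and the `payoff` and `best_response` functions — is a hypothetical illustration, not anything from the text:

```python
# Toy coordination game: you gain by siding with the coalition everyone
# expects to prevail, and lose by crossing it. (Payoffs are made up.)

def payoff(my_coalition, expected_winner):
    return 1 if my_coalition == expected_winner else -1

def best_response(expected_winner):
    # Each player picks the coalition that maximizes their own payoff,
    # given the shared expectation about who wins.
    return max(["tall", "short"], key=lambda c: payoff(c, expected_winner))

# If shared expectations make "tall" focal, every player's best response
# is to join "tall", so "tall" actually wins, confirming the expectation.
print(best_response("tall"))
# The equilibrium is conventional: "short" would be equally stable if focal.
print(best_response("short"))
```

The point of the sketch is that which coalition wins is fixed by expectations rather than by any intrinsic advantage, which is why historically sticky status features could keep tipping such contests.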

If the features that define status are sticky, being somewhat locked into mental models that estimate which coalitions would win contests, then outside consultants with no formal power inside a firm could still tip the balance of status by siding with a CEO. Celebrities who know little about a product could make us more willing to buy it by endorsing it, and students could gain status via past affiliation with professors who have no power in their future work world.

If students gain status by graduating from prestigious schools, and if employers hire students for the status they add to a work coalition, is school productive? Well in this situation school is privately productive, both for the student and the employer. The employer isn’t inferring a hidden ability, but buying a visible feature. So this isn’t signaling exactly. But on the other hand, it isn’t obviously globally productive. The gain an employer gets from adding status to his coalition may well come at the expense of competing coalitions.

By the 1980s, the onset of puberty, if not actual menstruation, had gone into free fall–a change so sudden and pronounced that something more than normal evolution must have been at work. In a landmark 1997 study of 17,000 [US] girls … more than 10% of white girls and an astonishing 37.8% of black girls were showing early breast development by age 8. … Later studies, one in 1998 and another in 2010, included Hispanics and produced similar results. On average, 2 out of every 10 white girls, 3 out of 10 Latinas and 4 out of 10 black girls are showing breast development by age 8. (more)

They consider some possible explanations:

Obesity, a well-established puberty accelerant, is high on the list of suspects. … Data from China and India similarly indicate that race by itself isn’t a factor but general prosperity is. Onset of puberty is on a downward march in those countries too. … But even in Europe, where the standard of living has been high for decades and diets haven’t changed much, something strange is going on. A study of girls conducted in Denmark in 2008 found that the average age of breast development there is 8.86 years, which … is a full year earlier than it was for Danes as recently as 1993. … Some investigators are focusing on environmental contaminants like PBBs and … bisphenol A … A number of studies have found that overweight boys may, if anything, suffer from delayed puberty.

Oddly they don’t even mention divorce and out-of-wedlock birth, factors that some theory suggests are crucial:

Father absence is indicative of the degree of polygyny (simultaneous and serial) in society. Polygyny of both kinds creates a shortage of women in reproductive age, and thus, early puberty will be advantageous. Available comparative data indicate that the degree of polygyny is associated with a decrease in the mean age of menarche across societies, as is the divorce rate a presumptive index of serial polygyny, in strictly monogamous societies. (more)

This theory has some empirical support:

As specified by evolutionary causal theories, younger sisters had earlier menarche than their older sisters in biologically disrupted families (n = 68) but not biologically intact families (n = 93). This effect was superseded, however, by a large moderating effect of paternal dysfunction. Younger sisters from disrupted families who were exposed to serious paternal dysfunction in early childhood attained menarche 11 months earlier than either their older sisters or other younger sisters from disrupted families who were not exposed to such dysfunction. (more)

I heard of this theory a while ago, but until now I hadn’t realized its radical implication: humans may have evolved adaptations to make major body/life features conditional on our social environment! If girl brains can order hormones to induce early puberty after seeing lots of nearby polygyny, how else might our bodies be contingent on what our brains see about our social world? Do young brains see the level of violence, prosperity, or work complexity around, and adjust hormone-induced plans for body size, immune system strength, or brain resources? Could this adjustment explain recent trends in mortality, height, or intelligence? So many possibilities to consider!

Anthropologists often say that it is a mistake to look for “the” ancestral human environment or lifestyle, that what most defines humans is variety and adaptability. I’m going to take that view a lot more seriously from now on.

Added 4p: Why are people so much more willing to use strange chemicals to explain earlier puberty than other trends like increasing IQ, lifespan, and height? Is it because chemicals are bad, and therefore can only explain bad things?

Most animals have dominance, which is who would win a pairwise conflict.

Prestige status seems to not exist in animals with simple social relations.

Unlike other shared rankings, like sexiness or dominance, we seem unaware of what exactly prestige ranks.

So what is prestige? That is, what sort of ranking would be useful for human-like primates to track about each other, but also be something illicit, so that foragers would be reluctant to admit its true function? One obvious candidate stands out to me: one’s value as an ally in coalition politics. That is, how much better off is a typical coalition with this person as an ally, relative to not having them as an ally. This is clearly an important concept, well worth tracking. It only makes sense in groups with complex coalition politics, and foragers have norms against overtly engaging in such politics.

Since an ability to win pairwise contests is useful to coalitions, we expect dominance to add to prestige. But humans and similar primates can also add value to a coalition by having skills that make them useful associates, and by being on good terms with other good-ally-material folks. And both skills and associations also seem to make important contributions to human prestige. Note that this theory predicts that other primates with complex coalition politics, like chimps, will also have a prestige status distinct from dominance status.

If prestige is about one’s value in coalition politics, what does that predict about em prestige? Of the list I gave, items 2, 5, 7, 12, and 16 should be substantially related to prestige.