The human world is full of hierarchy. As I touched on last week, hierarchy is anthropologically necessary for groups over the size of a band (approximately 150, not four or five).

Culturally, some people have more status than others, and this leads to different levels of access to resources, from food and shelter, to knowledge and education, to political decision-making. In less complex societies, such status is usually achieved: people who have status earned it themselves. This contrasts with ascribed status, which is inherited. Ascribed status is usually the hallmark of social classes, as it allows people’s children to inherit their wealth and power.

Hierarchy and Hegemony

Hierarchy does not always work as advertised. Sometimes we are called on to believe things that are not true or are not in our immediate best interest. Sometimes we are called on to “sacrifice for the greater good” when that “good” is simply support of the status quo. Hegemony occurs when people of higher social status and power manipulate the beliefs of those beneath them so that their power is protected. That doesn’t mean that those on the bottom are without recourse.

Cultures have a way of allowing people to blow off some of that steam without resorting to overwhelming violence (riots and civil war) and massive disruption. Everything from employees griping about bosses behind their backs (a venerable tradition everywhere) to the Occupy Wall Street movement count as ways to express resistance to these power imbalances.

Power and Inversion

When the people on the downside of a power imbalance take part in resistance to it in a formal way that is recognized by a culture, this can take a form called “ritual inversion.” In ritual inversion, the rules of society are reversed or ignored. This can be anything from late-night comedians commenting on politicians as the voice of the “common man,” to a day at your job where the bosses serve the employees lunch with their own hands, to a protest march, where those without the social power to make political decisions express themselves and judge their country’s leaders.

Ritual inversions are different from open rebellion. These expressions of social power happen within specific contexts and follow their own rules. Even as they invert certain social power structures, they operate within agreed-upon limits.

On a political march in America, for example, members may say and do things that rebel against cultural hegemony, but at the same time they are likely to avoid open violence, and lawbreaking happens in a ritual context. When protesters sit down in a road, blocking it, knowing that they will be arrested for their actions, it is an example of ritualized rebellion. People of less social status are standing up to those in power, showing their lack of fear and using their own social power to publicly call that power structure into question. At the same time, they are not rioters, using indiscriminate violence to try to tear down the larger system. Protesters are not going to war; they are engaging in ritualized action.

What is Ritual?

Ritual doesn’t only refer to what happens in a church, but to any prescribed set of actions. Rituals allow for the limited rewriting of the rules of culture within a certain context. Sure, ritual can mean those formalized, stiff ways of talking and acting in certain social situations. It can be the activity of high mass at Christmas, but it can also be a high school graduation, or even something as simple as the “ritual” of meeting someone new, introducing yourself, and shaking hands.

Rituals are activities that change the social world. The religious rituals that we are most familiar with are only a subset of these, often changing the social world by incorporating deities and such into “social” relationships.

Rituals are specific contexts where the rules of culture are changed for a limited time…and they can also, through their completion, reinforce cultural rules, either old ones (status quo) or new ones. A high school graduation changes the social status of the graduates, and is also one of the ways that students can enter adulthood. It is no coincidence that the end of high school roughly coincides with the transition of children to adults at age eighteen. High school graduation is a rite of passage in Western culture.

Ritual Inversion

When we take the rules of culture and turn them on their heads, but only in specific circumstances, we are usually working with “ritual inversions.” A protest movement, like Occupy Wall Street, that follows social rules even while breaking the written laws, is a perfect example of this process. So, while we can talk about the necessity of hierarchy among humans living in groups larger than about 150, it can be important to recognize that human cultures have methods for both changing and critiquing power structures in ways that prevent total collapse, widespread violence, and civil war.

It is common wisdom to believe that humans, by and large, lack natural predators in their current environments. That is not true; what humans lack is predators outside our own species. Yes, there is the occasional psychopathic hunter of people who gets dolled up in camouflage, grabs a knife or gun, and makes some trouble. More common, however, is behavior that is predatory but not illegal.

The origin of the idea that we, as humans, lack predators can be traced to a combination of the Darwinian scientific concept of evolutionary adaptation, along with the pseudo-scientific idea of Social Darwinism. But if we trace the idea backwards, we find an underlying reliance on the Great Chain of Being, which dates back to Plato.

Usually when we talk about predation, we’re touching on the nature of relationships between species (or groups). “Natural” predation is a moral justification for violence: predators should not be judged as mean or cruel, they just are what they are.

The Great Chain of Being

The Great Chain of Being arrays all existence on a hierarchical continuum. From rock and stone at the bottom to God at the top, the Chain ascends from pure matter to pure spirit. On this continuum, humans occupy a position at the edge of spirit and animals. By holding this position, they are simply defined as being “above” all other animals.

Humans hold the right to have power over other creatures and the physical world. It is, effectively, a hunting license. But by the same token, people higher on the chain within the “human” category have a similar license regarding those below them. On the chain, there is always hierarchy.

The chain is infinitely hierarchical. Even within each category or subcategory (like “animals” or “dogs” or “members of this pride of lions”) there are ever finer subcategories, whether species or social class. Lions are over gazelles, and kings are over peasants. Leaders are over those they lead, and eaters are over their food.

Medieval scholars believed that within the category of humans, some were higher than others. Thus, a King was higher on the chain than a peasant, and naturally had power over the peasant. In the same way, a husband had power over a wife, and parents over children.

[As a side note, when people say that something, like gay marriage, goes against the “natural order” of things, ninety-nine times out of a hundred, this is exactly the idea that they are referencing!]

Predation

Predation, or preying on something or someone, is conceptualized in Western culture by referring to this same “natural order.” Humans get to eat animals and plants because they are lower on the food-chain (a biological reductionist version of the Great Chain) than the others. In a scientific and biological sense, this whole idea of predator/prey really is much more complex. We can add in (at least!) symbiotes and parasites, and not every relationship is wholly consistent. After all, the Chain says that humans rank above polar bears, but I don’t recommend telling a polar bear that to his face! “Sometimes you eat the bear…”

In a cultural sense, the Great Chain of Being becomes the moral and ethical justification not only for the relationship between humans and their prey, but also for unequal relationships between people. We move from a theoretically rule-free competition (as biology really dictates) to a justification of hierarchy. And part of that hierarchy is that people higher on the ladder get to use those lower on the ladder.

Human on Human!

Is there a difference between competing with other humans for resources and treating them as prey? When we think about it at all, we usually figure that we’re not hunting other humans, we’re just competing with them for resources—food, mates, and all the other biological and psychological necessities. And in most cases that is correct.

However, that distinction does not always hold true. The moment that someone, as a human competing for resources, begins to treat other people as resources rather than as fellow competitors, it begins to look more like predation than competition. Looking back at the idea of the Great Chain, we compete with those on the same level, but hold a different kind of relationship with those above or below us.

Why do so many Americans love “Downton Abbey”?

Above? Below? Americans have a love / hate relationship with the idea of social class. To judge by fiction and literature, they also generally have a very poor idea of its basis. Social class is not some inherent feature passed on by “blood,” but neither is it some inherent evil.

Social stratification is a necessity when it comes to functioning in complex societies. At the same time, it goes against the democratic ideals of America, and at times can be used to cover all sorts of abuses.

Remember the idea of the “band”? The approximately 150 people that we can effectively link to socially for cognitive and evolutionary reasons? Any group of people that is larger than that requires social hierarchy. So, social hierarchy can’t be all bad, can it?

Democracy tries to counterbalance the tendency of humans, as primates, to hierarchy. It asserts that there is equality, knowing that this is an ideal to be striven for. American democracy doesn’t promote equality, but equality of opportunity. It does not cast aside social hierarchy, but makes a virtue of social mobility.

Yet that does not change the nature of these hierarchical relationships. It just means that, at best, we’re not stuck in the one we were born to.

Intraspecies Predation

We, as humans, do make a distinction: between socially acceptable forms of predation and those that are unacceptable. Further, some forms of predation are acceptable only when practiced by those who are higher on the Chain against those who are lower.

It is almost entirely acceptable to behave in a predatory manner against other humans, as long as it is done on a financial level rather than a mortal one. Western culture accepts that unequal relationships exist, that they are good, and that they drive “progress.”

Now, a lot of people may disagree that it should be legal, but that is a different matter from whether it is. Even where Western laws exist to prevent abuses, they are often either relatively toothless or poorly enforced. Human on human predation is “allowed” by society (whether legal or not) as long as it doesn’t threaten to collapse the culture as a whole.

As long as people in the West act in accordance with the Western tradition of the Great Chain (reinterpreted for local culture, assuredly) they will remain relatively free to behave however they like. By contrast, it is socially unacceptable, or “against the law,” to hunt other humans with a gun—at least without the specific permission of those higher up the Great Chain of Being / social hierarchy.

The purpose of government regulation is not to create a utopia. It is to get a bunch of primates to get along…at least most of the time.

There’s a lot of talk about the “role of government” in this or that. But let’s take a step back for a moment and talk about the role of government, any government, when we look at these primates we call humans. Government allows humans to live in large groups.

I’m with the Band, Man

Humans, evolutionarily, do not do well in groups over about 150 people. Most of human evolution took place in groups that anthropologists call “bands.” These bands are small groups of humans, under 150 members, who are generally attached to each other by kinship—by descent and by marriage.

Within these bands, only two types of social distinctions were made: one was gender, and the other was age. If the groups grew too large, they tended to fragment into two or more smaller groups. It was not until the rise of horticulture, when even early farming required a certain amount of investment in the land and in staying together, that people started developing ways of solving social conflict that allowed everyone to stay together.

Getting people to work together in groups larger than band size requires hierarchy—at least rudimentary government. If two or more bands had to live together and get along, this meant a certain amount of social organization. This led to the advent of the “big men” (and “big women”) as social leaders. These “big” people were leaders, but their children did not inherit their power. Effectiveness, and not lineage, marked people as leaders.

As the groups working together became larger, the forms of leadership became more strict and more formalized. In time, humans developed cities, and eventually states. The important thing to remember is that basic human behaviors didn’t change much. Our situations changed, but not our basic drives.

A Monopoly on Violence

One of the most basic functions of government, especially state government, is defined by its monopoly on violence. The state tells us whom we can kill, and under what circumstances. Yet, as people, we are prone to violence when we find our livelihoods, our persons, and our own “bands” threatened.

The state requires a trade-off. It takes away our freedom to seek redress through violence. In return, it offers to solve those same problems in other ways. This is one of the basic purposes of law, and the reason that law not only restricts violence, but also controls financial interaction, from contract enforcement to theft. Modern law, at its root, takes away our right to violence. It also attempts to regulate and solve the problems that we, as primates, would usually address with violence.

For example, say that a typical Joe-schmoe invests in an enterprise, and that enterprise is a rip-off. Now, as a primate whose livelihood is threatened, Joe wants to solve the problem with a club. But the state, holding a monopoly on violence, says, “Hold on! That is not your responsibility!”

By forming a monopoly on violence, “government” has taken away Joe’s ability to effectively (or ineffectively) respond with violence to threats against his livelihood. If we remember that humans are violent primates, then one of the main functions of government is regulating violence.

Governments never say “no violence.” Instead they regulate it. Perhaps violence in self-defense is okay. Or violence against certain groups. Or violence against outsiders. The point is that government decides what is, or is not, legitimate violence.

The Trade-Off

In order for thousands or even millions of humans to live together in close quarters as we so commonly do, we have put in place rules against violence. At the same time, we maintain rules and regulations to prevent the very situations that lead to violence.

In order to live together, we have to give up the right to use violence individually. We delegate this power to the state, and then rely on the state for redress.

That should be the perfect system. Except, as with all things that touch the real world and human nature, the reality is much coarser, and messier. Institutions, including governments, tend to be reactive. If we recognize, however, that the purpose of regulation is generally the need to stem violence, rather than the need to “promote fairness” or any other high-minded goal, then we can see where the slippage comes in.

While law may have its own purposes, government (in its basest form) has no pressing need to address a problem until it threatens to spill over into violence. When we see this clearly, then we can more calmly address what seem to be “failures” in government. Though the system may not work as advertised, it certainly works as designed.

We all know the origin myth of science. We’ve been told that Isaac Newton discovered the theory of gravity after an apple fell on his head. But science, to some extent, has become a belief system, and not just a method for collecting data and verifying claims.

Science, like many fields, has done some spectacular things to benefit people. But what really drives science?

Information Sharing

Science isn’t about believing any particular thing, as much as it’s about having a specific relationship with knowledge. Scientific knowledge is externally verifiable, and worth sharing with others in the effort to grow humanity’s knowledge.

The Invisible College

The Invisible College was a group of scientific researchers and natural philosophers led by Robert Boyle (famous for Boyle’s Law). They employed the scientific method of experimentation and then shared their results with one another.

Members of the Invisible College were, in a sense, the founders of the “scientific community.” Starting in the 17th century, they shared information and results with each other through writings and letters, building a corpus of knowledge.

Just for a moment, think about how different that was from previous models of information sharing. Previously, if a researcher discovered information, that information was likely shared linearly with students, or maybe with a couple of colleagues. The whole relationship of the researcher to knowledge underwent a massive shift with the advent of science.

With the Invisible College, information was suddenly spread widely throughout Europe. The Invisible College was the immediate predecessor of the Royal Society, and is in that sense the direct ancestor of modern science.

When Isaac Newton wrote, “If I have seen further, it is by standing on the shoulders of giants,” he was hailing a new age of information sharing. In a philosophical sense, these natural philosophers were the forebears of today’s internet, where information is shared rapidly, allowing one scientific and technological advance after another.

The Scientific Method

The Invisible College did more than share information. The second major advance was defining what “knowledge” is. The Scientific Method, the basic way that science develops and tests information, was used as a means of validating and verifying information. These natural philosophers—early scientists—not only shared information but also agreed on what kind of information it was. Hypotheses needed to be independently verifiable. No matter who did the tests, or where, they would yield the same results every time.

The four stages of the Scientific Method are:

Perform basic observations

Hypothesize an explanation

Test the hypothesis

Analyze and interpret data

When information sharing and experimental verification are combined, suddenly we have two features that combine very powerfully: testing ideas and letting others test them. It transforms the relationship between people (scientists) from competitors to collaborators. It puts the generation of data, of knowledge, above other values, making the growth of human knowledge regarding the natural world a “higher good.”

Collaboration

We don’t even think twice about the information that the scientific community, indeed all of academia, shares. These ideas of collaboration, working together to improve the world, and the whole notion of progress, all come down to people choosing to share their findings and to grow the sum of human knowledge about the world. That’s a game-changer.

With the commitment of wild-eyed (scientific) fanatics, the early scientists set out to make something larger than themselves: a view of the world that most of us take for granted today. Chemistry, biology, physics, and all the natural sciences owe their existence to these 17th century philosophers, and their foresight in realizing that sharing information with each other would improve human knowledge in a way that withholding it would not.

The Scientific Method and You

But what seems to have been lost sometimes is the core of the Scientific Method. Science isn’t just believing in gravity, or even understanding the basic formulas that allow us to launch probes to Mars. Science, at its root, is about challenging assumptions about “what we know” and collaborating to find new ways to understand the world around us. And that isn’t just a job for people who work in laboratories. In the Information Age, where people find themselves inundated with more information than they can rationally process, the Scientific Method provides a means for understanding and parsing that information.

Scientifically, truth can be verified. Science can be scary because it doesn’t care what “everyone knows,” and it depends instead on what works and can be replicated.

When we find ourselves faced with challenges of understanding, with models of behavior, it is by coming back to the Scientific Method that we can get beyond winning the argument, and instead truly make progress in understanding the world.

Let’s say that we’re faced with a situation where one of two possible business processes needs to be chosen for implementation. Human nature says that we pick the one people agree on, or even the one experts agree on. Maybe we just listen to the sales pitches of the marketers. But here, you can let your inner scientist run wild and actually experiment. Experimentation is the hallmark of science.

Now, I know, you hardly need a lab coat and goggles for that kind of experiment. But we can use the Scientific Method outside of the lab. Both standardized verification of information and the sharing of information can lead us beyond competing with each other and toward effective collaboration.
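The choice between two business processes really can be put to a small experiment. As a sketch of the idea (the process names and timing data below are entirely hypothetical, and the permutation test is just one simple way to check whether an observed difference could be luck), the four stages might look like this in code:

```python
import random
import statistics

def permutation_test(a, b, n_permutations=10_000, seed=0):
    """Estimate how often randomly relabeling the combined samples
    produces a mean difference at least as large as the observed one.
    A small result suggests the observed difference is not chance."""
    rng = random.Random(seed)
    observed = abs(statistics.mean(a) - statistics.mean(b))
    combined = list(a) + list(b)
    count = 0
    for _ in range(n_permutations):
        rng.shuffle(combined)
        left, right = combined[:len(a)], combined[len(a):]
        if abs(statistics.mean(left) - statistics.mean(right)) >= observed:
            count += 1
    return count / n_permutations

# Hypothetical task-completion times (minutes) observed under each process.
process_a = [12.1, 11.8, 12.5, 11.9, 12.3, 12.0, 12.4, 11.7]
process_b = [10.9, 11.2, 10.8, 11.5, 11.0, 11.3, 10.7, 11.1]

p_value = permutation_test(process_a, process_b)
print(f"Process A mean: {statistics.mean(process_a):.2f} min")
print(f"Process B mean: {statistics.mean(process_b):.2f} min")
print(f"Chance of a difference this large by luck: {p_value:.4f}")
```

Observe, hypothesize, test, analyze: the Scientific Method in miniature, applied to an office decision instead of a laboratory.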

The phrase “drinking the Kool-Aid” is popular these days. It refers to the actions of people controlled by groupthink, usually to their own long-term detriment. “Drinking the Kool-Aid” means following a leader blindly, accepting leadership’s determinations, and promoting them as the will of the whole group.

“Drinking the Kool-Aid” is agreeing with the group, rather than subjecting decision-making to outside influences. Just imagine that you are in a meeting, and everyone decides that the best plan is to go jump off a bridge. “Drinking the Kool-Aid” means agreeing that this is an awesome plan, thus aligning yourself socially with the group. You might even volunteer for a leadership role and tweet about your promotion! Everyone seems to come to agreement, often by suppressing minority opinions to create the feeling that there is harmony within the group.

At a deeper level, “drinking the Kool-Aid” means agreeing with micro-social mores, rather than staying true to a larger sense of identity. It is an act of faith in the group and in its leaders to make good decisions. It is especially employed in situations where leadership wants to offer the image that all is well, that everyone is “on the same page,” and that the ideas being proposed are being backed by the whole group, and not simply mandated from the top.

In a competitive business environment, “drinking the Kool-Aid” becomes a survival tactic. Where getting along with the boss(es) can be more critical than making good decisions, drinking the Kool-Aid may be the only choice in the short term.

A Brief History of Jonestown

The phrase “drinking the Kool-Aid” is a reference to the mass suicide in Jonestown (Peoples Temple Agricultural Project) in Guyana on November 18, 1978. Over nine hundred people died, many of them intentionally drinking poison-laced Flavor Aid (not Kool-Aid, as the legend goes).

At the order of the leader of the Peoples Temple, Jim Jones, his followers committed suicide. They believed him, and believed in him, accepting that their actions had a larger meaning. They followed their leader into death, likely certain that they were saving themselves from a more terrible fate. A few survived by fleeing, but most chose death.

If Everyone Else Jumped off a Bridge…

Sure, there is a little bit of “laughing in the face of death” when we speak of drinking the Kool-Aid. At the same time, we might use the phrase in mockery of those who get the promotion we did not, because they seem willing to believe things that we believe are not true in order to fit in. Fantasies of rebellion and self-reliance aside, drinking the Kool-Aid is often necessary.

The old saw, “If everyone else were jumping off a bridge, would you?” ignores the fact that in many cases, with most people, the answer is “yes.” Humans are social animals, and we do not make every decision rationally, weighing the pros and cons of the matter.

We might create such lists to organize our thoughts, but our final answers are driven by more “squishy” aspects of ourselves such as our personal values, a desire for social acceptance, and even a certain mob mentality. The idea that we make choices rationally is a cultural fiction. We make some decisions that way, some of the time. But any time that the “coolness” of an idea takes part in our decision-making process, we have stepped away from pure, objective rationality and into something much more influenced by culture, ideas of success and acceptance, and in fact a whole social realm that is not objective, but is very, very real.

So, Why a Kool-Aid Free Diet?

In the course of life, we will need to do things we do not like and accept things that we do not truly believe in, at least on the surface (and often much more deeply). So why discuss it at all? It is one thing to ape social customs, another to believe them unquestioningly, and yet a third to understand their context, and sometimes meaning(s).

If you have ever traveled in a foreign country and spent time with the people who live there, you likely picked up at least a smattering of local customs and words. You might have been able to say hello, order basic foods, and just generally “get along.” The pure aping of social rules, with little or no understanding, is a far cry from being an expert in a culture. At the same time, it is also how we learn the social rules in our first culture.

Our own culture is learned in a process that anthropologists call “enculturation.” If acculturation is the learning of a second culture (like during a year abroad in France), then enculturation is the way we learn our first culture. We learn that first culture in a way we never really learn anything again: by doing it until it makes sense, and not generally by learning sets of abstract rules. In a sense, that first culture, like a first language, becomes the baseline for all learning throughout our lives.

That does not mean that we will forever accept that culture as “true,” but it becomes our semantic home, our point of departure. The Kool-Aid Free Diet is about understanding that mental home, by taking a glimpse into our own culture, for a brief moment, as if from the outside.

Having grown up in a golden age of post-Western Culture, I have found it both odd and fitting that our current everyday ways of thinking about human behavior depend so much on earlier models. The models I’m talking about are not the ones that necessarily come from the hallowed halls of academia (though some do), but the ones that everyday people use, the ones that children learn at their parents’ knees.

In other words, I am not planning on looking at the world of cutting edge science; we do not, by and large, live in that one. Instead I am speaking of the cultural truths that shape our lives and our decision-making. When we want to get at understanding people, we need to look at both the rational person and that underlying biological stratum.

For all that we believe in, or at least depend on, “science” and its answers, as I look around, I see that we are terribly dependent on pre-scientific ideas of the person in our everyday lives. Our ideas of people as “rational actors” or “inherently good” or “genetically predisposed” to various behaviors, are all things we have inherited from some part of Western Culture. Yet at the same time, we are not rational about these ideas at all.

It’s neither a good thing nor a bad thing. But it is a conceit to think of ourselves as somehow beyond all that.

Living in Tension

Our spoken models of human behavior, the ones that we subscribe to in our everyday speech, and the ones we enshrine in our laws, policies, and business decisions, speak of people as rational products of the 17th-century Enlightenment, 18th-century political liberalism, and 19th-century notions of progress. We tie these all together with 20th-century ideas of modernization and globalization to become “modern” people.

Western culture is not simply a product of these new ideas, but something that ties back thousands of years to Medieval Europe, the Roman Empire, Classical Greece, Ancient Egypt, and the first cities in Babylonia.

But there is another side to being human. We are not just these cultural constructions, however ancient, but also messy biological animals. Our “selves” are not one or the other, but exist in a tension between these two positions—the rational thinker driven by choice and the often territorial, instinct-driven primate.

Rationality Is a Duck

One of the traditions that we have inherited from the Enlightenment is a rejection of the “animal” side of ourselves. We speak as if, by rejecting that half of ourselves, we can become somehow more “pure.” This is not as successful as we would like it to be. Instead, by forcing these parts of ourselves into hiding, we create a more complicated landscape where truths we have closed our eyes to shape our movements and inform our decisions.

As any advertiser knows, putting an attractive woman in the picture with a car makes it more appealing to a certain demographic. This is not rational, but it works.

When we try to analyze human behavior under the assumption that it is rational, and that we make choices for reasons, we trip ourselves up. Sure, I have reasons, and you have reasons, but in the end, aggregate patterns of human behavior indicate that much of what is going on has more to do with non-rational decision-making.

Primates goes beyond commonly accepted beliefs about the nature of people, looking past things that we all accept as true and seeing deeper into our own human nature. This writing is informed by the social sciences, from anthropology and history to psychology and sociology.
