Saturday, September 24, 2011

“Global Korea” poster. Has South Korea become the new poster boy for globalism?

East Asia has been an outlier in the developed world. Like Western Europe and North America, it is integrated into the global economy and enjoys a high standard of living. This is particularly so for the original five ‘tigers’: Japan, South Korea, Taiwan, Hong Kong, and Singapore.

Yet East Asia has bucked the trend toward loss of nationhood. Its governments still see their role as one of perpetuating a specific ethnic identity and cultural tradition. This is in contrast to the view, dominant in the West, that countries should simply be administrative units and should interfere as little as possible in the free flow of capital, goods, and labor.

Recently, this group of outliers has lost a member. South Korea is falling into line with the globalist paradigm and has opened its borders to ever higher rates of immigration:

As of 2007, 1,066,291 registered foreigners were residing in South Korea. More than 400,000 migrant workers are now working in so-called 3-D [Dirty, Dangerous, and Difficult] industries where South Koreans are reluctant to work. 110,362 immigrants entered in 2007 to marry South Korean husbands or wives and the cumulative number of international marriages increased to 364,000 during the 1990-2007 period. In 2005, 13% of all marriages in South Korea were interracial or interethnic marriages and the rate of international marriages was even higher in rural areas where about one-third of all marriages were interracial or interethnic. (Yoon et al., 2008)

When immigration began in the late 1980s, the aim was to ease labor shortages and offset a perilously low birth rate while maintaining the ethnic status quo. Diaspora Koreans would be repatriated from China and the former Soviet Union, and North Korean defectors would be welcomed. This aim was quietly set aside from the mid-1990s onward, as the zone of recruitment broadened to include the Philippines, Vietnam, Indonesia, Bangladesh, Sri Lanka, Pakistan, and even countries as far afield as Nigeria (Kim, 2004). Today, South Korea is entering uncharted waters of demographic change:

[…] South Korean society has entered the first phase of multiethnic and multicultural society and the current process seems irreversible. If the current trend continues, the proportion of foreigners residing in South Korea will increase to 2.8% in 2010, 5% in 2020, and 9.2% in 2050 (Yoon et al., 2008)

The increase may actually be greater. On the one hand, the declining birth rate shows no signs of bottoming out. On the other, once immigrant communities become established, they tend to facilitate more immigration from their home countries, whether legal or illegal. It is worth noting that the above figures exclude illegal immigrants, who are estimated to form half the total intake (Moon, 2010).
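The quoted projection can be checked with simple arithmetic. The sketch below (my calculation, using only the milestone figures quoted above from Yoon et al., 2008) shows that the projection assumes growth in the foreign share slows sharply after 2020, which is one more reason the actual increase may be greater:

```python
# Implied compound annual growth rates of the projected foreign-resident
# share. The milestone years and percentages come from Yoon et al. (2008),
# as quoted above; the growth-rate arithmetic is mine.

def cagr(start, end, years):
    """Compound annual growth rate between two values."""
    return (end / start) ** (1 / years) - 1

milestones = [(2010, 2.8), (2020, 5.0), (2050, 9.2)]  # (year, % foreign)

for (y0, p0), (y1, p1) in zip(milestones, milestones[1:]):
    print(f"{y0}-{y1}: {cagr(p0, p1, y1 - y0):.1%} per year")
# The projection implies roughly 6% annual growth to 2020, but only
# about 2% annual growth thereafter.
```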

Will this demographic change meet growing resistance from the public? Not in the near future. If anything, public opinion has been moving in the other direction. Between two surveys, taken in 2003 and 2007, the shift in opinion was remarkable:

For example, to the statement "It is impossible for people who don't share South Korean traditions and customs fully to become South Korean", 55% of the respondents agreed while 23% disagreed in 2003, but in 2007 30.8% of the respondents agreed while 32.9% disagreed. (Yoon et al., 2008)

There has also been an increase in hostility to public meetings of “people prejudiced against racial and ethnical groups.” In 2004, 29.6% of respondents felt such meetings “should definitely not be allowed.” By 2007, the figure had risen to 46.5% (Yoon et al., 2008).

Multiculturalism is thus becoming a core value, like the filial piety of an earlier age. By adhering to it, South Koreans earn respect not only from their friends and colleagues but also from themselves. This conformity seems to reflect a longstanding desire in Korean culture to comply with social norms.

Will there be resistance from the political elites? Again, not in the near future. Although multiculturalism is often identified with the Left, the new policy of “Global Korea” is actually being pushed by the Right, specifically the Grand National Party (GNP):

For the conservative government, South Korean nationalism and democracy is fundamentally tied to the doctrine of neo-liberalism. Neo-liberalism refers to the flow of economic migrant labour and mobile global capital. This global environment also requires government policies to attract foreign migrants and workers into South Korea’s economy and society.

Multiculturalism is a state-led response to these global changes. The policies of multiculturalism define the present and future economic, security and cultural national strength of South Korea. Critics suggest that, in fact, the GNP regards multiculturalism as an instrumental policy of increasing national state power in this global environment. (Watson, 2010)

South Korea has entered what may be called ‘late’ or ‘mature’ capitalism. The business community has emancipated itself from the nation state and is now willing to enrich itself at the expense of its host society, notably by outsourcing employment to lower-wage countries and by “insourcing” lower-wage labor. To this end, its political spokesmen borrow leftwing discourse to create an artificial Left-Right consensus. As Watson (2010) goes on to argue:

The Right has effectively “stolen” the language of the Left (which has traditionally promoted multiculturalism) and colonised the language of multiculturalism with nationalism and security languages and concerns. For the Left, by ideologically separating multiculturalism from economic globalisation and its economic and political inequalities, multiculturalism becomes a quaint cosmopolitan smokescreen covering economic and political hardships.

One might add that the Right has likewise colonized the language of nationalism with multicultural concerns. The push is on to equate Korean nationality with residence on Korean soil. This concept of citizenship is now being taught in South Korean schools:

Mono-ethnicism was not officially removed from K-12 social studies and moral education textbooks until February, 2007. For example, social studies textbooks for sixth graders used to mention that “Korea consists of one ethnic group. We, Koreans, look similar and use the same language” (Mo, 2009). Citizenship education was grounded in this mono-ethnicism, and the national curriculum focused on enhancing democratic citizenship, including obedience to the law, rights as citizens, morality, and loyalty to the nation (Yang, 2007).

[…] The Korean government has acknowledged dramatic social changes in contemporary Korean society and has attempted to implement this view of contemporary Korean society in national curriculum standards. National curriculum standards have replaced mono-ethnicism with the notions of cultural diversity and multiculturalism. (Moon, 2010)

In this, the South Korean government appears to have also acted under pressure from foreign organizations, notably the United Nations:

Mainstream Korean citizens used to believe that Korea consists of “one-blood, one-language, and one-culture.” The Convention on the Elimination of Racial Discrimination (CERD) in the United Nations (UN) has pointed out that the “pure-blooded” ideology and the notions of ethnic homogeneity have resulted in various forms of discrimination in Korea (Wagner, 2009). CERD has recommended recognizing the multi-ethnic character of contemporary Korean society and promoting understanding, tolerance, and friendship among the different ethnic and national groups in Korea. In education, CERD has recommended that the Korean government include human rights awareness programs in the official curriculum. A revised curriculum should describe a Korean society in which people from multiple ethnic and cultural backgrounds live together harmoniously (Hong, 2008; Wagner, 2009). (Moon, 2010)

Why South Korea?

Why is this shift to globalism stronger in South Korea than in other East Asian countries? The likeliest answer is the country’s special relationship with the United States.

This relationship goes far beyond the current stationing of U.S. troops along the demilitarized zone. When South Korea was freed from Japanese rule in 1945, the Americans were greeted as liberators—in contrast to Japan, where they were merely accepted as occupiers. Even today, there is a legacy of pro-American sentiment that has few parallels elsewhere in East Asia.

As liberators, the Americans were able to create a new political class from scratch. The Japanese had forced into exile much of the native leadership, and these émigrés were now brought home to form a government under U.S. auspices. One of them was the first president of South Korea, Syngman Rhee, a man who had spent most of his adult life in the United States. A similar situation existed in North Korea, where the new government was made up largely of émigrés from China and the Soviet Union.

To some degree, this situation still prevails. Political and economic leaders are often graduates of American universities, and they tend to see the U.S. as a model to be followed. Furthermore, this model cannot be easily criticized because such criticism may be seen as sympathy for the communist North.

References

Choi, J. (2010). Educating Citizens in a Multicultural Society: The Case of South Korea, The Social Studies, 101, 174–178.

Kim, W-B. (2004). Migration of foreign workers into South Korea: from periphery to semi-periphery in the global labor market, Asian Survey, 44, 316-335.

Moon, S. (2010). Multicultural and Global Citizenship in the Transnational Age: The Case of South Korea, International Journal of Multicultural Education, 12, 1-15.

Saturday, September 17, 2011

Broken Hill skull from Zambia, dated to 110,000 BP. It is often identified as a Homo sapiens, largely because it is so recent. We now have evidence that very archaic hominins inhabited central and southern Africa at least 35,000 years ago.

The past year has brought us a new model of human evolution. It’s a modified version of “Out of Africa.” Present-day humans are now traced to a small founder group that began to expand some 80,000 years ago in East Africa and started to spread out of Africa some 50,000 to 40,000 years ago. Meanwhile, these early modern humans intermixed to varying degrees with the archaic hominins they replaced. There is thus 1 to 4% Neanderthal admixture in present-day Europeans and Asians, and 8% Neanderthal and “Denisovan” admixture in Melanesians (Green et al., 2010; Reich et al., 2010).

What about Africans? Are they the only “pure” humans? This is unlikely on theoretical grounds, as Hammer et al. (2011) point out in a newly released paper:

[…] the greatest opportunity for introgression was in Africa, where AMH [anatomically modern humans] and various archaic forms coexisted for much longer than they did outside of Africa. Indeed, the fossil record indicates that a variety of transitional forms with a mosaic of archaic and modern features lived over an extensive geographic area from Morocco to South Africa between 200 and 35 kya.

Africa’s warm climate tends to break down DNA quite rapidly. So we’ll probably never get to reconstruct an archaic African genome from a tooth or a skeletal fragment. But we can look through modern African genomes for signs of introgression from an archaic source. The most telltale signs are unusual polymorphisms in noncoding regions.

Using this approach, Hammer et al. (2011) assign about 2% of the modern African genome to an archaic population that split from ancestral modern humans some 700,000 years ago. This admixture is dated to about 35,000 years ago and may have occurred in Central Africa, since the level of admixture is highest in pygmy groups from that region.

Who exactly were these archaics?

Beginning ≈700 kya, fossil evidence from many parts of Africa indicate that Homo erectus was giving way to populations with larger brains, a change that was accompanied by several structural adjustments to the skull and postcranial skeleton (14). By ≈200 kya, individuals with more modern skeletal morphology begin to appear in the African record (8, 14). Despite these signs of anatomical and behavioral innovation, hominins with a combination of archaic and modern features persist in the fossil record across sub-Saharan Africa and the Middle East until after ≈35 kya […] Interestingly, recent studies attest to the existence of Late Stone Age human remains with archaic features in Nigeria (Iwo Eleru) and the Democratic Republic of Congo (Ishango) (Hammer et al., 2011)

Did Homo erectus linger in parts of Africa until as late as 35,000 years ago? The idea is no longer science fiction. We have the example of the “hobbits”—a Homo erectus population that lasted at least as long in Southeast Asia.

Perhaps we should take a second look at the Broken Hill skull, which was found near Kabwe, Zambia and has been dated to 110,000 BP (Bada et al., 1974). Many anthropologists have raised it to sapiens status, largely on the assumption that non-sapiens were no longer around at that time. Yet it doesn’t look at all like a Homo sapiens.

Admixture with late archaics

Broken Hill man probably occupied one end of a range of archaic groups that inhabited Africa on the eve of the ‘big bang’—the demic expansion of early modern humans that began 80,000 years ago somewhere in East Africa. At the other end were late archaic hominins who looked just like early modern humans but still lacked some of the final changes to their neural wiring.

If sub-Saharan Africans have about 2% admixture from Homo erectus, they probably have much more from late archaic hominins, who were more numerous and behaviorally more similar. How great is this late archaic admixture?

According to Watson et al. (1997), about 13% of the sub-Saharan gene pool comes from a demic expansion c. 111,000 years ago that corresponds to the entry of Skhul-Qafzeh hominins into the Middle East. Although these hominins were almost anatomically modern, their technology was Mousterian and differed little from that of Neanderthals.

Some last-minute neural changes seem to have occurred between this expansion 111,000 years ago and the ‘big bang’ 80,000 years ago. Perhaps these changes triggered the second expansion. As Atkinson et al. (2009) write:

[…] the African exodus was predated by a cultural revolution involving new stone blade technologies, skin working tools, ornaments and imported red ochre […] More advanced symbolic systems in language and religious beliefs could have provided a competitive advantage to a group by promoting coordination and cohesion.

What does it all mean?

It’s interesting that we have varying degrees of archaic admixture. But what does it all mean? Did these different admixtures make us different in different ways?

The claim has been made that species owe much of their genetic variability to introgressive hybridization. However, all the evidence contradicts this conclusion so far as animals is concerned. Not only are F1 hybrids between good species very rare, but where they occur the hybrids (even when not sterile) are demonstrably of inferior viability. The few genes that occasionally introgress into the parental species are not coadapted […] and are selected against. Introgressive hybridization seems to be a negligible source of genetic variation in animals.

In opposition to this view, Greg Cochran and John Hawks have argued that gene introgression enabled early modern humans to adapt more quickly to new environments. Instead of starting from scratch, they just ‘cherry-picked’ genes that had already been developed by the populations they were replacing.

All of this assumes there were cherries worth picking. Did archaic hominins have anything useful to offer? Modern humans and Neanderthals adapted to the same cold environment but they did so in very different ways. The former, for instance, made tailored clothing while the latter were probably as furry as bears.

The stage whisper is that the Neanderthals gave ancestral Europeans special brain genes, notably the latest microcephalin variant (Hawks et al., 2008). We now know otherwise. The reconstructed Neanderthal genome has revealed no brain genes that our ancestors cherry-picked. As for African archaic hominins, it’s even less clear what they had to offer. These were groups that lived under similar climatic and ecological conditions.

The cherry-picking theory seemed like a great idea. How else could one explain the sudden cultural dynamism of early modern humans? This effervescence began only 30,000 to 20,000 years ago—long after the ‘big bang.’ And it was most evident in southwestern France—a place far from East Africa. Surely the simplest explanation is gene introgression from European Neanderthals.

Well, things are never as simple as they seem. There are in fact other explanations:

1. Southwestern France has provided so many early European artifacts in part because it had so many early Europeans. It benefited from an unusually rich environment that could support a large population of semi-sedentary hunter/fisher/gatherers (Mellars, 1985).

2. France is a country with strong grassroots interest in history and prehistory. This is unfortunately not so elsewhere in the world, where the remote past is often viewed with indifference. Why does so much of our knowledge of the prehistoric Middle East come from Israel? Because Israel is chock-full of archeologists who do their work passionately and, in many cases, for free. We view human prehistory through the lens of present-day interests.

3. Actually, there is evidence of technological complexity almost at the epicenter of the ‘big bang.’ Central African sites have yielded fine tools, dated to c. 90,000 BP, that look just like Aurignacian tools from post-Neanderthal Europe (Brooks et al., 1995; Yellen et al., 1995).

Saturday, September 10, 2011

Slave exports to the Americas from different parts of Africa (Dalton & Leung, 2011). Did the slave trade create patterns of behavior that today exist throughout sub-Saharan Africa, such as generalized polygyny?

Why is polygyny so frequent in sub-Saharan Africa? As Goody (1973, p. 177) noted, the differences with Eurasia are striking:

In Europe and Asia, polygyny is largely but not exclusively an heir-producing device; often it is a way of replacing a barren wife. In Africa, plural marriage is far more generalised; according to Dorjahn, about 35 per cent of married men have more than one wife. Hence a large percentage of the population is likely to be part of a polygynous unit at some point in the life-cycle. Most men will be polygynously married at some time or other; women are yet more likely to be so. And most siblings will have sets of half siblings, both because of the plural marriage of their fathers and because of the remarriage of their mothers — since polygyny inevitably involves a large differential in the age of marriage, men will be older when they beget children than women are when they bear them. Hence there will be a higher proportion of widows and fatherless children.

Goody (1973) attributes this generalized polygyny partly to female self-reliance in food production. Year-round farming enables women to provide for their own needs and those of their children. A wife thus costs little in terms of upkeep, and this low maintenance cost encourages men to have as many wives as possible.

This rule nonetheless has interesting exceptions. In the savannah regions of Ghana, women plant grain and help with the harvest, but they leave yam cultivation to men and do not engage in hoeing for cereal agriculture. Yet polygyny rates are somewhat higher there than elsewhere in Ghana, where women contribute more to food production. Polygyny is also less frequent in East Africa than in West Africa, yet women contribute more to food production in East Africa than in West Africa.

Goody (1973, p. 185) concludes that “hoe agriculture, female farming and polygyny are clearly associated in a general way,” yet there must be other explanatory factors. But what?

For Dalton and Leung (2011), one big factor is the slave trade—the mass exportation of African laborers that ended only two centuries ago. West Africa tended to export male slaves while East Africa tended to export female slaves. This pattern reflected differences in market demand: on the one hand, the Americas wanted farm labor; on the other, the Middle East and South Asia wanted domestics or concubines. These differing sex ratios might therefore explain why polygyny is less frequent in East Africa than in West Africa:

The slave trades existed for hundreds of years, and, as a result, Africa experienced abnormal sex ratios for long periods of time. Polygyny could have emerged or been strengthened during the long period of abnormal sex ratios. Figures 1 and 2 suggest the Western Coast should have contained more polygynous marriages, whereas the Eastern Coast should have contained fewer.
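The sex-ratio mechanism can be made concrete with a back-of-the-envelope model (my illustration, not Dalton and Leung’s actual model). If slave exports remove a given share of marriageable men, and every remaining adult marries, the surplus of women translates directly into a fraction of husbands with a second wife:

```python
# Toy model of how a male-skewed slave trade mechanically raises polygyny.
# Assumptions (all mine, for illustration): every woman marries, every
# remaining man marries, and men take either one or two wives. The export
# rates below are hypothetical.

def polygyny_fraction(male_export_rate):
    """Fraction of married men with two wives, given the share of men
    removed from the marriage market (valid for rates below 0.5)."""
    women_per_man = 1 / (1 - male_export_rate)   # skewed adult sex ratio
    return women_per_man - 1                      # avg wives = 1 + fraction

for exported in (0.0, 0.10, 0.20, 0.30):
    print(f"{exported:.0%} of men exported -> "
          f"{polygyny_fraction(exported):.1%} of husbands polygynous")
```

Even a modest skew produces double-digit polygyny rates under these assumptions, which is the direction of Dalton and Leung’s argument for the Western Coast.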

These abnormal sex ratios returned to normal once the slave trade had ended in the early 19th century. Why, then, didn’t polygyny rates follow this return to normal? Dalton and Leung (2011, p. 8) blame cultural conservatism: “Once these cultural traits are established, polygyny can become self-sustaining.”

This hypothesis is interesting and might explain some of the variation in polygyny rates within sub-Saharan Africa. But it fails to explain why polygyny rates are in the double-digit range throughout sub-Saharan Africa. East Africa’s rates are lower but still high by world standards. Goody (1973, p. 181) states that East African cattle societies have a rate of 24.7%, i.e., the percentage of married men with more than one wife. Pebley and Mbugua (1989) similarly write: “The frequency is somewhat lower in East and South Africa, although 15 to 30 percent of husbands are reported to be polygynists in Kenya and Tanzania.”

Dalton and Leung (2011) also cite more recent figures:

[…] the percentage of men in polygynous marriages in Western African countries like Guinea, Togo, and Benin is 35.037, 29.793, and 29.679, whereas in Eastern African countries like Ethiopia, Kenya, and Malawi the percentage is 6.131, 9.206, and 10.101.

These figures come from the latest “Demographic and Health Surveys” and are twenty to forty years more recent than the other figures. As such, they reflect the decline in female farming and the growing urbanization of African societies. In this new context, institutionalized polygyny has given way to looser arrangements with multiple girlfriends and/or prostitutes. And why bring Ethiopia into the comparison? It is not a sub-Saharan country and differs from both West and East Africa in many ways, notably the long-established influence of the Coptic Church and Christian sexual norms.

Dalton and Leung also err in assuming that sub-Saharan Africa had low polygyny rates before the slave trade. Several lines of evidence argue otherwise:

1. The ratio of Y-chromosome to X-chromosome variability is much higher among sub-Saharan Africans, New Guineans, and Aboriginal Australians than among other human populations. This suggests a long-lasting trend of fewer men than women contributing to the gene pool (Dupanloup et al., 2003; see also Torroni et al., 1990; Scozzari et al., 1997).

2. Proto-Bantu, spoken approximately 3,000 years ago, has a specific term for “taking a second wife” (Polome, 1977).

3. A high level of male-male competition for females is suggested by the increased sexual dimorphism of African Americans for weight, chest size, arm girth, and leg girth (Todd & Lindala, 1928; Wolff & Steggerda, 1943). In contrast, a small, gracile, and almost childlike body characterizes Khoisans and Pygmies, the only sub-Saharan populations with low polygyny rates.
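The logic of point 1 is lineage loss: when only a subset of men father each generation’s children, Y-chromosome lineages disappear faster than mitochondrial lineages, leaving the reduced Y variability that Dupanloup et al. report. A toy Wright-Fisher-style simulation (population size, breeding fraction, and time depth all hypothetical) illustrates the effect:

```python
# Toy simulation of uniparental lineage loss under polygyny. Each child
# inherits a founder label from a randomly chosen breeder; only a subset
# of potential parents breed. All parameters are hypothetical.

import random

def surviving_lineages(n_parents, n_breeders, n_children, generations, seed):
    """Count distinct founder lineages remaining after some generations,
    when only n_breeders of the n_parents reproduce each generation."""
    rng = random.Random(seed)
    lineages = list(range(n_parents))            # one label per founder
    for _ in range(generations):
        breeders = rng.sample(lineages, n_breeders)
        lineages = [rng.choice(breeders) for _ in range(n_children)]
    return len(set(lineages))

N, GENS = 500, 50
mt = surviving_lineages(N, N, N, GENS, seed=1)       # all women reproduce
y = surviving_lineages(N, N // 5, N, GENS, seed=1)   # 1 in 5 men reproduce
print(f"mtDNA lineages left: {mt}, Y lineages left: {y}")
```

Averaged over runs, the polygynous (Y) scenario retains far fewer lineages than the monogamous (mtDNA) one, which is the signature the genetic studies detect.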

After arguing that the slave trade caused generalized polygyny, Dalton and Leung see therein a leading cause of Africa’s lag in economic and social development.

There is undoubtedly some kind of relationship between generalized polygyny and Africa’s stubbornly high fertility and economic poverty, but it’s not a simple one of cause and effect. Banning polygyny will not cause immediate changes to reproductive and economic behavior. Dalton and Leung themselves argue that cultural conservatism alone has maintained high polygyny rates in Africa for the past two centuries. Wouldn’t the same be true for other behaviors?

Remember also that the time depth of generalized polygyny is not the four or five centuries that Dalton and Leung claim. In sub-Saharan Africa, high polygyny rates are associated with ‘female farming’ societies, and such societies began to spread outward from a point of origin near the Niger’s headwaters some 6,000 to 7,000 years ago (Murdock, 1959, pp. 44, 64-68).

Behavioral predispositions have significant heritability, especially in relation to sexual behavior (Comings et al., 2002; Mendle et al., 2006; Belsky et al., 2007). If generalized polygyny has existed in sub-Saharan Africa for six to seven thousand years, wouldn’t it have favored certain predispositions and not others? And wouldn’t those predispositions survive the banning of polygyny?

The question doesn’t seem to have crossed the authors’ minds. They seem to believe, a bit naïvely, that there is only cultural conservatism to worry about. No less naïve are the authors they cite:

Nunn and Wantchekon (2010) examines a particular channel through which the slave trades impact current African economic performances, namely the levels of trust across individuals within Africa. Trust supports economic exchange in well-functioning markets and would have plausibly been affected within groups living in the capture and export economies participating in the slave trades. Nunn and Wantchekon (2010) shows those individuals whose ancestral groups experienced higher slave exports exhibit lower levels of trust even to this day. Our paper contributes to these findings by suggesting an additional channel through which the slave trades have had a long-term impact on current African society. (Dalton & Leung, 2011, p. 2)

Low trust is typical of all simple, clan-based societies. Papua New Guinea was not affected by the slave trade, yet it has very low levels of trust. The slave trade may indeed have made people less trusting in some parts of sub-Saharan Africa than in others. But this does not explain why levels of trust are so low in sub-Saharan Africa as a whole.

And questions can be raised about the study by Nunn and Wantchekon (2010). Those authors found that willingness to trust members of other ethnic groups correlated with an ethnic group’s historical importance as a source of slaves. Most slaves, however, were taken during inter-tribal wars, and such wars generally occurred in regions already prone to interethnic conflict.

In short, the correlation is valid, but it doesn’t prove the causal relationship that Nunn and Wantchekon infer. In fact, the line of causality probably runs in the opposite direction.

Comings, D.E., D. Muhleman, J.P. Johnson, J.P. MacMurray. (2002). Parent-daughter transmission of the androgen receptor gene as an explanation of the effect of father absence on age of menarche. Child Dev., 73, 1046-1051.

Saturday, September 3, 2011

Is groupthink genetically determined? Twin studies suggest that people are prewired to identify and comply with social rules.

Where to from here? Will evolutionary psychology ossify and disappear? Or will it redefine itself and move on?

In a sense it doesn’t matter. A name is just a name, and the field of evolution and human behavior has had other names. The main issue is whether the current name is a help or a hindrance. Will it allow change from within?

Bolhuis et al. (2011) think so. In their call for a new evolutionary psychology, they have made several recommendations. One is to accept the reality of gene-culture co-evolution. In short, we should pick up where research petered out some two decades ago.

Amongst the overrepresented categories in genome-wide scans of recent selection are numerous alleles expressed in the human nervous system and brain. This raises the possibility that complex cognition on which culture is reliant (social intelligence, language, and challenges associated with constructing and adapting to new environmental conditions) have driven human brain evolution. Mathematical models exploring how genetic and cultural processes interact provide strong support for the role of gene-culture coevolution in human evolution.

Evolutionary psychologists should reconsider their assumption of a universal human nature. “For example, sex differences in mate preferences constitute a large proportion of EP research and are generally assumed to exhibit universal patterns.” Yet sex roles assume different forms in different human populations.

Another recommendation is to bridge the gap between postulated “psychological mechanisms” and actual neurons. We now have tools, notably MRI, that can locate where a specific mental activity occurs in the brain. Again, such research should take variation within and between human populations into account and not be confined to the usual participant pool of North American university students.

Finally, evolutionary psychologists should stop assuming that the human mind consists mainly of domain-specific programs. Much of our thinking is, in fact, domain-general.

Uh, what is ‘domain-general’? Think of a computer program that has plenty of sections or variables left blank. The blanks can be filled in with information, thus enabling the same kind of program to do a wide range of tasks. We call this in-filling process ‘learning.’

Learning thus takes place via programs that have already been partly hardwired. This is why we can learn some things better than others. There are also constraints on how fast we can learn, how much we can learn, and on how easily we can integrate learned information. Learning is not the opposite of genetic determinism. The two concepts are complementary.

By minimizing the role of learning, evolutionary psychologists not only lose the high ground of credibility but also give a free hand to those who say that humans can learn to think anything. A good example is the debate over social rules:

EP has engaged in a longstanding debate with advocates of cultural evolution over whether human social learning is governed by evolved content biases (e.g., choose the sugar-rich food) or by domain-general context biases (e.g., conform to the local norm). There is sufficient empirical evidence for the deployment of context biases, such as conformity or prestige bias, to render the casual dismissal of transmitted culture counterproductive. (Bolhuis et al., 2011)

Groupthink is a reality, and its persistence in modern societies should make it ideal for EP research. One puzzle of twin studies is the relatively high heritability of religious fundamentalism. Among twins reared apart, 40-46% of the variance seems to be genetic in origin (DiLalla et al., 1996). Perhaps there has been natural selection for humans who can more easily identify and comply with social rules, thus sparing themselves the pain of learning them the hard way.
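The 40-46% figure comes from a standard twin-design calculation. A minimal sketch, with invented example correlations (not DiLalla et al.’s actual numbers): for identical twins reared apart, shared environment is assumed near zero, so the twin correlation itself estimates heritability; for twins reared together, Falconer’s formula contrasts identical (MZ) and fraternal (DZ) correlations:

```python
# How twin designs yield heritability estimates like the 40-46% cited
# above. The correlations below are hypothetical, for illustration only.

def falconer_h2(r_mz, r_dz):
    """Falconer's estimate: h^2 = 2 * (r_MZ - r_DZ)."""
    return 2 * (r_mz - r_dz)

# Reared-apart design: shared environment ~ 0, so r_MZA ~ h^2.
r_mza = 0.43                      # hypothetical reared-apart correlation
print(f"reared-apart estimate: h2 = {r_mza:.2f}")

# Reared-together design, with hypothetical MZ and DZ correlations:
print(f"Falconer estimate:     h2 = {falconer_h2(0.62, 0.40):.2f}")
```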

This point is worth investigating because willingness to comply with rules varies from one individual to another, and from one population to the next. Some people have an unusually high level of rule compliance. Why? Is it learned or innate? Or a bit of both?

Some evolutionary psychologists have actually been moving in this direction. Denise Cummins (1998, p. 37) describes mental evolution as “a strategic arms race in which the weaponry is ever-increasing mental capacity to represent and manipulate internal representations of the minds of others.” In addition to ‘indicative reasoning’ (what is true or false), humans have a capacity for ‘deontic reasoning’ (what is permitted, obligated, or forbidden). For deontic rules, people look for examples of violations. For indicative rules, people look for examples of proof.

In short, indicative rules are subject to change, as people learn more about their environment. Deontic rules are not so easily changed. The latter generally change with a new class of higher-status individuals, who not only are the preferred source of deontic rules but are also seen as being above the rules. Thus, people more easily remember cheaters than non-cheaters, but this memory is weaker when the cheaters are high-status individuals (Mealey, Daood, & Krage, 1996).

All of this raises a problem for the Pleistocene EEA. Hunter-gatherer societies have little if any social stratification. The same is largely true for simple agricultural societies. The ‘big man’ is not a force for social stability and rule making. His dominance is transient, lasting as long as his strength, charisma, and ability to intimidate.

Societies became stratified only during the last 10,000 years. This time also saw the beginnings of lawmaking, codified morality, and organized religion. Of course, there is no reason why these phenomena could not have influenced human nature via gene-culture co-evolution. The last 10,000 years have seen more genetic evolution than the previous 100,000 … or even the previous million.

But to say so is anathema to those who still believe that the human mind stopped evolving over a million years ago.
