[…] compared to other world populations, Africans south of the Sahara Desert are distinct dentally — especially in their expression of nine high- and two low-frequency morphological features. This suite of traits was termed the “Sub-Saharan African Dental Complex” (SSADC); it includes the world’s highest occurrences of Bushman canine, two-rooted UP1, UM1 Carabelli’s trait, three-rooted UM2, LM2 Y-groove, LM1 cusp 7, LP1 Tom’s root, two-rooted LM2, and UM3 presence, and among the lowest occurrences of UI1 double shoveling and UM1 enamel extension. (Irish, 2011)

The two low-frequency traits appear to be “derived.” They seem to have developed in sub-Saharan Africa after modern humans began to spread to other continents. The other traits, however, are ancestral:

[…] the same nine high-frequency traits are also ubiquitous in the dentitions of extinct hominids and many extinct and extant non-human primates

[…] The presence and, indeed, prevalence (see next section), of high-frequency Sub-Saharan dental traits in fossil and recent hominoids—some of which are probably direct ancestors of modern humans, suggests they have been around for a long time. (Irish, 1998, pp. 87-88)

In addition to these traits, Irish (1998) mentions a low-frequency trait that seems likewise ancestral and specific to sub-Saharan Africans:

A final ancestral feature found with some regularity in Sub-Saharan Africans, relative to other modern groups, is polydontia. Numerous cases of extra incisors, third premolars, and fourth molars have been noted […] In one study (Watters, 1962) the incidence reached 2.5-3% in several hundred west Africans; many of the extra teeth were fully formed and erupted. “Typical” mammals exhibit three incisors and four premolars (Jordan et al., 1992). Polydontia is also found in living non-human primates […] (Irish, 1998, p. 88)

Why are these ancestral traits much more common in sub-Saharan Africans than in other humans? There are several possible reasons. One is that non-Africans began as a small founder group and thus lost much of the dental variability that still characterizes Africans. Another reason might be that natural selection favored new forms of dentition outside Africa, perhaps as a response to new food sources or new ways of preparing food.

But there’s a third possible reason: archaic admixture. Just as modern humans mixed to some extent with Neanderthals in Europe and Denisovans in Asia, perhaps there was also mixture with archaic hominins in Africa, and perhaps this admixture introduced archaic dental features into present-day Africans.

But how could present-day Africans have archaic admixture? If modern humans originated in Africa, wouldn’t they have encountered archaic humans only in Europe and Asia?

Well, at first, modern humans did not occupy all of Africa. They were initially a small population somewhere in East Africa. Then, around 80,000 years ago, this population began to expand northward and eventually into Eurasia (Watson et al., 1997). Meanwhile, the same expansion was taking modern humans westward and southward into other parts of Africa.

Whom exactly did these modern humans encounter during their expansion within Africa? Initially, they probably met hominins who looked the same but still lacked some of the mental rewiring that gave modern humans a competitive edge. These “almost-moderns” account for about 13% of the current sub-Saharan gene pool and may have been related to the Skhul-Qafzeh hominins who occupied the Middle East 120,000 to 80,000 years ago (Watson et al., 1997).

As modern humans spread further west and south within Africa, they encountered much more archaic hominins, and perhaps even lingering Homo erectus groups. About 2% of the modern African genome comes from an archaic population that split from ancestral modern humans some 700,000 years ago. This admixture is dated to about 35,000 years ago and may have occurred in Central Africa, since the level of admixture is highest in pygmy groups from that region (Hammer et al., 2011).

A more tangible sign of admixture is visible in a skull retrieved from the Iwo Eleru rock shelter, in southwestern Nigeria, and dated to approximately 16,300 BP:

Our analysis indicates that Iwo Eleru possesses neurocranial morphology intermediate in shape between archaic hominins (Neanderthals and Homo erectus) and modern humans. This morphology is outside the range of modern human variability in the PCA and CVA analyses, and is most similar to that shown by LPA individuals from Africa and the early anatomically modern specimens from Skhul and Qafzeh.

[… ] the transition to anatomical modernity in Africa was more complicated than previously thought, with late survival of “archaic” features and possibly deep population substructure in Africa during this time. (Harvati et al., 2011)

Then there is the Broken Hill skull, found near Kabwe, Zambia and dated to 110,000 BP (Bada et al., 1974). It looks for all the world like a Homo erectus. Textbooks generally try to raise it to Homo sapiens status or argue for an earlier dating. Recently, a late dating has been confirmed by Stringer (2011).

Interestingly, when Irish (2011) compared dentitions from west, central, east, and south Africa, ranging in age from the late Pleistocene to the mid-1950s, the sample with the fewest ancestral SSADC traits was the early Holocene Kenyans and Tanzanians. In other words, the SSADC seems to have been least present in the “homeland” of modern humans (East Africa) and more present farther west and south.

Given the high level of archaic admixture in sub-Saharan Africans, we may have to revise downwards the estimate of 1 to 4% Neanderthal admixture in Eurasians. Yes, Eurasians are closer than sub-Saharan Africans to the Neanderthal genome. But is this discrepancy solely due to Neanderthal admixture in Eurasians? Could it also be due to Sub-Saharan Africans becoming further removed from the Neanderthal genome through admixture with other archaic groups?

The past may be a stranger country than previously thought. When farming villages began to form in the Middle East, there may still have been archaic hominins roaming over parts of western and southern Africa.

Saturday, January 21, 2012

Relative frequencies of surnames of the rich and the poor (common criminals), 1236-1858. In England, there seems to have been much downward mobility among descendants of the medieval rich and some upward mobility among descendants of the medieval poor (Clark, 2010). H/T to Jason Malloy

Consider free meritocracy in a two-class system, meaning that for each generation anyone in the lower class who has greater merit than someone in the upper class immediately swaps class with them. Mating then occurs at random within class.

[…] Class mobility after the first generation is 30% while after four generations it has declined to 10% and continues to decline after that. The average merit in the two classes is about -1SD in the lower and +1SD in the upper on the original scale, corresponding to IQs of 85 and 115.

[…] after four generations, about 70% of the variance is between classes.
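The dynamics of this model can be sketched in a short simulation. This is a minimal illustration, not Clark’s actual code; the heritability-like transmission parameter and noise structure are assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 100_000   # individuals per class
H = 0.7       # assumed strength of parent-offspring transmission of merit

# Generation 0: merit is standard normal; the top half forms the upper class
merit = np.sort(rng.normal(0, 1, 2 * N))
lower, upper = merit[:N], merit[N:]

def offspring(cls):
    # Random mating within class: midparent of two shuffled copies,
    # plus environmental noise keeping total variance near 1
    midparent = (rng.permutation(cls) + rng.permutation(cls)) / 2
    return H * midparent + rng.normal(0, np.sqrt(1 - H**2 / 2), cls.size)

mobilities = []
for gen in range(1, 5):
    lo, up = offspring(lower), offspring(upper)
    pooled = np.concatenate([lo, up])
    cutoff = np.median(pooled)   # "free meritocracy": re-sort everyone by merit
    mobilities.append((np.mean(lo > cutoff) + np.mean(up < cutoff)) / 2)
    lower, upper = pooled[pooled <= cutoff], pooled[pooled > cutoff]

print([round(m, 2) for m in mobilities],
      round(lower.mean(), 2), round(upper.mean(), 2))
```

Run with these assumed parameters, mobility is highest in the first generation and declines thereafter, while the class means pull apart toward roughly -1SD and +1SD, matching the pattern described in the quote.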

This model, however, contradicts what Clark (2009a) found in his historical study of surnames and social class in England. He first collected rare English surnames that were exclusive in the year 1600 to the rich (as represented by wealthy testators) or to the poor (as represented by common criminals). He then went forward in time to the year 1851 and determined the occupational profile of the same rare surnames:

How do the descendants of these two groups look in terms of socioeconomic status by 1851? Surprisingly there seems to be almost complete regression to the mean.

Between 1600 and 1851, there was apparently great downward mobility among descendants of the rich and modest upward mobility among descendants of the poor. Clark (2010) subsequently found that this regression held true for the entire period stretching from 1236 to 1858 (see above chart).

Why does this outcome diverge so much from the theoretical outcome described above? One reason is the assumption that marriage takes place only within each social class. Yet assortative mating is only a tendency, and exceptions are numerous. In post-medieval England, a widower would likely take a second wife of lower social status because of his disadvantaged position on the marriage market, given the care required for his existing children. It was also accepted for an upper-class man to marry a woman of lower rank, on the condition that she be beautiful. This phenomenon was noted by Darwin:

Many persons are convinced, as it appears to me with justice, that our aristocracy, including under this term all wealthy families in which primogeniture has long prevailed, from having chosen during many generations from all classes the more beautiful women as their wives, have become handsomer, according to the European standard, than the middle classes; yet the middle classes are placed under equally favourable conditions of life for the perfect development of the body. (Darwin, 1936 [1888], p. 892)

Another reason is downward mobility. In England, the upper and middle classes were reproductively more successful than the lower class until the late 19th century. But higher-class families could offer their children only a limited number of occupational slots. A certain proportion thus had to emigrate or move down the social ladder. The lower class was thereby continually replenished by the demographic overflow of the upper and middle classes.

Finally, the word “merit” has different meanings in different contexts. It is never just IQ. In post-medieval England, merit meant a mix of “middle-class” values: thrift, self-control, future time orientation, and rejection of violence as a way to settle disputes (Clark, 2007; Clark, 2009a; Clark, 2009b). In other societies, merit may involve a different mix of predispositions and personality traits, such as ruthlessness and willingness to use violence.

Yet caste societies do exist. How do they come about? The main precondition seems to be not only the existence of social classes, but also a monopoly on certain occupations by each class. In such circumstances, downwardly mobile individuals cannot compete with the existing lower class, since the latter’s livelihood remains off-limits.

Take Japan. That country had a social evolution similar to England’s, i.e., gradual demographic expansion of the middle class and, correspondingly, gradual demographic replacement of the lower classes by downwardly mobile individuals. But the lowest class, the Burakumin, survived because it had a monopoly on occupations that involved taking life or handling dead bodies (e.g., leather working, butchery, and undertaking). The Burakumin thus survive as a remnant of the majority Japanese population that existed several centuries ago.

Stigmatized castes, like the Burakumin, may provide a window into a population’s evolutionary past. Such groups cannot participate in the gene-culture co-evolution of the majority population. Nor do they have much leeway for their own gene-culture co-evolution, since they are rigorously confined to a few occupations.

References

Clark, G. (2007). A Farewell to Alms. A Brief Economic History of the World. Princeton and Oxford: Princeton University Press.

Saturday, January 14, 2012

Fulani woman, Nigeria. African Americans aren’t just sub-Saharan Africans with European admixture. There has also been admixture from Amerindian peoples and from groups partly of North African origin, like the Fulani.

Have African Americans evolved since they first came to North America? The question may seem strange. Doesn’t evolution happen over millions of years? The first slaves disembarked in the future United States back in 1619 and the last ones arrived (illegally) in the 1850s. That’s about three centuries. How could any population evolve over so short a time?

Yet natural selection can cause significant change in as little as eight generations, at least in nonhuman species. In humans, it has altered at least 7% of the genome over the last 40 thousand years, and most of that change has happened over the last 10 thousand years (Hawks et al., 2007).

Jin et al. (2011) argue that African Americans have changed genetically over the last three centuries, not only through European admixture but also because of natural selection. First, many black slaves died during their passage from Africa to the New World. The survivors, already a select group, faced a new environment in colonial America. They had to adapt to new challenges in their struggle for existence, such as new pathogens, new social structures, and new means of subsistence.

To identify these effects of natural selection, Jin et al. (2011) used two methods on a large sample of African Americans (5,210 individuals). They first looked at various genomic regions to see whether the degree of European admixture was higher or lower than the admixture for the genome as a whole, estimated at 21.61%. Such deviations would be “signals” of natural selection favoring certain genetic variants at the expense of others.

The second method involved comparing the African component of the African American genome with the genomes of present-day sub-Saharan African populations—in proportion to their respective contributions to the African American gene pool.
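The logic of the first method can be illustrated with a back-of-the-envelope significance test. This is a simplification, not the authors’ actual procedure: it treats each chromosome’s ancestry at a locus as an independent draw, which overstates precision because ancestry is correlated along chromosomes and among relatives:

```python
import math

GENOME_WIDE = 0.2161    # genome-wide European admixture reported by Jin et al.
N_IND = 5210            # individuals sampled; each carries two chromosomes

def admixture_z(local_fraction, p=GENOME_WIDE, n_chrom=2 * N_IND):
    """Z-score for a locus whose local European-ancestry fraction deviates
    from the genome-wide average, under a naive binomial model."""
    se = math.sqrt(p * (1 - p) / n_chrom)
    return (local_fraction - p) / se

# A locus where European ancestry dips by 2.6 points, to 19.0%:
z = admixture_z(0.19)
print(round(z, 1))
```

Under this naive model even a 2.6-point dip looks highly significant, which is precisely why other sources of error (correlated ancestry, admixture varying by region and class) matter when interpreting such “signals.”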

And the results? Some genomic regions did deviate from the level of 21.61% European admixture. Many of them were associated with diseases, like prostate cancer and hypertension, that are more common among African Americans than among Euro Americans. Alleles that protect against malaria were also less frequent than would be predicted by European admixture. This is evidence that natural selection has been eliminating alleles that are less necessary in North America.

These results seem expectable. Too expectable, in fact. Yes, prostate cancer occurs more often in African Americans than in Africans, but this difference is due to underreporting and shorter life expectancy in Africa. Keep in mind that prostate cancer tends to be diagnosed late in life (Ogunbiyi & Shittu, 1999; Osegbe, 1997).

There are other grounds for skepticism. The deviations from overall European admixture were small, less than 2.6%. Admittedly, the sample size was large, so sampling error couldn’t be responsible. But there may have been other sources of error.

One of them concerns the estimated European admixture of 21.61%. This figure is consistent with previous estimates and is probably the best one available. But the degree of admixture varies among African Americans, especially by social class and by geographic region.

The authors also oversimplify their model when they describe African Americans as sub-Saharan Africans with European admixture. There has also been Amerindian admixture, as noted by Myrdal (1944, p. 124):

Indians were held as slaves in some of the American colonies while Negro slaves were being imported. Equality of social status between Indians and Negroes favored intermingling. The whites had little interest in hindering it. As the number of Negro slaves increased, the Indian slaves gradually disappeared into the larger Negro population. Whole tribes of Indians became untraceably lost in the Negro population of the South. […] Twenty-seven and three-tenths per cent of the Negro sample of 1,551 individuals examined by Herskovits claimed some Indian ancestry.

The authors state that they excluded African American individuals who had more than 2% Native American/East Asian ancestry, but such exclusion is at best approximate. And what about the many individuals with 1-2% Amerindian ancestry?

Another wild card is North African admixture. The Atlantic slave trade involved some populations that were partly of Arab/Berber descent, such as the Fulani (also known as Fulbe or Peul; the hotel maid who gave DSK a BJ was a Fulani). To some degree, North African admixture would resemble European admixture; to some degree, it would have its own characteristics.

Admittedly, these other admixtures were relatively small. But one doesn’t need a big factor to explain a deviation of two percent or so.

Ideas for future research

I’d like to see more research on recent evolution among African Americans, in particular on the possibility of gene-culture co-evolution. One cultural determinant of genetic change might be Christianity, specifically the way it has structured family life, turned men into active fathers, and created a strict rules-based culture.

Briefly put, I’d like answers to the following questions:

1. Was natural increase higher among church-going African Americans than among non-churchgoers? (because of a higher rate of family formation, stronger male involvement in the family, lower rate of infant mortality, etc.)

2. Did churchgoers have a different psychological profile than non-churchgoers? In other words, were non-churchgoers disproportionately made up of individuals who had trouble complying with a rules-based culture?

3. Did the higher natural increase of churchgoers, together with their psychological profile, lead to an evolutionary process similar to what Clark (2007) has described for England? (i.e., gradual demographic replacement of impulsive, present-oriented individuals with disciplined, future-oriented individuals).

4. Did this process abort in the 1960s with the decline in church life among African Americans?

References

Clark, G. (2007). A Farewell to Alms. A Brief Economic History of the World. Princeton and Oxford: Princeton University Press.

Saturday, January 7, 2012

What did the Neanderthals look like? Hard to tell, since we know so little about their soft tissues. But they probably had fur.

Were the Neanderthals as furry as bears? The question was raised by one of my readers, and I’ll try to reply at length in this post.

There are three lines of argument:

Lack of tailored clothing

Neanderthal sites show no evidence of tools for making tailored clothing. There are only hide scrapers, which might have been used to make blankets or ponchos. This is in contrast to Upper Paleolithic (modern human) sites, which have an abundance of eyed bone needles and bone awls (Hoffecker, 2002, pp. 107, 109, 135, 252). Moreover, microwear analysis of Neanderthal hide scrapers shows that they were used only for the initial phases of hide preparation, and not for the more advanced phases of clothing production (Hoffecker, 2002, p. 107).

The counter-argument is that some human groups, notably the Yaghan of Tierra del Fuego, have lived in sub-arctic environments with little clothing.

Human body louse

The human body louse (which lives in clothing) seems to have diverged from the human head louse with the advent of modern humans. This dating is based on a comparison of the two louse genomes:

The results indicate greater diversity in African than non-African lice, suggesting an African origin of human lice. A molecular clock analysis indicates that body lice originated not more than about 72,000 ± 42,000 years ago; the mtDNA sequences also indicate a demographic expansion of body lice that correlates with the spread of modern humans out of Africa. These results suggest that clothing was a surprisingly recent innovation in human evolution. (Kittler et al., 2003)

A more recent analysis places the origin of body lice at 83,000 to 170,000 years ago (Toups et al., 2011). The authors conclude: “Our estimate for the origin of clothing use suggests that one of the technologies necessary for successful dispersal into colder climates was already available to AMH prior to their emergence out of Africa.”
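The molecular-clock logic behind such dates can be sketched as follows. The numbers here are illustrative assumptions, not the actual figures used by Kittler et al. or Toups et al.:

```python
def divergence_time(pairwise_diff, sub_rate_per_year):
    """Time since two lineages split, given per-site pairwise divergence
    and a per-site, per-year substitution rate. Differences accumulate
    along both branches, hence the factor of 2."""
    return pairwise_diff / (2 * sub_rate_per_year)

# E.g. 1% mtDNA divergence at an assumed rate of 5e-8 substitutions
# per site per year gives a split roughly 100,000 years ago
t = divergence_time(0.01, 5e-8)
print(round(t))
```

The wide confidence intervals in the quoted studies reflect uncertainty in both inputs: the substitution rate must be calibrated externally, and the observed divergence is estimated from limited sequence data.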

There is nonetheless some controversy over the phylogenetic status of these two kinds of louse. Light et al. (2008) argue that there is far too much genetic overlap between them and that they cannot be considered “genetically distinct evolutionary units.” This point has been reiterated by Li et al. (2010):

While being phenotypically and physiologically different, human head and body lice are indistinguishable based on mitochondrial and nuclear genes. As protein-coding genes are too conserved to provide significant genetic diversity, we performed strain-typing of a large collection of human head and body lice using variable intergenic spacer sequences. Ninety-seven human lice were classified into ninety-six genotypes based on four intergenic spacer sequences. Genotypic and phylogenetic analyses using these sequences suggested that human head and body lice are still indistinguishable. We hypothesized that the phenotypic and physiological differences between human head and body lice are controlled by very limited mutations.

This is a recurring problem when one examines two species or subspecies that have recently diverged from each other. The only genes that have diverged are those whose variants clearly differ in adaptive value between the two environmental settings—in this case, head hair and clothing. It is only with time, and reproductive isolation, that differences will develop at other gene loci.

In the meantime, lice—like humans—exhibit the apparent contradiction of distinct phenotypic differences co-existing with very fuzzy genetic differences.

Finger ridges

Chimpanzees have ridges on their finger bones that stem from the way they clutch their mother’s fur as infants. Modern humans don’t have these ridges, but Neanderthals do.

We don’t yet know for sure, but it seems likely that, as part of their adaptation to cold, Neanderthals were furry.


Welcome to my blog! For the most part, this page will be an extension of my website, with comments relating to my research. But it will also branch out into more general discussions of human evolution.