Wednesday, October 31, 2007

Rummaging around my late mother’s home, I came across a “study in the problem of race” called The Clash of Colour. It dated back to the 1920s and reflected a kind of soft anti-racism that now seems quaint … and impossible.

The author, Basil Mathews, discusses the injustice of a world where whites control nine-tenths of the world’s habitable surface. He denounces the hypocrisy of demanding self-determination for Eastern Europeans but not for Africans and Asians. Turning his attention to African Americans, he speaks even more caustically about Jim Crow laws and lack of equal opportunity. Finally, near the end of the book, he quotes a resolution that the World’s Student Christian Federation adopted in 1922:

We, representing Christian students from all parts of the world, believe in the fundamental equality of all the races and nations of mankind and consider it as part of our Christian vocation to express this reality in all our relationships. (Mathews, 1925, p. 149)

Yet, strangely enough, the author’s antiracism co-exists with a belief in racial differences:

When we talk of the unity of man, we do not mean the uniformity of man. Race is real. It seems certain that—as Dr McDougall says—

“Racial qualities both physical and mental are extremely stable and persistent, and if the experience of each generation is in any manner or degree transmitted as modifications of the racial qualities, it is only in very slight degree, so as to produce any moulding effect only very slowly and in the course of generations.

“I would submit the principle that, although differences of racial mental qualities are relatively small, so small as to be indistinguishable with certainty in individuals, they are yet of great importance for the life of nations, because they exert throughout many generations a constant bias upon the development of their culture and their institutions.” (Mathews, 1925, p. 151)

These sentiments sound disturbingly similar to those of Dr. James Watson, co-discoverer of the structure of DNA, who recently wrote, “there is no firm reason to anticipate that the intellectual capacities of peoples geographically separated in their evolution should prove to have evolved identically. Our wanting to reserve equal powers of reason as some universal heritage of humanity will not be enough to make it so.” For these words, Dr. Watson was roundly condemned and forced out of his position at the Cold Spring Harbor Laboratory.

James Watson published his book in 2007. Basil Mathews published his in 1925. Between these two dates, antiracism changed. What passed for progressive opinion in the 1920s is now inadmissible.

What happened? There was, to be sure, the world’s revulsion against Nazism. But that isn’t the whole story. Even in comparison to the 1970s, today’s antiracism has become much more radical.

Recently, President Bush pushed hard for an immigration reform that would have reduced white Americans to minority status by the year 2050 while pushing the population total to half a billion. Such a proposal would have been unthinkable thirty years ago. It certainly would not have come from a self-professed conservative. Today, in 2007, such demographic change evokes scarcely a murmur of protest from any quarter of the political spectrum. America’s elites have converted almost entirely to the anti-racist worldview. And this ‘consensus’ is defended not by debate but by an absence of debate—by a systematic silencing of any dissent, such as Dr. Watson’s.

It’s unhealthy for any belief, however noble, to exist in an echo chamber of constant approval. This is, after all, what totalitarianism is about—the rise to hegemony of one opinion to the detriment of all others. Such is the state of antiracism today. It is no longer the voice of reason that speaks to the screams of bigotry and intolerance. By a strange role reversal, it has become the very thing it used to oppose.

Antiracism, I fear, is painting itself into a corner from which it cannot extricate itself and which it will have to defend with increasingly totalitarian methods. Is this re-enactment of history really necessary? Can we not learn from the past? Must we follow the same trajectory that other hegemonic beliefs have followed with the same tragic consequences?

References

Mathews, B. (1925). The Clash of Colour. A Study in the Problem of Race. London: Edinburgh House Press.

Watson, J.D. (2007). Avoid Boring People: Lessons from a Life in Science. New York: Knopf.

Monday, October 15, 2007

In Europe, especially in the north and east, skin is unusually white, almost at the physiological limit of depigmentation, eyes are not only brown but also blue, gray, hazel or green, and hair is not only black but also brown, flaxen, golden or red. Are these color traits directly or indirectly due to selection for light skin at northern latitudes? But why, then, are they absent in populations that are indigenous to similar latitudes in northern Asia and North America?

As one reader of this blog has argued, skies are more overcast in Europe than at similar latitudes in northern Asia and North America. Thus, ancestral Europeans would have experienced less selection for dark skin to protect against skin cancer and sunburn and more selection for light skin to increase synthesis of vitamin D. Since genes for hair and eye color have some effect on skin color, relaxation of selection for dark skin should have allowed defective alleles to proliferate at all pigmentation loci, including those for hair color and eye color.

However, it is doubtful that relaxed selection for dark skin could have diversified hair and eye color by allowing defective alleles to proliferate. Two papers have shown that such a scenario would have needed close to a million years to produce the hair-color and eye-color variability that Europeans now display, with the redhead alleles alone being c. 80,000 years old (Harding et al., 2000; Templeton, 2002). Yet modern humans have been in Europe for only 35,000 years or so.
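The timescale argument can be illustrated with a rough back-of-envelope sketch. Under recurrent mutation alone, the pooled frequency of new loss-of-function alleles grows as q(t) = 1 − exp(−ut). The mutation rate and generation time below are assumed round numbers for illustration, not figures taken from the papers cited:

```python
from math import log

u = 1e-5          # assumed loss-of-function mutation rate per locus per generation
gen_years = 25    # assumed generation time in years

def years_to_reach(q, u=u, gen_years=gen_years):
    """Years for the pooled frequency of new mutant alleles, q(t) = 1 - exp(-u*t),
    to reach q under recurrent mutation alone (selection relaxed, drift ignored)."""
    return -log(1 - q) / u * gen_years

print(f"{years_to_reach(0.30):,.0f} years to reach a pooled frequency of 30%")
```

With these assumptions, reaching even a 30% pooled frequency takes on the order of 900,000 years, which is the same order of magnitude as the near-million-year estimate cited above.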

Instead of relaxed selection for dark skin, perhaps there was increased selection for light skin, notably to boost synthesis of vitamin D. This hypothesis solves the time problem but does not explain the increase in the number of MC1R and OCA2 alleles. Natural selection would have simply favored one allele at the expense of all others, i.e., whichever one optimally reduced skin pigmentation.

There are other problems with either hypothesis, or with any that attribute these color traits to weaker solar UV:

1) If we examine the many homozygous and heterozygous combinations of MC1R or OCA2 alleles, most have little visible effect on skin pigmentation, except for the ones that produce red hair or blue eyes (Duffy et al., 2004; Sturm & Frudakis, 2004).

2) If we consider the estimated time of origin of these color traits, at least two of them seem to have appeared long after modern humans had entered Europe's northern latitudes about 35,000 years ago. The whitening of European skin, through allelic changes at AIM1, is dated to about 11,000 years ago (Soejima et al., 2005). No less recent are allelic changes at other skin color loci and at the eye color gene OCA2 (Voight et al., 2006). Did natural selection wait over 20,000 years before acting?

Are there other forces of natural selection that might explain the 'European exception'? Loomis (1970) and Murray (1934) have argued that Europeans are lighter-skinned than indigenous populations at similar latitudes in northern Asia and North America because the latter obtain sufficient vitamin D in their diet from marine fish. This argument may hold true for the Inuit but not for the majority of indigenous populations that live within the zone of minimal UV radiation, essentially above 47° N (Jablonski & Chaplin, 2000). Most, in fact, live far from sea coastlines.

Monday, October 8, 2007

Several years ago, my main research interest was the difference in skin pigmentation between women and men. In a nutshell, women are paler in complexion and men browner and ruddier because the latter have more melanin and hemoglobin in their skin. This sex difference dominated skin color variability in earlier social environments; therefore, skin color may have become a visual cue for gender-specific responses (e.g., sexual attraction, gender identification, conflict readiness, social distancing, etc.).

In a rating study, I showed female subjects several pairs of male facial photos, and in each pair one of the faces had been made slightly darker than the other. The darker face was more likely to be preferred by the women in the estrogen-dominant phase of their menstrual cycle (i.e., the first two-thirds) than by those in the progesterone-dominant phase (i.e., the last third). This cyclic change in preference was absent in women on oral contraceptives and in women who were assessing pairs of female faces (Frost, 1994).

At no point in the cycle was the darker male face more popular than the lighter one. It was simply less often disliked during the estrogen-dominant phase. As I saw it, higher estrogen levels seemed to be disabling a negative response to darker individuals. This negative response might be a social-distancing mechanism that keeps conflict readiness at a higher level during social interaction with males.

My study left some questions unanswered. What component of male skin color was triggering this response? Was it ruddiness (hemoglobin) or brownness (melanin)? And exactly what feelings were being triggered?

Some recent findings suggest that the trigger may be male ruddiness and the feelings something akin to intimidation. In the 2004 Olympic Games, opponents in boxing, taekwondo, Greco-Roman wrestling, and freestyle wrestling were randomly assigned red or blue athletic uniforms. In all four sports, competitors wearing red were significantly likelier to win. This phenomenon was investigated by Ioan et al. (2007), who asked participants to name the color of words on a computer screen and measured the response time. The men took significantly longer than the women to respond when the words were red. Reducing luminosity increased response time for both men and women, but the gender gap remained. The authors concluded:

Our data suggests that “seeing red” distracts men through a psychological rather than a perceptual mechanism. Such a mechanism would associate red with aggression or dominance and may have a long evolutionary history, as indicated by behavioural evidence from nonhuman primates and other species.

With respect to our species, they state:

In humans, the adult male is ruddier in complexion than the adult female and male hormones greatly increase blood circulation in the skin’s outer layers. Testosterone influences erythropoiesis during male puberty and a decline of testosterone with aging increases the risk of anemia. Furthermore, men with hypogonadism or those taking anti-androgenic drugs frequently have anemia. These data are consistent with a testosterone-dependent ruddiness of the male complexion, as seen in many other species where red coloration acts as a signal of male dominance.

It would be interesting to repeat the above study with female subjects at different phases of the menstrual cycle. I suspect that response time would be longer among subjects in the progesterone-dominant phase than among those in the estrogen-dominant phase. In other words, the gender gap may be due to estrogen disabling a conflict-readiness mechanism that uses ruddiness as a visual cue for male identity.
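The Olympic finding, that red-clad competitors won more often than chance would predict, can be checked with a simple one-sided exact binomial test against a 50% win rate. The counts below are hypothetical, chosen only to show the method, and are not the published figures:

```python
from math import comb

def binom_sf(k, n, p=0.5):
    """P(X >= k) for X ~ Binomial(n, p): one-sided exact test against chance."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# Hypothetical counts for illustration only (not the published Olympic data):
# suppose competitors in red won 65 of 100 evenly matched bouts.
p_value = binom_sf(65, 100)
print(f"one-sided p-value: {p_value:.4f}")
```

Substituting the actual bout totals for each sport would give the real significance levels; with the hypothetical counts above, the null hypothesis of no color effect would be rejected at conventional thresholds.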


Welcome to my blog! For the most part, this page will be an extension of my website, with comments relating to my research. But it will also branch out into more general discussions of human evolution.