
The imperative of the human species to ‘Know Thyself’ has developed into a rapidly expanding field in palaeoanthropology. The exploration of our species, Homo sapiens, is a particularly active field which utilises multi-disciplinary approaches to untangle the evolutionary threads of our beginning. The following essay introduces concepts and approaches used in this field, whilst raising current research issues.

——~…~——

“For a species that is both narcissistic and inquisitive, Homo sapiens has so far done a remarkably poor job of defining itself as a morphological entity”, Tattersall and Schwartz (2008: 49).

So begins Tattersall and Schwartz’s 2008 article on the problems of clarifying the morphological distinctiveness of anatomically modern humans (AMH, the species Homo sapiens). The observation applies not just to morphological characteristics but also to the fossil record and the origins of AMH themselves (Pearson 2008: 38). This paper, then, will discuss the principles behind the definition and evolution of AMH, with reference to behaviour and morphological traits. In turn, the dominant models of the origin and subsequent dispersal of AMH will be discussed, with reference to where Homo sapiens ‘fit’ in the palaeoanthropological record. A wealth of new genetic research data and fossil finds has considerably opened up the treasure chest of hominin information, and is having a considerable impact on our understanding of the place of H. sapiens in the evolutionary record (Bowden et al. 2012, Curnoe et al. 2012, Krause et al. 2010, Prat et al. 2011, Wood 2005: 42). Precisely because the reporting of evolutionary science has changed in the past few decades (McEwan 2012), and because technological approaches have uncovered so much genetic data for reconstructing fossil-record relationships (Jurmain et al. 2011: 270), the definition of AMH is no longer straightforward. The paper will conclude with a discussion of how the biocultural evolution of H. sapiens now affects both our environment and localised populations in certain contexts (Le Fanu 2009, Hawks et al. 2007, Jurmain et al. 2011).

It is important to note that H. sapiens is the latest species of the genus Homo; the first species of the genus is tentatively dated in Africa to nearly 2.5 million YA (years ago), and the first dispersal of hominins (largely H. erectus) from Africa occurred around 1.8 million YA (Jurmain et al. 2011: 240); AMH dispersal occurred much later. It was once thought that AMH were defined by modern anatomy and behaviour appearing together at the junction of the Upper Palaeolithic, around 40,000 YA (Nowell 2010: 438). However, recent palaeoanthropological finds and research have revealed a distinct ‘decoupling’ between early AMH anatomy and later symbolic/modern behaviour, with anatomically modern traits pinpointed in east and south African fossils to around 200,000 YA (Rightmire 2008: 8, Wood 2005). There are nevertheless problems inherent in dating the hominin fossil record, as Millard (2008: 870) concludes that ‘the dating evidence for many key fossils is poor’. Typically a number of morphological features are assigned to mark out Homo sapiens from other species of the Homo genus (Table 1). As Tattersall and Schwartz (2008: 51) note, however impressive the suite of features, ‘not all of them are expressed with equal emphasis in all living humans’. When this is combined with the nature of the AMH fossil record, in which individuals are often taken as examples of their own long-lost skeletal population, and with the problems inherent in the preservation of skeletal elements (geological pressure, scavenging etc.), we should rightly be wary of definitively assigning a species name before comparison with associated contextual remains, stratigraphic layers and other similar-period sites (Millard 2008, Pettitt 2005).

Using a cladistic framework, Pearson (2008: 38) highlighted specific difficulties in using statistical analyses of metrical and discrete traits conceptualised as derived features of AMH crania, in comparison with Neandertal and H. erectus crania. Further problems arise when trying to establish whether the earliest H. sapiens fossils from Africa (Omo Kibish and the Herto crania) or the Near East (Skhul and Qafzeh) fall within the 95% range of modern features, with results not even reaching a 75% fit (Pearson 2008: 39). In part this is because fossils such as the Herto crania are used as the mean of their particular population, which ultimately ‘conflates individual, within-population variation and between-population variation’ (Jurmain et al. 2011, Pearson 2008: 39). A further problem in quantifying morphological differences across such long chronologies is that some modern populations (Australian Aboriginals, for example) and certain prehistoric peoples fall outside the 95% confidence interval of the given morphological concept of AMH. Clearly there needs to be control over the temporal and geographic population of the AMH under consideration in such studies, both when carrying out the statistical analysis against other fossil hominins and when taking the defining measurements.
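The threshold logic behind such trait-based comparisons can be sketched in a few lines of Python. This is purely illustrative and is not Pearson’s actual method: the trait names, the checklist and the fossil scores below are all hypothetical, and real analyses use many more metrical variables.

```python
# Hypothetical checklist of derived "modern" cranial features.
MODERN_TRAITS = [
    "globular_braincase",
    "vertical_forehead",
    "reduced_brow_ridge",
    "canine_fossa",
    "chin",
]

def percent_modern(observed):
    """Percentage of checklist traits scored as derived/'modern'."""
    present = sum(1 for trait in MODERN_TRAITS if observed.get(trait))
    return 100.0 * present / len(MODERN_TRAITS)

# Invented scoring of an early fossil: some derived traits present,
# others archaic or unscorable due to preservation.
early_fossil = {
    "globular_braincase": True,
    "vertical_forehead": True,
    "reduced_brow_ridge": False,
    "canine_fossa": False,
    "chin": True,
}

print(percent_modern(early_fossil))  # 60.0: below a 75% or 95% threshold
```

Even this toy version shows the core problem the paragraph raises: whether a fossil ‘counts’ as AMH depends entirely on which traits are on the checklist and where the cut-off is drawn.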

Pettitt (2005: 132-137) argues that H. sapiens should be classed into three arbitrary chronological groups of morphological continuity: 1) the earliest H. sapiens, including material from Bodo (Ethiopia), Broken Hill (Zambia) and Elandsfontein (South Africa) amongst others; 2) transitional (or archaic) H. sapiens, including Herto, Omo Kibish 1 and 2 (Ethiopia), Florisbad (South Africa) and Jebel Irhoud (Morocco); 3) finally AMH, including examples from Makapansgat, Border Cave and Equus Cave (South Africa), Taramsa (Egypt), and Dar-es-Soltan (Morocco) (see Table 2 below for dates). This ordering of morphological continuity defines the earliest group through the evolution of H. sapiens traits with retention of H. ergaster traits, whilst the AMH group comprises clear AMH dating to less than 125,000 YA (Pettitt 2005: 132). As Pearson (2008: 44) suggests, ‘the process of becoming modern likely occurred as a series of steps, regardless of whether one considers these different steps to be different taxa in a bushy phylogeny or merely different grades in a single evolving lineage’. Pearson (2008: 44) goes on to say that the ‘evolution of modern man should be viewed as a process rather than an event involving rapid morphological change due to drift during population bottlenecks and selection for new advantageous traits or genes, or a combination of the two’; that is, not a singular smooth process. We should therefore be wary of relying purely on the often sparse fossil record. Regardless, it is widely recognised that H. sapiens is probably a daughter species of H. erectus (i.e. the result of a speciation event) which spread across Africa and into Western Eurasia at the beginning of, or just before, the Middle Pleistocene (Jurmain et al. 2011, Rightmire 2008: 8).

Recent research has also produced five points of majority agreement regarding AMH behaviour (Table 2; Nowell 2010: 447). Wood (2005: 109) makes the salient point that early Eurocentrism in the search for the origins of AMH behaviour clouded certain judgements, such as the focus on Western Europe to the detriment of African archaeological sites.

Points of Consensus on Modern Behaviour:

The relationship between modern anatomy and modern behaviour is more complex than once thought.

Modern behaviour has symbolic thought at its core.

The archaeological record of the African Middle Stone Age has rendered invalid the idea of a ‘human revolution’ occurring for the first time in the Upper Palaeolithic of Western Europe.

Later Neandertal sites have demonstrated modern behaviour in some form or to some degree, such as personal adornment or symbolic behaviour.

The triad of social, cultural and demographic factors is key to understanding variability and patterning in the archaeological record.

Research (Jurmain et al. 2011, Prat et al. 2012) has also highlighted symbolic behaviour at a number of early H. sapiens sites throughout Africa and the Near East. Balter (2011: 21) highlights Aterian sites in North Africa where various personal and possibly symbolic artefacts have been found, whilst Blombos Cave in South Africa (77,000 YA) and Katanda in the DR of Congo (80,000 YA) have yielded some of the earliest symbolic artefacts recovered, including incised ochre, worked bone and beads, almost a full 45,000 years before any such artefacts appear in the European record (Jurmain et al. 2011: 298-299). Mellars (2006: 9383) proposes a model in which climatic, environmental and cultural changes around 80,000 to 60,000 YA, alongside population pressures, were major causative agents of cognitive change in the dispersal of African H. sapiens. However, Nowell (2010: 441) argues that the gradual emergence of behaviours as a mosaic of features, and not as a single revolutionary package, should be considered within the archaeological record, noting that for the majority of researchers symbolic language and codified social relationships define modern behaviour. Mosaic features in fossil hominins have also been noted in the recent discoveries of Australopithecus sediba specimens, which show a mix of Australopithecus and Homo anatomical features (Wong 2012: 25).

The origins of AMH living outside of Africa have led to the formation of two major competing models in palaeoanthropology. The multi-regional continuity hypothesis proposes that local populations of hominins already living in Asia, Europe and Africa continued their ‘indigenous evolutionary development from pre-modern Middle Pleistocene forms to anatomically modern human’ (Jurmain et al. 2011: 281), whilst the complete replacement (or Out of Africa) hypothesis proposes that AMH arose in Africa around 200,000 YA and went on to completely replace the populations of Europe and Asia (Table 3; Jurmain et al. 2011: 279). Critical to the multi-regional hypothesis are the tenets that i) a level of gene flow between geographically separated populations prevented speciation, ii) all living humans derive largely from the species H. erectus, iii) natural selection in regional populations is responsible for the regional variants found in extant populations, and finally, iv) the emergence of H. sapiens was not restricted to one area per se but was a phenomenon that occurred throughout the geographic range where ‘humans lived’ (Johanson 2001: 1).

Critical to the complete replacement theory are the tenets that i) H. sapiens arose in one place, highly likely East/South Africa, ii) H. sapiens ultimately migrated out of Africa and replaced all other human populations without interbreeding, and iii) modern human variation is a relatively recent phenomenon (Johanson 2001: 1).

Although not every factor of the multiregional hypothesis can be falsified, the evidence strongly suggests that H. sapiens originated in East Africa (with Ethiopia so far providing the most securely dated sites), and dispersed to Europe and Asia from 65,000 YA onwards in various waves (Table 2; Jurmain et al. 2011: 282, Mellars 2006: 9381). The two most securely dated European sites for AMH are Peştera cu Oase in Romania at 42,000 YA and Buran Kaya III in the Crimea, Ukraine, at 31,900 YA (Hoffecker 2009: 16040, Prat et al. 2011). Unsurprisingly, Hoffecker (2009: 16040) notes that the mechanism of transition is a ‘controversial topic in palaeoanthropology’. Arguments have been made that AMH crossed into Eurasia via a Levantine corridor, with the earliest AMH dates from Skhul and Qafzeh in Israel at around 120,000 to 100,000 YA (Wood 2005: 98), whilst recent work on North African Aterian populations of around the same period points to them as possible ancestors of at least some of the H. sapiens who left Africa during this period (Balter 2011: 23). The palaeoanthropological evidence suggests that the Aterians possessed the symbolic behaviour, the anatomy and the favourable climatic conditions to be at least a contender for contributing to one of the waves of H. sapiens leaving Africa (Balter 2011: 22-23). A variety of sites across Europe dating to after 40,000 YA show evidence for AMH presence, including the triad of modern human behaviour, symbolic artefacts and modern skeletal morphology. However, we should not forget that Europe was already populated by H. neanderthalensis, which co-existed with H. sapiens for approximately 10,000 years (Hoffecker 2009: 16040, Wood 2005: 110). This subject will be tackled shortly.

The most secure dates found in Asia come from areas such as the Sahul region (the conjoined landmass of Australia, Papua New Guinea and Tasmania), where it is possible AMH occupied various areas (Wood 2005: 111-112). It must be remembered that while the ‘dwarf’ species H. floresiensis survived up until 18,000 YA on the island of Flores, overlapping temporally with H. sapiens, the archaeological evidence makes regional overlap seem unlikely (Wood 2005: 111). Curnoe et al. (2012: 1) note that the AMH fossil record for East Asia is, at present, poorly documented owing to a lack of detailed description, rigorous taxonomic classification and accurately dated fossils. There are, however, a few key sites: Liujiang in southern China has produced a skeleton which, although it lacks an exact stratigraphic position, has been given a broad estimated date range of 153,000 to 30,000 YA, whilst the cranium of the Niah Cave child in East Malaysia has been dated to 45,000-39,000 YA by a recent field and laboratory programme (Curnoe et al. 2012: 2). Tianyuan Cave, just south of the Zhoukoudian cave, has yielded fragmentary evidence of an AMH cranium and teeth dated to 40,000 YA, with a possible mix of archaic and modern features; the American and Chinese team who excavated it suggest it is evidence of interbreeding in China with resident archaic populations, but propose an African origin for the AMH themselves (Jurmain et al. 2011: 287).

The above examples highlight problems in defining AMH, both anatomically and behaviourally. With the dispersals from Africa, AMH interacted with other hominins, most prominent of which are the Neandertals in Eurasia and the elusive Denisovans in Siberia (Krause et al. 2010, Hublin 2009, Noonan 2010, Zilhão 2006). Genetic evidence is unravelling what it is to be an AMH (Hawks et al. 2007), and there is evidence to suggest that Neandertals contributed up to 4% of non-African modern human DNA via gene flow (Green et al. 2010: 711, Reich et al. 2010: 1057).

Roughly one third of Neandertal mtDNA genetic diversity, dating from 70,000 to 38,000 YA, is comparable to that of contemporary human populations (Briggs et al. 2009: 319), although Noonan (2010: 550) and Herrera et al. (2009: 253) raise a flag of caution: the majority of Neandertal remains were not collected with DNA investigation in mind, whilst modern DNA contamination, despite safeguards, remains prevalent. Briggs et al. (2009: 321) postulate that low mtDNA diversity throughout much of the Neandertal lineage may indicate a low effective population size, although it could also reflect direct or indirect AMH influence as they spread from Africa (interbreeding or out-competing, for example). Herrera et al. (2009: 253) note further difficulties, such as identifying haplotypes indicative of interbreeding. Nonetheless, Zilhão et al. (2010: 1027) point out that Mid-Palaeolithic Iberian Neandertal sites show distinct features associated with AMH, including symbolic behaviour, with ochre and shells displaying evidence of body paint, and organisational skills, which that study believes is the outcome of demographic pressure, technology and ‘social complexification’ within the Neandertal species itself (Roebroeks et al. 2012: 2).
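The diversity statistic underlying such comparisons, nucleotide diversity (π, the average number of pairwise differences per site), is simple to compute. The sketch below uses invented ten-base toy alignments, not real mtDNA data, purely to show why a low π is read as a signal of a small effective population.

```python
from itertools import combinations

def nucleotide_diversity(seqs):
    """Average pairwise differences per site (pi) across aligned sequences."""
    length = len(seqs[0])
    diffs = [sum(1 for x, y in zip(a, b) if x != y)
             for a, b in combinations(seqs, 2)]
    return sum(diffs) / (len(diffs) * length)

# Toy, invented alignments: a low-diversity set standing in for the
# Neandertal lineage versus a more variable modern-human-like set.
low_diversity  = ["ACGTACGTAC", "ACGTACGTAC", "ACGTACGTAT"]
high_diversity = ["ACGTACGTAC", "ACCTATGTAC", "TCGTACGAAC"]

print(nucleotide_diversity(low_diversity))   # ~0.067
print(nucleotide_diversity(high_diversity))  # ~0.267
```

All else being equal, fewer pairwise differences imply fewer ancestral lineages contributing variation, which is why Briggs et al. can read low mtDNA π as a possible sign of a small effective Neandertal population.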

There is the distinct possibility of admixture; this is reinforced by the apparent coexistence of Neandertals, AMH and Denisovans in the Altai region at roughly the same time periods, and by the fact that Denisovan populations contributed roughly 4-6% of the present-day DNA of Melanesian AMH populations; this suggests they interacted with Melanesian ancestors, though probably not in the Siberian region (Krause et al. 2010: 895, Reich et al. 2010: 1053). The lack of complete remains from this suspected new species at Denisova Cave, and its physically limited known range, restrict our knowledge, but tests are continuing. If this hominin, as hypothesised, had a wide geographical range (Reich et al. 2010: 1059), the question must be asked: why have we not noticed it before? Interestingly, Abi-Rached et al. (2011: 94) highlight the fact that as AMH Eurasian populations mixed with archaic hominins, adaptive introgression of vital immune system components (Human Leukocyte Antigen class I) provided a mechanism for rapid evolution. The introgressed genes now represent more than half of the HLA alleles in modern Eurasians, and were later introduced into African populations (Abi-Rached et al. 2011: 89). The definition of AMH must therefore include evidence of interbreeding to some degree. Future genomic studies of other archaic hominins should provide more information on the relationships between species; it seems clear, however, that gene flow was relatively common in the Upper Pleistocene (Reich et al. 2010: 1059).

Increased AMH demographic growth and geographic spread from 80,000 YA to the present have led to rapid genetic evolutionary selective pressures on features including ‘skin pigmentation, adaptation to cold and diet’ amongst others (Hawks et al. 2007: 20756). Some of the most dramatic changes have been associated with the uptake of agriculture during the Neolithic period, both in terms of coping with disease and in the changes wrought by increased population density (Barnes et al. 2011: 848). This is partly the result of cultural and ecological factors (i.e. a biocultural pathway), and Hawks et al. (2007: 20756-20757) remark that in their study ‘new adaptive alleles continued to reflect demographic growth, (that) the Neolithic and later periods would have experienced a rate of adaptive evolution >100 times higher than characterised most of human evolution’. Two examples help highlight the effects of biocultural change in modern populations. First, the coevolution of humans and cattle since the Neolithic has resulted in distinct populations of modern humans, such as Europeans, becoming lactose persistent, whilst other populations, such as African and Asian adults, are largely lactose intolerant (Jurmain et al. 2011: 313). Through the active selection of breeding cattle, these populations ‘inadvertently selected for the gene that produces lactose persistence in themselves’ (Jurmain et al. 2011: 313); the geographical distribution of lactose persistence is thus often related to a history of cultural dependency on fresh milk products. Second, modern population pressures include the admixture of populations long subject to urbanisation, agriculture and gene selection under disease load (such as tuberculosis) with indigenous populations, such as Torres Strait Islanders and Papua New Guinean populations, who are not predisposed to deal with TB because they lack that long history of cattle coevolution (Barnes et al. 2011).
The important point is to recognise that there is great variation in modern AMH at the genetic and environmental level, and this is highly likely to have been the case throughout the long evolution of AMH (Jurmain et al. 2011).
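The pace at which an advantageous allele such as lactose persistence can spread is easy to illustrate with the standard diploid selection recursion. The parameters below (initial frequency, selection coefficient, generation count) are invented for illustration and are not estimates from the studies cited above.

```python
def next_generation(p, s=0.05, h=0.5):
    """Standard diploid selection: genotype fitnesses are
    AA = 1+s, Aa = 1+h*s, aa = 1, where A is the favoured allele.
    Returns the allele frequency after one generation."""
    q = 1.0 - p
    w_bar = p*p*(1+s) + 2*p*q*(1+h*s) + q*q
    return (p*p*(1+s) + p*q*(1+h*s)) / w_bar

# Hypothetical numbers: a rare allele (1%) with a 5% homozygote
# advantage, iterated over 300 generations (~7,500 years at 25
# years per generation, i.e. well within the Neolithic timescale).
p = 0.01
for _ in range(300):
    p = next_generation(p)
print(round(p, 2))  # the allele has risen to a large majority
```

Even a modest selection coefficient drives a rare allele to high frequency within a few hundred generations, which is why post-Neolithic timescales are long enough for the biocultural changes described above.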

In conclusion, the definition of AMH comes to this: either a strict definition of AMH present from around 40,000-35,000 YA onwards, with the full triad of anatomically modern skeletal elements, modern behavioural and cognitive functions, and genetics similar to today’s worldwide population (Tattersall & Schwartz 2008), or the view that H. sapiens evolved with a mosaic of features that themselves appeared at different times during the evolution of AMH (Jurmain et al. 2011, Pettitt 2005). It is this author’s belief that the origin of the H. sapiens species lies at the Omo Kibish site in East Africa, the earliest evidence so far, and that the definition of AMH must be taken in accordance with the fossil record (Jurmain et al. 2011). Throughout this paper a long chronology of H. sapiens has been presented and discussed in the context of human evolution, and consideration has been given to the relatively recent genetic changes in modern human populations (Hawks et al. 2007). This view underlines the complexity of defining AMH, especially as new hominins are found (Krause et al. 2010, Reich et al. 2010, Wong 2012), since consideration of context is paramount. There is inherent variation in the record, from the distinct morphological variation between the Omo 1 and Omo 2 fossils up to the palaeogenetic, modern genetic and morphological variation in populations inside and outside Africa (Briggs et al. 2009, Hawks et al. 2007, Harvati et al. 2012). By comparison, the origin of the Homo genus is still in dispute (Wong 2012: 24) and the chimpanzee fossil record is distinctly lacking (Wood 2005: 69-70). Only recently has SNP genotyping revealed the extent to which Pan troglodytes ellioti is a genetically distinct subspecies (Bowden et al. 2012: 1). The upshot is that we should seek to place the well-discussed H. sapiens within a larger framework of where hominins (both extant and extinct) diverged, interacted and evolved (see discussion in Patterson et al. 2006: 1106, Wakeley 2008). The definition of AMH is therefore but one fragment of our long evolutionary history.

This post, the second of two, deals with biomolecular approaches and research studies for detecting the presence of infectious diseases in human bone from archaeological material. The recent coming of age of biomolecular techniques, as applied to archaeological material, has provided a rich and complex source of information, helping to uncover how infectious diseases spread in the historic and prehistoric past. This second post describes recent research focused on malaria and associated anaemic conditions, including sickle-cell anaemia and thalassaemia. The first post can be found here.

——————————————————————————————————————–

It has long been realised that malaria can only be recognised in skeletal remains via indirect evidence: the presentation of pathological lesions (porotic hyperostosis, cribra orbitalia and marrow hypertrophy) which are taken as evidence of anaemia, the main contributor to mortality in malaria victims (Roberts & Manchester 2010). However, there is no pathognomonic bone lesion for either Plasmodium vivax or P. falciparum, the main malaria-causing species of the genus Plasmodium in humans (Gowland & Western 2012: 303, Roberts & Manchester 2010: 233), and the above skeletal lesions have varying aetiologies, including anaemia, osteitis, parasitic infection, and other interrelated deficiency diseases which are still not clearly understood (Gowland & Western 2012: 302). To diagnose malaria securely in skeletal material, DNA identification of the Plasmodium genus must take place, and even then current Polymerase Chain Reaction (PCR) tests ‘do not appear to be able to amplify routinely the DNA of malaria pathogens from ancient bones’ (Gowland & Western 2012: 302).

Recent immunological techniques to identify antigens have also been used to isolate and identify P. falciparum, although false positives can occur as a result of contamination or diagenetic factors (Gowland & Western 2012: 302). Gowland & Western (2012) have recently proposed a spatial epidemiological model for malarial spread in Anglo-Saxon England, which highlights the resurgent interest in malaria as a modern problem as well as one affecting past populations. This holistic approach combined GIS data with porotic hyperostosis diagnosed in skeletal remains, habitat information for the mosquito vector (Anopheles atroparvus) and historical data to produce a locality dataset for malaria-infected individuals (Gowland & Western 2010: 304-305). The modelling of palaeopathological, climatic and historical data provides new information on disease range, mechanisms of transmission and infection localities. However, there are also complicating factors in distinguishing malaria from other diseases, as noted below (Roberts & Manchester 2010: 234).
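The basic overlay step in this kind of spatial model can be sketched simply: score each cemetery site by its proximity to suitable vector habitat. This is a heavily simplified illustration, not Gowland & Western’s actual GIS workflow; the coordinates, site names and risk radius below are all invented.

```python
import math

def distance(a, b):
    """Straight-line distance between two (x, y) points."""
    return math.hypot(a[0] - b[0], a[1] - b[1])

# Invented coordinates (in km) for marshland mosquito habitat and for
# cemetery sites with and without anaemia-related lesions.
marsh_habitats = [(2.0, 1.0), (8.0, 7.5)]
cemeteries = {
    "Site A (porotic hyperostosis present)": (2.5, 1.4),
    "Site B (no lesions observed)": (5.0, 9.0),
}
RISK_RADIUS = 1.5  # hypothetical stand-in for mosquito flight range

for name, loc in cemeteries.items():
    at_risk = any(distance(loc, m) <= RISK_RADIUS for m in marsh_habitats)
    print(name, "->", "inside risk zone" if at_risk else "outside risk zone")
```

A real model replaces the circle with modelled habitat suitability and adds climatic and historical layers, but the logic of intersecting lesion data with vector range is the same.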

Particularly important are two inherited haemolytic anaemias, thalassaemia and sickle-cell anaemia, which are characterised by abnormal haemoglobin and the increased destruction of red blood cells (Jurmain et al. 2011: 312, Roberts & Manchester 2010: 232). Thalassaemia is a genetically determined disorder caused by a ‘problem of haemoglobin synthesis’ (Roberts & Manchester 2010: 233). The failure or depression of synthesis of one of the globin chains leads to pale cells with low haemoglobin content, which are rapidly destroyed once formed. There are three grades of the disease (minor, intermediate and major), the last of which involves severe anaemia and possible bone changes; the range of the disease is typically centred on the Mediterranean, Middle East and Far East (Roberts & Manchester 2010: 233). Its importance is that it is seen as an adaptive response to malarial infection: the high red blood cell turnover stalls and negates the effect of malarial infection. Archaeological evidence comes from Greek, Turkish and Cypriot populations deriving from marshy contexts, ideal breeding grounds for mosquitoes, the prime vector of malaria (Roberts & Manchester 2010: 233).

Sickle-cell anaemia results from the deformation and destruction of red blood cells, which leads to the enlargement of bony centres (centred on the skull, pelvis and vertebrae) and the over-activity of marrow production as the body produces more red blood cells (Waldron 2009). This heritable disease is mainly found at high rates in Central and East African populations, but also affects Indian, Middle Eastern and Southern European populations (Roberts & Manchester 2010: 234). Jurmain et al. (2011: 312) remark that the sickle-cell allele has not always been effective in negating malaria in human populations, and primarily came to prominence with the advent of agriculture, and in particular during the last 2,000 years in Africa. The origin of the responsible mutation, the HbS haemoglobin allele, has been dated to 2,100 to 1,250 years ago in African populations (Jurmain et al. 2011: 312). Although malaria has only relatively recently affected human populations, it has become a powerful selective force that still affects large portions of the world’s population today.
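The persistence of the sickle-cell allele despite its cost is a textbook case of heterozygote advantage, which can be sketched with the same diploid recursion used in population genetics. The fitness values below are invented round numbers chosen to make the balance visible, not empirical estimates.

```python
def next_generation(p, s=0.8, t=0.2):
    """Heterozygote advantage: fitnesses HbS/HbS = 1-s (sickle-cell
    disease), HbS/HbA = 1 (malaria-protected heterozygote),
    HbA/HbA = 1-t (malaria mortality). p is the HbS allele frequency;
    s and t here are illustrative, not measured values."""
    q = 1.0 - p
    w_bar = p*p*(1-s) + 2*p*q + q*q*(1-t)
    return (p*p*(1-s) + p*q) / w_bar

p = 0.01  # HbS starts rare
for _ in range(500):
    p = next_generation(p)
print(round(p, 2))  # settles at the balanced equilibrium t/(s+t) = 0.2
```

Selection neither eliminates nor fixes the allele: it settles at the equilibrium t/(s+t), which is why HbS remains common precisely where malaria exerts its selective force.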

In conclusion, biomolecular approaches to archaeological and osteological remains are vital in understanding past populations and the natural world (Jurmain et al. 2011). Grasping the interactions between wild and domesticated animals, humans, insects and the environment is a prerequisite for understanding the mode of transmission and virulence of infectious diseases (Barnes et al. 2011, Gowland & Western 2012, Jurmain et al. 2011). Yet we must also take into consideration the difficulties of studying infectious disease: examples of the osteological paradox are ever present; the aetiology of bone changes must be understood; the context of genetic differences between populations must be noted; and PCR amplification, aDNA detection and genome exploration methods must be continually improved for clearer results (Li et al. 2011, Schurch et al. 2011, Spigelman et al. 2012, Tran et al. 2011). This approach must be multidisciplinary if past and present populations are to be understood (Jurmain et al. 2011, Roberts & Manchester 2010, Waldron 2009).

The modern world has changed: the boundaries that once protected various human populations have been transformed by cheap air travel and vast population movement, unprecedented in history and prehistory not only in population density and scale but also at the level of human genetic variation (Hawks et al. 2007, Jurmain et al. 2011: 311). The eradication of smallpox, the Bill and Melinda Gates Foundation’s fight against malaria, and the ongoing WHO (World Health Organisation) campaign against polio (Branswell 2012: 50) are strong examples of what can be achieved worldwide. By building a profile of the effects of infectious disease on past populations, we are better prepared for the fight tomorrow.

The following two posts deal with biomolecular approaches and research studies for detecting the presence of infectious diseases in human bone from archaeological material. The recent coming of age of biomolecular techniques, as applied to archaeological material, has provided a rich and complex source of information, helping to uncover how infectious diseases spread in the historic and prehistoric past. Whilst it has helped clear up some mysteries, it has unleashed others. This first post describes recent research focused on the treponemal diseases (including yaws, syphilis and pinta) and smallpox. The second post can be found here.

——————————————————————————————————————————–

Treponemal Diseases

Roberts & Manchester (2010: 216) note that infectious diseases are ‘not solely microbiological entities but are a composite reflection of individual immunity, social, environmental, and biological interaction’. The study of treponemal disease in particular is fraught with controversy and stigma, in both modern and historical contexts (Lucas de Melo et al. 2010: 1, Roberts 2000), and in the nature of its spread and transmission. However, the combination of molecular pathology, phylogenetics and palaeopathological studies is helping to produce a clearer picture of the genetic origin of the disease and of the impact it has had, and continues to have, on the world at large (Hunnius et al. 2007: 2092). The bacterial diseases of the genus Treponema are typically split into different forms: pinta (T. carateum), yaws (T. pallidum subspecies pertenue), endemic syphilis (T. pallidum subspecies endemicum) and venereal/congenital syphilis (T. pallidum subspecies pallidum) (Table 1; Lucas de Melo et al. 2010: 2). The four forms were, until recently, indistinguishable by physical and laboratory characteristics (Roberts & Manchester 2010: 207), whilst the pinta strain does not affect bone (Waldron 2009: 103). DNA analysis of the bacterium of venereal syphilis has shown a difference between it and the non-venereal types, although there is no corresponding change in the clinical presentation of the disease (Roberts & Manchester 2010: 207).

Yaws was likely the first of these diseases to emerge, probably from an ape relative in Central Africa; the endemic form of syphilis derived from an ancestral form in the Middle East and the Balkans at a later date; and T. pallidum was the last to emerge, probably from a New World progenitor, although the issue remains highly contentious (Roberts & Manchester 2010: 212, Waldron 2009: 105). Gaining virulence at a dramatic rate in 15th- and 16th-century Europe, venereal syphilis affected a large section of the population owing to its mode of transmission. It should be noted, however, that bone changes in syphilis are rare in the early stages but common in the tertiary stage of the disease (Roberts & Manchester 2010). It has also been suggested that there could be back-and-forth transmission from one treponemal disease to another within intra-population groups moving from one environment to another; ultimately, it is possible that each social group, or population, has its own treponemal disease suited to its ‘geographic and climatic home and its stage of cultural development’ (Roberts & Manchester 2010: 213).

However, this infectious disease, in its venereal form, is particularly hard to locate and identify in archaeological populations, and the limitations of biomolecular palaeopathology have become clear (Bouwman & Brown 2005: 711, Hunnius et al. 2007, Lucas de Melo et al. 2010: 10). Bouwman & Brown’s (2005) experiment, and Hunnius et al.’s (2007) subsequent paper, have highlighted the difficulties of amplifying T. pallidum subspecies pallidum DNA even from highly suspected bone samples. Bouwman & Brown (2005: 711) tested nine treponemal samples using polymerase chain reaction (PCR) assays optimised to detect ancient treponemal DNA. The result was poor amplification of treponemal ancient DNA (aDNA) from human bone, even across samples of varying geographic, social and climatic origins. Three outcomes were postulated: the bones were not suitable for aDNA retrieval; treponemal aDNA was present but the PCR was not sensitive enough to pick it up; or there was no treponemal DNA in the bones at all (Bouwman & Brown 2005: 711-712). Subsequent investigations and phylogenetic approaches have indicated that the organism invades different parts of the body at impressive rates, but that in the later stages of the disease its DNA is no longer present in the bone itself, precisely the stage at which an osteologist can identify the disease macroscopically (Hunnius et al. 2007: 2098). Phylogenetic evidence supports variation in the virulence of syphilis and a more distant origin, possibly around 16,500 to 5,000 years ago, though exactly where remains unresolved (Lucas de Melo et al. 2010: 2). Interestingly, in the early 20th century Plasmodium vivax (a causative agent of malaria) was used by the physician Julius Wagner-Jauregg as a treatment for patients with neurosyphilis; it was injected as a form of pyrotherapy, inducing high fevers to kill the causative bacteria in the late stage of the syphilitic disease (Wagner-Jauregg 1931).

Smallpox

The smallpox virus causes a particularly devastating and disfiguring disease, but is thankfully no longer an active infection in the modern world (Roberts & Manchester 2010: 180). Although now kept only in laboratory samples, there is ongoing concern over whether it could pose a danger to modern archaeologists dealing with infected material (Waldron 2009: 110). The disease, once contracted, leads either to recovery with lifelong immunity or to death. The severe form, variola major, is documented in the Old World with a 30% death rate among those infected, whilst the less virulent form, variola minor (alastrim), is found in Central America and has a mortality rate of 1% (Hogan & Harchelroad 2005, Li et al. 2007: 15788). Smallpox, a strictly human variola virus pathogen, appears in literature and documentary records throughout the last 2,000 years (Larsen 1997), yet no osteological signature is present or identifiable in infected individuals (Waldron 2009: 110). To investigate the origins of the disease, therefore, Li et al. (2007) correlated variola phylogenetics with historical smallpox records to map the evolution, origin and movement of smallpox between human populations.

Li et al. (2007: 15787) state that no credible descriptions of the variola virus have been found for the American continent or sub-Saharan Africa before the advent of westward European exploration in the 15th century AD, suggesting that European exploration and expansion brought the virulent waves of smallpox that helped to decimate the existing Native American populations, who previously had no contact with, or natural immunity to, such a highly virulent disease. It is worth noting that the disease was used as a biological weapon surprisingly early. During the 18th-century colonial wars in America between the French, the British and the Native Americans, British forces deliberately distributed infected items of clothing to the Native population to aid the spread of the disease among the Native Americans, who at that time were largely allied to the French. This weakened the Native American population dramatically during the various colonial wars and the subsequent colonial expansion westward; it is estimated that nearly half of the Native American population died from smallpox alone, a consequence of its naturally rapid spread through human populations (Hogan & Harchelroad 2005).

Li et al. (2007: 15787) note, however, that there are ambiguous gaps in the evolution of the smallpox disease itself. Li et al. (2007) therefore initiated a systematic analysis of the concatenated single nucleotide polymorphisms (SNPs) from the genome sequences of 47 variola major isolates of broad geographic distribution to investigate its origins. Variola major has a slowly evolving DNA genome, which makes a robust phylogeny of the disease possible (Hogan & Harchelroad 2005).
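The core of such an SNP-based analysis can be illustrated with a toy sketch: each isolate is reduced to its concatenated SNP alleles, and the pairwise differences between these strings form the distance matrix from which a distance-based phylogeny can then be built. The isolate names and SNP strings below are invented purely for illustration; the actual study worked from 47 full variola major genome sequences.

```python
# Illustrative sketch only: the isolates and SNP alleles are hypothetical,
# standing in for the concatenated SNPs of sequenced variola genomes.
from itertools import combinations

# Each isolate reduced to its concatenated SNP alleles (one character per site).
isolates = {
    "Asia_A":    "AGGTC",
    "Asia_B":    "AGGTT",
    "W_Africa":  "ACGAC",
    "S_America": "ACGAT",
}

def hamming(a: str, b: str) -> int:
    """Number of SNP sites at which two isolates differ."""
    return sum(x != y for x, y in zip(a, b))

# Pairwise distance matrix: the raw input for distance-based tree building.
distances = {
    (i, j): hamming(isolates[i], isolates[j])
    for i, j in combinations(isolates, 2)
}

for pair, d in sorted(distances.items(), key=lambda kv: kv[1]):
    print(pair, d)
```

In this toy data the within-clade distances (Asia_A/Asia_B, W_Africa/S_America) are smaller than the between-clade distances, which is the pattern that lets a clustering method recover two primary clades from SNPs alone.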

Firstly, the results showed that variola likely diverged from an ancestral African rodent-borne variola-like virus around either 16,000 or 68,000 years ago, depending on which historical records (East Asian or African) are used to calibrate the molecular clock (Li et al. 2007: 15791). The taterapox virus, associated with terrestrial rodents in West Africa, is a close relative of the variola virus, and it is entirely possible that variola derived from an enzootic pathogen of African rodents and subsequently spread outwards from Africa (Li et al. 2007: 15792). Secondly, the evidence points towards two primary clades of the variola virus, both from the same source as above, but each representing a different severity and virulence of the disease.
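Why the choice of calibration record matters so much can be shown with a back-of-the-envelope sketch: under a constant molecular clock, a substitution rate is inferred from a dated calibration point, and the divergence time then follows from the observed genetic distance divided by that rate. All of the numbers below are invented for illustration and are not the study's values; they are chosen only so the two calibrations yield dates of roughly the same order as the 16,000- versus 68,000-year estimates.

```python
# Hypothetical molecular-clock arithmetic; all figures are invented.

def substitution_rate(calib_divergence: float, calib_age_years: float) -> float:
    """Substitutions per site per year, inferred from a dated calibration point."""
    return calib_divergence / calib_age_years

def divergence_time(observed_divergence: float, rate: float) -> float:
    """Years since two lineages split, assuming a constant clock."""
    return observed_divergence / rate

# Hypothetical observed distance between variola and its rodent-virus relative.
observed = 1.7e-4

# Two hypothetical calibrations, standing in for the East Asian vs African records:
rate_recent = substitution_rate(1.0e-5, 1_000)  # faster implied rate -> younger date
rate_deep   = substitution_rate(5.0e-6, 2_000)  # slower implied rate -> older date

print(f"{divergence_time(observed, rate_recent):.0f} years")  # ~17,000
print(f"{divergence_time(observed, rate_deep):.0f} years")    # ~68,000
```

The same observed genetic distance thus yields divergence dates differing by a factor of four, simply because the two calibration records imply different substitution rates; this is exactly the ambiguity behind the 16,000 versus 68,000 year estimates.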

The first primary clade is represented by the Asian variola major strains, the more clinically severe form of smallpox; study of its molecular ‘clock’ suggests it spread from Asia either 400 or 1,600 years ago (Li et al. 2007: 15788). This first primary clade also includes a subclade comprising the African minor variants of the main Asian variola major disease. The second primary clade comprises two subclades: the South American alastrim minor and the West African isolates (Li et al. 2007: 15788). This clade had a remarkably lower fatality rate than the first. The importance of phylogenetic analysis is that it highlights areas of disease prevalence and virulence that can be missed, or that are indeed entirely absent, from the osteological and archaeological record (Brown & Brown 2011).