Abstract

Drug-resistant pathogens emerge faster than new drugs come out of drug discovery pipelines. Current and future drug options should therefore be better protected, which requires a clear understanding of the factors that contribute to the natural history of drug resistance. Although many of these factors are relatively well understood for most bacteria, the picture is more complex for vectorborne parasites. In this review, we discuss, using three key models (Plasmodium, Leishmania and Schistosoma), how drug resistance can emerge, spread and persist. We demonstrate a multiplicity of scenarios, resulting not only from the biological diversity of the different organisms, but also from the different modes of action of the drugs used, the specific within- and between-host ecology of the parasites, and environmental factors that may have direct or indirect effects. We conclude that integrated control of drug-resistant vectorborne parasites does not depend on chemotherapy only, but also requires better insight into the ecology of these parasites and how their transmission can be impaired.

Introduction

Scope of the review

Infectious diseases are the theatre of an evolutionary arms race between pathogens and their hosts. Like the Red Queen who taught Alice to run ever faster in order to stay in the same place, most infectious microorganisms have developed intricate strategies to survive as colonists of their hosts, whereas the latter have evolved equally clever adaptations to destroy or fence off these intruders. Medical interventions and other ecological interferences inevitably lead to a new episode in the arms race between pathogens and their hosts, as shown by the rapid development of drug resistance among many pathogens. In a recent review, Andersson & Hughes (2011) discussed a particular aspect of this competition, namely the mechanisms and processes involved in the persistence of antibiotic resistance in bacterial populations and how to exploit them to increase the likelihood of reversing the problem. In this review, we extend and broaden this analysis to eukaryotic pathogens, more specifically vectorborne parasites. Major cellular and molecular differences from prokaryotes could differentially influence the natural history of resistance, from emergence to spread and persistence or disappearance. In addition, the digenetic life cycle of these parasites imposes adaptation not only to the micro-biotope in the vertebrate host, but also to the fundamentally different and drug-free micro-biotope in the invertebrate host, possibly posing an additional bottleneck for drug-resistant parasites to become successful.

Main features of the chosen parasitic models

Three different paradigms among vectorborne diseases will be illustrated by the models Plasmodium, Leishmania and Schistosoma, respectively (Fig. 1). Plasmodium spp. are protozoan parasites (Sporozoa) that reproduce asexually in their human host and have an obligatory sexual life cycle in the mosquito host. Infections with Plasmodium falciparum, the most important human malaria parasite, can be very diverse: up to five and sometimes well in excess of ten genotypes can infect a single host in high-transmission areas (Konate et al., 1999; Beck et al., 2001; Magesa et al., 2002; Sutherland et al., 2002; Juliano et al., 2010; Auburn et al., 2012). Resistance to all previous first-line treatments is already widespread (Hyde, 2005), and resistance against the only available first-line treatment that has not yet failed globally, the artemisinin derivatives, may be on the rise (Dondorp et al., 2010). The antimalarial resistance problem seems to be driven largely by the transmission of resistant parasites, as opposed to frequent emergence of new mutants in various regions (Wootton et al., 2002; Nair et al., 2003; Hastings, 2004; Mita et al., 2011). Leishmania belong to another group of Protozoa (Kinetoplastida). They reproduce predominantly asexually in both vertebrate and invertebrate hosts, but may nevertheless undergo recombination between individuals, related (endogamy) or not (Rougeron et al., 2010). Although within-host multiclonality is poorly studied, it is likely very low, considering the low infection rates among sand fly vectors (Bhattarai et al., 2009) and the strong immunity after recovery from infection (Modabber, 2010). Transmission dynamics of Leishmania are rather slow; hence, frequent emergence of resistance likely plays a major role, as confirmed by whole-genome sequencing studies (Downing et al., 2011). 
In addition, transmission may be driven predominantly by individuals not eligible for treatment, such as asymptomatically infected humans in the case of anthroponotic leishmaniasis (Stauch et al., 2012) or animals in the case of zoonotic forms. In this context, it is striking how drug resistance may persist, and fitness appears to be an essential feature to consider. Schistosoma are metazoan parasites (Trematoda) with a complex life cycle showing an obligatory alternation of sexual reproduction within the final human host and asexual reproduction within the intermediate snail host. A major conceptual difference with protozoa is the absence of replicative stages in the human host: adults live in their host's blood for up to 10 years, where the female produces about 200–1000 eggs per day that must be excreted into the environment to complete the life cycle (Gryseels et al., 2006). A single host can be infected by many different schistosome genotypes, resulting in large populations showing high genetic variation (Agola et al., 2006, 2009; Rudge et al., 2008; Thiele et al., 2008; Gower et al., 2012). Infection loads are, however, aggregated (Anderson & May, 1991), with many people having low-intensity infections that are asymptomatic and not obvious enough to warrant treatment. Despite several alarming reports of Schistosoma mansoni resistance foci against the two most commonly used drugs, praziquantel (PZQ) and oxamniquine (OXA), resistance has not spread, and there is no indication that it has become a serious public health problem (Cioli et al., 1993; Doenhoff et al., 2002; Doenhoff & Pica-Mattoccia, 2006; Wang et al., 2012). Nevertheless, OXA resistance showed stronger characteristics than PZQ resistance: OXA-resistant S. mansoni resisted very high doses of the drug (Pica-Mattoccia et al., 1993), whereas S. mansoni isolates frequently exposed to PZQ differed at most threefold in their tolerance to the drug compared with susceptible strains (Cioli et al., 2004).

Figure 1.

Schematic representation of the life cycles of Plasmodium, Leishmania and Schistosoma. The protozoan parasites Plasmodium and Leishmania are transmitted by mosquitoes of the genus Anopheles and sand flies of the genus Phlebotomus, respectively (both belonging to the phylum Arthropoda). The helminth parasite Schistosoma is transmitted by snails of the genus Biomphalaria or Bulinus (phylum Mollusca) in an aquatic environment. Besides many differences between the life cycles, the main difference between the two protozoan parasites and Schistosoma is that both protozoa have replicative stages in the host, whereas Schistosoma adults do not replicate in the host itself but only produce eggs that are expelled into the environment. One of the main differences between Plasmodium and Leishmania is that Plasmodium has sexual stages, whereas Leishmania reproduces mainly clonally.

Natural evolution of drug resistance in vectorborne parasites: a multiplicity of scenarios

We will show in this review how the interplay between the biology of the parasites, their ecology and the mode of action of the respective drugs can shape dramatically different scenarios of emergence, spread and disappearance of drug resistance. From the pathogen's point of view, we will stress the importance of genome organisation and dynamics (some structural features favouring rapid adaptation), reproduction mode (with or without sexual recombination) and the occurrence of molecular pre-adaptations (for instance to oxidative stress) upon which the pathogen can evolve drug resistance more easily. Several features of the drugs (Table 1) will be highlighted, such as their mode of action (do they interact with host protective mechanisms to which the parasite has already adapted?), the role of low and high drug pressure in the emergence and spread of resistance, and the possible interaction with environmental pressures such as arsenic pollution. The fitness of drug-resistant parasites (both in the presence and absence of the drug) will be a central point of discussion, as it plays an essential role at different levels of the natural history of drug resistance. This will be illustrated with classical examples of fitness cost and contrasted with a unique case of increased fitness encountered in natural Leishmania isolates.

Table 1. Selection of drugs against vectorborne parasites for which clinical resistance has developed

Table note (artemisinin derivatives): a reduced parasite clearance rate is observed (Dondorp et al., 2009). The mechanism is unclear; one hypothesis is resistance of only one particular life stage (ring stage) of the parasite (Witkowski et al., 2013).

Emergence of drug resistance

The emergence of drug-resistant parasites to clinically and ecologically significant levels is influenced by many factors that relate to (1) the pathogen; (2) the drug itself; and (3) the environment (Fig. 2).

Figure 2.

Summary of the factors that affect the emergence, spread and persistence of drug-resistant vectorborne parasites.

Pathogen-related factors

Pathogens can develop different pathways to become more tolerant or resistant to a drug. Drug resistance might arise not only from a change or mutation in the drug target gene, but also from mechanisms that decrease the drug–target interaction, such as changes in the permeability of the cell or detoxification of the drug through the cell's natural antioxidant defence enzymes (James et al., 2009). Which of these pathways evolves in the pathogen depends not only on the characteristics of the drug, but also on the pathogen's general biological traits, resulting in some pathogens being inherently more likely to generate drug resistance than others.

Leishmania parasites, for example, are experts in manipulating gene dosage through massive aneuploidy (Downing et al., 2011; Mannaert et al., 2012) and the formation of circular extrachromosomal episomes. These are ‘emergency toolboxes’ that can contain several genes and can be quickly up- or down-regulated in a stressful or drugged environment if needed (Leprohon et al., 2009). In addition, Leishmania has several inherent molecular traits to defend itself against oxidative stress: among others, it contains a redox cascade that depends on trypanothione, a highly efficient thiol that occurs only in Trypanosomatids (Krauth-Siegel & Comini, 2008). Developing resistance to antimonials (SSG), which cause intracellular oxidative stress and increase exogenous oxidative stress released upon the parasite through the host cell, might thus not have been too difficult for Leishmania donovani in the Indian subcontinent: boosting its inherent defensive and host manipulation skills would have been sufficient (reviewed in Vanaerschot et al., 2012). Another indication of the ease with which SSG resistance has emerged is the fact that resistance has emerged in parasites with different genetic backgrounds within a restricted endemic region (Downing et al., 2011) and that multiple SSG resistance mechanisms are circulating in the Indian subcontinent (Decuypere et al., 2012). This contrasts with the situation of Plasmodium, where high-level resistance against two of the most widely used antimalarials, chloroquine (CQ) and sulfadoxine-pyrimethamine (SP), has arisen only a handful of times for each drug. In both cases, continent-wide resistance in Africa was caused by a single selective sweep of resistant parasites originating from South-East Asia (Wootton et al., 2002; Nair et al., 2003; Hastings, 2004; Mita et al., 2011). Hence, the emergence of high-level resistance against antimalarials appears to be a rare event.

The mutation rate, the life cycle and the stage on which drug pressure is exerted also play a key role. Malaria parasites undergo many asexual replication cycles within the human host, growing from 10²–10³ injected sporozoites to 10¹¹–10¹² parasites within the bloodstream at peak parasitaemia. With a mutation rate estimated at 10⁻⁹ SNPs per generation, based on the dhfr gene (Paget-McNicol & Saul, 2001), and drug pressure coinciding with this replicating life stage of the parasite, selection for resistant mutants is strong. Sexual reproduction, and thus genetic recombination, occurs in the mosquito vector (Walliker et al., 1987). Such recombination provides a mechanism for multidrug resistance to be broken down or built up and, additionally, for resistance genes to spread in a diverse genetic background and potentially increase their fitness (Walliker et al., 2005) – perhaps a prerequisite for the observed selective sweeps of drug-resistant Plasmodium throughout endemic regions. In the case of leishmaniasis, maximal parasite loads are likely much smaller than in malaria, while the substitution rate in Leishmania is estimated to be similar to that in Plasmodium (2.74 × 10⁻⁹ substitutions per site per generation; Downing et al., 2012). However, this mutation rate could be much higher in the case of structural genomic changes such as local gene amplification. In addition, the predominantly clonal nature of Leishmania reproduction seems to favour the emergence of resistance compared with Plasmodium. In the case of Schistosoma, sexually reproducing adults are the only target of PZQ and OXA. This implies that mutations [10⁻⁴ mutations per generation for microsatellite loci (Valentim et al., 2009)] can only be generated in the nonreplicative offspring (eggs) that subsequently need to be successfully released into the environment and find a snail host while competing with wild-type offspring. 
Such a bottleneck likely constrains the establishment or emergence of drug-resistant schistosomes. On the other hand, helminth parasites usually live in large populations with a high degree of genetic variability. This may provide a basis for rapid selection, and thus emergence, of inherently drug-resistant alleles, as has already been reported for livestock helminths (Geerts & Gryseels, 2000; Prichard, 2001).
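The scale of the mutant supply in Plasmodium can be illustrated with a back-of-envelope calculation based on the figures quoted above (a within-host population of 10¹¹–10¹² parasites and a per-generation mutation rate of roughly 10⁻⁹). The sketch below is purely order-of-magnitude arithmetic, not a parasitological model.

```python
# Back-of-envelope estimate of de novo resistant-mutant supply per malaria
# infection, using the order-of-magnitude figures quoted in the text.
# These are illustrative values, not measured parameters.

mutation_rate = 1e-9             # resistance mutations per gene per replication (dhfr estimate)
peak_parasites = (1e11, 1e12)    # parasites in the bloodstream at peak parasitaemia

for n in peak_parasites:
    expected_mutants = n * mutation_rate
    print(f"population {n:.0e}: ~{expected_mutants:.0f} new mutants per replication cycle")
```

Even at these crude numbers, a single infection at peak parasitaemia would be expected to generate hundreds of single-step mutants per replication cycle, so the rarity of established high-level resistance must stem from downstream bottlenecks (fitness costs, competition, transmission) rather than from mutant supply.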

Drug-specific factors

One of the best-known factors contributing to the emergence of resistance is the half-life of the drug. This determines the period of selective pressure on parasites: drugs with a long half-life exert more selection for resistance (Watkins & Mosobo, 1993; Read & Huijben, 2009). The half-lives of antimalarial drugs range from weeks (mefloquine) to merely several hours (artemisinins) (White, 2002). Thus, the short half-life of artemisinin likely reduces the probability of resistance emergence in Plasmodium. The antileishmanial drug SSG is eliminated in two phases with half-lives of 2 and 76 h, respectively, and 80–95% of the drug is eliminated through the urine within 6 h after administration (Frezard & Demicheli, 2010). Despite this short half-life, SSG-resistant strains seem to have emerged multiple times during the decades of use (Downing et al., 2011). Miltefosine (MIL), on the other hand, has a longer half-life of around 7 days and was thus suspected to be even more prone to the emergence of resistance (Dorlo et al., 2012). Up to now, even though the clinical efficacy of the drug is reported to be declining in the Indian subcontinent (Sundar et al., 2012; Rijal et al., 2013), there is only a single report of a confirmed natural MIL-resistant Leishmania strain, isolated from a French HIV-coinfected patient who failed repeated episodes of MIL treatment (Cojean et al., 2012). The antischistosomal drugs PZQ and OXA have short half-lives of 1–3 and 1.5–2 h, respectively (Kaye & Woolhouse, 1976; Steiner et al., 1976; Kokwaro & Taylor, 1991), which possibly contributed to the rarity of resistance emergence in the field.
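The effect of half-life on the window of selection can be sketched with simple first-order elimination kinetics. In the toy calculation below, the starting concentration and the minimal concentration that still selects for resistance are arbitrary placeholders (a 100-fold ratio is assumed); only the half-lives are taken, roughly, from the text.

```python
import math

# Toy illustration of how half-life sets the window of selection: the time
# for a drug to decay from a hypothetical starting concentration to a
# hypothetical minimal selective concentration, assuming simple first-order
# elimination. Half-lives are approximate values from the text; the
# concentrations are arbitrary.

def time_above_threshold(half_life_h, c0=100.0, threshold=1.0):
    """Hours until the concentration decays from c0 to threshold."""
    k = math.log(2) / half_life_h        # first-order elimination constant
    return math.log(c0 / threshold) / k

drugs = {"artemisinin": 3, "praziquantel": 2, "SSG (slow phase)": 76,
         "miltefosine": 7 * 24, "mefloquine": 14 * 24}

for name, t_half in drugs.items():
    days = time_above_threshold(t_half) / 24
    print(f"{name:>18}: ~{days:.1f} days above the selective threshold")
```

With a fixed 100-fold concentration window, time under selective pressure scales linearly with half-life: well under a day for artemisinin or PZQ versus roughly three months for mefloquine, matching the ranking discussed above.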

The quality of the drug has also been recognised as a crucial factor in the emergence of resistance to several drugs in various pathogens: low quality reduces the concentration of the active compound and might increase toxicity, and both factors contribute to subtherapeutic dosage, either directly or indirectly by prompting the patient to interrupt or stop treatment. A low-quality treatment regimen, such as an inadequate dosage, evidently has a similar effect. However, one should also keep in mind that a high dosage increases the selective pressure for drug-resistant strains, favouring the rapid selection of full-blown resistant parasites, if present, within a parasite population in a given patient (Read et al., 2011). This particular aspect of the arms race between the pathogen and human interventions is well illustrated in the case of SSG (Haldar et al., 2011). The drug in its latest form was introduced in India in the 1940s. North Bihar (north of the Ganges) was first reported to have serious problems of decreasing SSG efficacy during the epidemic of the late 1970s. Since then, SSG treatment regimens were regularly modified by increasing both dosage and duration in an effort to counteract the gradually decreasing SSG efficacy. Nevertheless, the situation continued to deteriorate dramatically during the epidemics of the early 1990s, with treatment failure rates reaching up to 65% (Sundar et al., 2000). This continuous and gradually increasing drug pressure on the parasite population, implemented due to the lack of other affordable drugs, has likely contributed to a faster emergence and spread of SSG-resistant L. donovani on the Indian subcontinent.

Interestingly, the mode of action of the drug at hand may also be a major factor facilitating the emergence of resistance. If a drug targets general defence mechanisms of the pathogen, such as the oxidative or nitrosative stress pathways in intracellular pathogens, resistance might emerge sooner than expected. This is because oxidative and nitrosative stress are fundamental stress responses of host cells against which intracellular pathogens have developed several defence mechanisms over thousands of years of evolution. Developing resistance to such a drug might thus build on pre-existing pathways and therefore only require changes that have a limited impact on the pathogen's survival (e.g. up-regulation of oxidative radical scavengers). To date, this has only been illustrated by the case of SSG resistance in L. donovani, as mentioned earlier.

Ecological factors

Ecological factors may indirectly affect the emergence of drug resistance through their impact on transmission. It is striking, for instance, that antimalarial resistance emerges more frequently in South-East Asia, which has a much lower transmission intensity than most endemic areas in Africa. This leads to (1) a higher frequency of symptomatic cases and consequently a higher per-parasite treatment rate (i.e. higher drug pressure); (2) lower multiplicity of infection, resulting in possibly lower levels of competitive suppression of mutants; (3) a low degree of genetic recombination due to mostly clonal infections and thus a lower probability of breaking down multilocus resistance; and (4) reduced immunity, which makes mutant parasites more likely to survive the immune response (Hastings, 2004; Klein et al., 2008; Mita et al., 2011).

The environment, part of a parasite's ecology, might also play a more direct role in the emergence of drug-resistant pathogens. Impoverished and heavily industrialised areas especially suffer from environmental pollution and can often only afford cheap drugs that are sometimes based on relatively toxic metalloids, as is the case in leishmaniasis treatment. Environmental pollution with metalloids can thus expedite the emergence of resistance to metalloid drugs, or can even cause what may appear as ‘intrinsic intolerance’ of the pathogen to the drug if the environmental exposure existed long before the metalloid drug was introduced. Interestingly, Perry et al. (2011) reported that arsenic contamination of ground water, caused by shallow tube wells, geographically coincides with places where the prevalence of SSG-resistant L. donovani is high. Because arsenic and antimony are closely related metalloids and have been shown to cause similar types of stress in Leishmania, arsenic pollution of drinking water might indeed very well have contributed to the swift emergence of SSG-resistant L. donovani in these highly endemic regions in the north of India. Environmental pollution with antimony itself has also been described, and in regions where drug pressure was low or absent, Leishmania infantum strains isolated from dogs showed a reduced susceptibility to SSG (reviewed by Sereno et al., 2012). A similar situation of environmental pollution might perhaps also apply to Leishmania of the subgenus Viannia in Peru, where SSG resistance is widespread despite the presence of an untreated animal reservoir (Yardley et al., 2006), but this should be further verified. As yet, we are not aware of any similar scenarios in the treatment of Plasmodium or Schistosoma infections.

Spread of drug resistance

The spread of drug-resistant parasites is ultimately dependent on their transmission potential for which within- and between-host ecology plays a key role. This is impacted by a combination of the applied drug pressure on the whole parasite population and the competitive potential or relative fitness of drug-resistant vs. drug-sensitive pathogens (Fig. 2).

Drug pressure

Infectious diseases are traditionally treated with aggressive chemotherapy aimed at eradicating every single pathogen. While this approach maximally reduces the probability of de novo resistant mutants by drastically reducing the parasite population, it theoretically also maximises the selection on resistant mutants already present in the infection (Read et al., 2011), favouring their spread. This phenomenon has been demonstrated in a rodent malaria model (Wargo et al., 2007). Similarly, pregnant women receiving curative chemoprophylaxis in an area with high levels of resistance presented with a higher frequency of resistant parasites and exacerbated malaria infection compared with women who opted out of chemoprophylaxis (Harrington et al., 2009). One of the factors determining the strength of selection (or drug pressure) on a parasite population is the proportion of the population in refugia at the time of treatment (Van Wyk, 2001; Coles, 2002; Webster et al., 2008). Refugia are the fractions of the parasite population that are not exposed to the drug and therefore escape selection, thereby diluting resistance genes in the pathogen population. Large refugia are likely to be found (1) within human populations living in areas of high infection intensity and prevalence where chemotherapy is only administered randomly or selectively [for instance for P. falciparum malaria in many parts of sub-Saharan Africa (Warhurst & Duraisingh, 2001)]; and (2) within infested environments where no control measures are taken (Webster et al., 2008). 
Although the size of refugia and their impact on the spread or persistence of drug resistance are largely unknown for schistosomes, the refugia proportion under current treatment policies is likely high because (1) a significant proportion of infected hosts are left untreated [such as adults and preschool children (Odogwu et al., 2006)]; (2) PZQ is only given annually; (3) alternative reservoir host species might exist; and (4) many schistosome life stages survive treatment (Webster et al., 2008). Refugia might therefore explain why PZQ resistance has not (yet) become a public health problem in Africa (Doenhoff et al., 2009). In China, however, 30 years of intense PZQ pressure on the parasite population through treatment of both humans and domestic reservoirs (such as cattle and buffaloes), thereby eliminating known refugia, in combination with snail control using niclosamide, did not have an impact on the efficacy of PZQ against Schistosoma japonicum (Xianyi et al., 2005). These findings suggest that factors other than refugia, such as vector control, may counter the spread of resistance. Asymptomatically infected individuals may also serve as refugia. For L. donovani, for instance, more than 90% of infected individuals are asymptomatic (Ostyn et al., 2011) and are thus not treated. The presence of such large refugia significantly reduces the average drug pressure on the total parasite population. The proportion of drug-resistant parasites in these asymptomatic individuals is currently unknown and will depend, among other things, on pathogen fitness. Last but not least, the occurrence of local sanctuaries within the host, where parasites may have lower exposure to the drug or exhibit different dynamics compared with the systemic compartment, should also be considered.
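The diluting effect of refugia can be made concrete with a minimal two-type model, sketched below under stated assumptions: a single resistance allele, non-overlapping generations, and an arbitrary 95% kill rate of drug-exposed sensitive parasites. None of the parameter values are measured quantities.

```python
# Minimal sketch of how refugia dilute selection for resistance: the change
# in resistance frequency per generation when only a fraction of the
# parasite population is exposed to the drug. All parameters are
# illustrative, not estimates for any real parasite.

def next_freq(p, exposed, kill_sensitive=0.95):
    """One generation of selection: exposed sensitive parasites are killed
    with probability kill_sensitive; parasites in refugia escape entirely."""
    resistant = p                                      # survive regardless
    sensitive = (1 - p) * (1 - exposed * kill_sensitive)
    return resistant / (resistant + sensitive)

p0 = 0.01                                              # starting resistance frequency
for exposed in (0.9, 0.3):                             # small vs. large refugia
    freq, gens = p0, 0
    while freq < 0.5:
        freq, gens = next_freq(freq, exposed), gens + 1
    print(f"exposed fraction {exposed}: resistance passes 50% in {gens} generations")
```

In this toy model, enlarging the refugia from 10% to 70% of the population stretches the rise of resistance from 3 to 14 generations: refugia do not stop selection, they slow it, which is consistent with resistance remaining below public-health significance under patchy treatment.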

Fitness cost

Besides drug pressure, the spread of drug-resistant pathogens is highly dependent on how well they can compete with wild-type drug-sensitive pathogens in the total population, that is, their fitness. We define fitness as the capacity of organisms to survive, reproduce and be transmitted between hosts in a given environment (Vanaerschot et al., 2012) – which summarises all that is needed to complete their life cycle. Organisms that acquire resistance to a drug usually pay a metabolic price in return, which often only becomes apparent when drug pressure is alleviated. This cost is not a simple genetic trade-off but plays out differently depending on the ecological context (reviewed in Read & Huijben, 2009). For instance, a metabolic cost paid in the absence of competition may increase greatly when a competitor is around that is not bound to this cost. This is especially true for drug-resistant vectorborne parasites, which must survive in the nondrugged environment of the vector, whereas directly transmitted pathogens circulate only within (possibly) drug-treated hosts. The metabolic cost of resistance that plays out in the vector can thus significantly impact between-host transmission. This fitness aspect of drug-resistant pathogens at the population level determines how fast they will overtake drug-sensitive pathogens in a certain geographical region at a given time, depending on the level of treatment or drug pressure. These concepts are well illustrated by mathematical modelling and experimental studies in the three parasites considered here.

In Leishmania, a recent mathematical modelling study (Stauch et al., 2012) predicted that the increase in SSG treatment failure rates cannot be explained by assuming that all SSG-resistant L. donovani strains contribute to SSG treatment failure in the patient, suggesting that SSG-resistant L. donovani must have a greater fitness than SSG-susceptible L. donovani. This fitness advantage could be disease related, by causing more clinical cases (higher pathogenicity) or more severe disease (higher virulence), and/or transmission related, by increasing the transmissibility from sand flies to humans or vice versa. The hypothesis of increased fitness – in the absence of the drug – was explored in in vitro and rodent models, which revealed that SSG-resistant L. donovani indeed shows traits of a higher fitness (reviewed in Vanaerschot et al., 2012). Although this should be further confirmed by fitness studies in the vector itself, this higher fitness of natural drug-resistant L. donovani in the absence of drug pressure seems rather unique. Due to the mainly clonal reproduction of L. donovani, the drug-resistant phenotype with its associated traits has an even greater chance to persist and spread compared with drug-resistant organisms that undergo sexual reproduction and might lose the drug-resistant phenotype in the process, such as Plasmodium and Schistosoma. Plasmodium has a different ecology that may differentially impact the persistence and spread of drug-resistant strains. Malaria parasites frequently share their host with other genotypes, and biodiversity may be high within an infection. These multiclonal infections arise more frequently from mixed inocula than from superinfections by multiple mosquitoes (Nkhoma et al., 2012). Competitive suppression of resistant parasites is thus expected to be strong in the absence of drug treatment. 
Such competition could be resource-based exploitation competition, interference competition or immune-mediated apparent competition (Read & Taylor, 2001) and is dependent upon the cost of resistance. The stronger the competitive suppression of resistant parasites, the harder it is for them to establish in a population with a low treatment rate. Studies on the fitness of drug-resistant Plasmodium all show an advantage for drug-resistant lines in a rodent model when drug pressure is applied, but a significant fitness cost in the absence of drug pressure (Peters et al., 2002; de Roode et al., 2004; Hayward et al., 2005; Wargo et al., 2007; Huijben et al., 2010, 2011) (reviewed in Babiker et al., 2009). Data on how the resistant phenotype plays out in the vector are scarce: only one study has looked for, and found, a cost of resistance in the mosquito vector (Shinondo et al., 1994). However, a resistant phenotype may also play out favourably within the vector. CQ-resistant pfcrt mutants had a significant selective advantage in competitive mosquito infections in the presence of CQ, which resulted in enhanced infectivity to mosquitoes. Such higher infectivity to mosquitoes may have been an important driver of the worldwide spread of mutant pfcrt (Ecker et al., 2011). In Schistosoma, models of selection and evolution have shown that the change in frequency of resistant strains depends on the interplay between their relative fitness and the degree of selective pressure (Feng et al., 2001; Xu et al., 2005). The rationale of these models is that natural schistosome populations consist of a collection of strains that express different levels of drug resistance and that parasites pay a cost of drug resistance that is inversely related to the level of resistance. 
It was predicted that drug treatment allows the coexistence of multiple parasite strains with many different levels of resistance (from susceptible to highly resistant), while exclusion of susceptible strains will only happen when the treatment rate is high enough. A cost of resistance has been experimentally described for OXA-resistant worms: they appeared to be slightly less viable than their sensitive counterparts at all stages of the life cycle due to the lack of a functional sulfotransferase (Cioli et al., 1992). This cost of resistance could potentially explain why OXA resistance remained restricted to sporadic foci without any apparent tendency to spread in the human population (Cioli et al., 1993; Secor & Colley, 2005). The few experimental studies performed on PZQ-resistant worms suggest contrasting results depending on the life stage. On the one hand, PZQ-resistant schistosomes exhibited reduced cercarial output compared with control strains (Liang et al., 2001; William et al., 2001). On the other hand, PZQ-resistant isolates from Senegal showed a higher infectivity of cercariae to snails, a longer prepatent period within the snails, a higher longevity of infected snails, and significantly more eggs in the faeces and tissues of mice infected with PZQ-resistant isolates (Liang et al., 2001). Although these observations should be interpreted with care and much more research is urgently needed, these findings indicate that a fitness cost of resistance at a certain point in the life cycle (e.g. low numbers of cercariae shed by snails) might be counterbalanced by fitness benefits elsewhere in the cycle (e.g. higher infectivity of cercariae to snails or higher egg production). Such compensatory fitness changes have also been described in bacteria, although more at a molecular level (reviewed in Andersson & Hughes, 2011). 
The cost of resistance in both the vertebrate host and the invertebrate vector thus plays an important role in the spread of resistance in vectorborne parasites. Above all, it is the specific within-host ecological context of each disease system that determines how this cost translates into the evolution of resistance.
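The interplay between relative fitness and selective pressure that these selection models describe can be illustrated with a minimal sketch. This is a purely hypothetical toy model, not the published Feng/Xu models: a single resistant strain that always pays a fitness cost competes with a sensitive strain that is suppressed only in treated hosts, and all parameter values below are illustrative assumptions.

```python
def resistant_frequency(p0, cost, drug_kill, treated_fraction, generations):
    """Track the frequency of a resistant strain over discrete generations.

    Resistant strain: fitness 1 - cost in every host (the cost is always paid).
    Sensitive strain: fitness 1 in untreated hosts, 1 - drug_kill in treated
    hosts, averaged over the fraction of hosts that receive treatment.
    """
    p = p0
    trajectory = [p]
    for _ in range(generations):
        w_r = 1.0 - cost
        w_s = (1.0 - treated_fraction) + treated_fraction * (1.0 - drug_kill)
        mean_w = p * w_r + (1.0 - p) * w_s
        p = p * w_r / mean_w  # standard haploid selection update
        trajectory.append(p)
    return trajectory

# Under strong drug pressure, resistance spreads despite its cost...
with_drug = resistant_frequency(0.01, cost=0.05, drug_kill=0.9,
                                treated_fraction=0.6, generations=60)
# ...and declines once treatment is withdrawn, mirroring the retreat of
# chloroquine resistance after first-line policy changes.
no_drug = resistant_frequency(0.5, cost=0.05, drug_kill=0.9,
                              treated_fraction=0.0, generations=60)
```

Varying `treated_fraction` reproduces the qualitative prediction above: below a critical treatment rate the costly resistant strain cannot exclude susceptible ones, while high treatment rates drive it towards fixation.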

Persistence of drug resistance

Whether or not drug-resistant pathogens will persist relies upon their fitness compared with wild-type drug-sensitive pathogens in drug-free conditions and their ecological context (Fig. 2). The greater fitness of SSG-resistant L. donovani, demonstrated by various in vitro and in vivo studies (Vanaerschot et al., 2012), will likely cause SSG-resistant L. donovani to persist in the natural parasite population in the Indian subcontinent. This is supported by epidemiological data indicating that SSG-resistant parasites are still highly prevalent in the endemic regions of India and Nepal (Mukhopadhyay et al., 2011; Vanaerschot M and Dujardin JC, unpublished results), despite the drop in SSG pressure on the parasite population now that SSG is no longer the official first-line treatment against VL (from 2006 onwards). However, a minority of SSG-resistant strains did not show traits of a higher fitness and, concomitantly, the prevalence of their genotypes was lower in recently isolated samples, suggesting the disappearance of these less successful SSG-resistant genotypes (Vanaerschot M and Dujardin JC, unpublished results). In the case of malaria, anecdotal evidence from natural infections suggests that drug-resistant P. falciparum parasites do indeed pay a cost. In countries where chloroquine was replaced as first-line treatment, resistant parasites were quickly replaced by an expanding population of susceptible strains (Wang et al., 2005; Laufer et al., 2006, 2010; Temu et al., 2006). Additionally, in areas of seasonal malaria transmission, the frequency of resistance markers drops every year during the long dry season, only to increase again during the wet season when high transmission and drug usage coincide (Abdel-Muhsin et al., 2004; Babiker et al., 2005; Ord et al., 2007).
Similar observations can be made within a patient on a shorter timescale: at the time of delivery, the frequency of resistance was lower in pregnant women with a longer interval since their last drug treatment than in women who had received their last dose of intermittent preventive treatment less than 2.5 months earlier (Menendez et al., 2011). In all these cases, the frequency of resistant Plasmodium drops when selective pressure by the drug is taken away, suggesting a significant fitness cost in the absence of treatment and the disappearance of drug-resistant Plasmodium once the natural pathogen population is relieved from drug pressure. In schistosomiasis, there are almost no epidemiological data on the disappearance of PZQ resistance. The previously mentioned lower reproductive success in the intermediate snail host for S. mansoni from Egypt (William et al., 2001) and Senegal (Liang et al., 2001) likely translates into a fitness cost. Some of the PZQ-resistant isolates from Egypt even reverted to an increased state of PZQ susceptibility in the absence of drug pressure (William et al., 2001). The instability of some of these PZQ-resistant isolates could potentially explain why PZQ-resistant worms in Egypt have not become a more significant portion of the schistosome population despite 10 years of therapeutic pressure in villages where these strains were previously identified (Ismail et al., 1999; Botros et al., 2005). Furthermore, data from in vitro studies and from observational studies in one geographical area or one particular host population can only be reliably extrapolated when the ecological context, such as immune status, level of within-host competition and transmission intensity, is also understood.

Conclusions and recommendations

This review demonstrated the multiplicity of scenarios that occur during the natural history of drug resistance in vectorborne parasites. This clearly results from the biological diversity of the different organisms presented here (such as their genome organisation, reproduction mode and life-cycle characteristics), but also from the different modes of action of the drugs used (targeting vulnerable points of their metabolism or key natural defences of the pathogens and thus further boosting pre-existing adaptations), their own specific within- and between-host ecology, and environmental factors acting directly (e.g. chemical pollution) or indirectly (through an effect on transmission) on the parasites (Fig. 2). As a consequence, factors such as multiple emergence and transmission of resistant parasites may be of differential importance for parasites to successfully establish drug resistance in a population.

A key element in the spread, persistence and disappearance of resistance is clearly the fitness of the drug-resistant parasites, which can range from a biological cost (the most frequent situation in microbiology), through almost no cost (whether or not achieved by compensatory changes), to a fitness benefit, as in SSG-resistant Leishmania, where the drug seems to have increased parasite fitness. An accurate assessment of the fitness of drug-resistant parasites is therefore crucial, but such studies are not routinely performed. Although the control of drug-resistant parasites can only move forward through a full understanding of their biology, pathogen fitness studies have mostly focused on in vitro or rodent models, while studies in the vector itself are virtually lacking. Studying the parasite's behaviour in its natural vector is, however, laborious and difficult, owing to the specific breeding environment that these vectors require and the difficulty of achieving reliable in vivo infections in the vector. Nevertheless, mosquito infections with Plasmodium parasites can be achieved using either a rodent malaria model (Sinden et al., 2007; Spence et al., 2012) or transmission of in vitro cultured P. falciparum (Lensen et al., 1999). Also for Leishmania, a recently developed model of visceral leishmaniasis in hamsters through bites of sand flies opens new doors for a more accurate assessment of the fitness of drug-resistant Leishmania (Aslan et al., 2013). Although rodent–snail models are well developed for schistosomes, with several laboratories around the world maintaining the life cycle successfully, experimental work on the fitness of drug-resistant strains in both hosts remains scarce (Liang et al., 2001; William et al., 2001).

There are also many aspects of the evolution of drug resistance that are not yet fully understood. This is most definitely true for the role of asymptomatically infected individuals, who often outnumber clinical cases. Mathematical models can overcome such data gaps by in silico assessment of the differential outcome when changing specific parameters related to reservoir functions of certain animal and/or human populations. Such models may also be adapted to better understand what it takes for drug-resistant parasites to spread further or to disappear. However, they can only be developed based on available data from accurate in vitro, in vivo and field studies. Reciprocally, they can feed experimental research by proposing new hypotheses.
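As a purely illustrative example of such in silico parameter exploration (a toy model, not one from the cited literature), the sketch below varies the fraction of infections that become clinical, and are therefore treated, in a simple SIS-style system with an untreated asymptomatic reservoir. All compartment names and parameter values are hypothetical assumptions.

```python
def steady_infected(beta, frac_clinical, recovery, treatment,
                    days=5000, dt=0.1):
    """Euler-integrate a toy SIS-style model and return the endemic
    infected fraction.

    Susceptibles (s) become asymptomatic carriers (a), who transmit but are
    never treated, or clinical cases (c), who are additionally cleared by
    treatment at rate `treatment`. Recovery returns hosts to s.
    """
    s, a, c = 0.99, 0.005, 0.005
    for _ in range(int(days / dt)):
        force = beta * (a + c) * s  # force of infection on susceptibles
        da = force * (1 - frac_clinical) - recovery * a
        dc = force * frac_clinical - (recovery + treatment) * c
        s += (-force + recovery * a + (recovery + treatment) * c) * dt
        a += da * dt
        c += dc * dt
    return a + c

# With mostly clinical (treated) infections, treatment suppresses prevalence;
# a mostly silent reservoir sustains transmission despite the same drug effort.
mostly_clinical = steady_infected(beta=0.3, frac_clinical=0.9,
                                  recovery=0.05, treatment=0.2)
mostly_silent = steady_infected(beta=0.3, frac_clinical=0.1,
                                recovery=0.05, treatment=0.2)
```

Sweeping `frac_clinical` in this way mimics how such models probe the reservoir function of asymptomatic carriers: the same treatment effort yields very different endemic levels depending on how much transmission escapes treatment.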

Given the potential impact of drug-resistant parasites on the future epidemiology of infectious diseases, as exemplified by the fitter SSG-resistant L. donovani, the ease with which resistance can emerge and the effect it may have on parasite fitness (studied by inducing resistance in vitro or in vivo) should be assessed as a standard part of drug discovery pipelines. Such studies should also pay particular attention to testing strains that are relevant to the parasite population in the field, including clinical isolates that may be resistant to previously used drugs. Nevertheless, drug-resistant parasites currently emerge faster than new drugs come out of drug discovery pipelines. It is therefore of the utmost importance to protect current treatment options. Including treatment outcome and drug resistance monitoring work packages in control programmes is imperative to detect the emergence of drug-resistant parasites in a timely fashion and to adapt treatment policies if necessary. This is well exemplified by the Worldwide Antimalarial Resistance Network (Price et al., 2007) and its new cooperative effort with the World Health Organisation (Sibley et al., 2010), and by the Kaladrug-R project, which assesses the long-term efficacy of miltefosine for the treatment of visceral leishmaniasis while including resistance monitoring (www.leishrisk.net/kaladrug; Rijal et al., 2013).

As for many viral and bacterial diseases, combination therapy is becoming the norm for vectorborne parasites such as malaria. For leishmaniasis and schistosomiasis, standard treatment currently relies on a single drug, although several combination treatments have shown success in humans for visceral leishmaniasis (Sundar et al., 2011). For schistosomiasis, more research on optimal combinations is required, because some tested combinations proved to be no more effective than PZQ alone (Obonyo et al., 2010); further combinations are being assessed in mouse models (el-Lakkany et al., 2011; Keiser et al., 2011; Xiao et al., 2011). Combination therapies can indeed reduce the probability of de novo drug-resistant mutants and likely increase the useful lifespan of the current drugs. However, they are not a panacea: recent experimental work showed that resistance of L. donovani against drug combinations could be obtained after 10 weeks of induction (Garcia-Hernandez et al., 2012), and the efficacy of combinations in the field should be monitored as carefully as monotherapeutic schemes. Furthermore, combinations should only be implemented with drugs that specifically target the parasite species in question: the proposed combination therapy of PZQ and artemisinins for the treatment of Schistosoma infection raises concerns about inducing drug resistance against artemisinins in Plasmodium in regions where both species coexist (Keiser & Utzinger, 2007; Utzinger et al., 2007).

Initiatives for developing new drugs must be complemented by efforts to understand the forces that drive the spread of resistance and so ensure adequate treatment of infectious diseases in the future. Integrated control of drug-resistant vectorborne diseases is not dependent upon chemotherapy only, but also requires a better insight into the ecology of these parasites and how their transmission can be impaired. One option would be to reduce human–vector contact after drug exposure and thereby prevent any newly emerged drug-resistant parasites from spreading. For instance, in the case of malaria, where transmission of resistant parasites is the predominant problem, drug resistance would be much better controlled if drug-treated individuals used a bednet during the initial period following treatment.

In conclusion, the biological complexity behind resistance of vectorborne parasites differs from one parasite to another because it depends on their inherent adaptive capacity to generate drug resistance and on the impact of the resistant phenotype on their fitness throughout their complex life cycles. As our analyses make clear, each disease system has unique features and requires a unique resistance management approach. Careful monitoring of the emergence of resistant parasites may allow timely action to prevent their further transmission and safeguard the efficacy of current treatment options. To prevent future treatment options from being compromised by resistant pathogens, drug design studies should consider the ease or difficulty with which resistance can emerge and spread, incorporating the factors discussed above, when assessing what makes the best drug with the greatest potential for long-term control of infectious diseases.

Acknowledgements

We thank Prof. Dr D. Cioli for his clarifications on OXA resistance. This work is supported by the European Commission (Kaladrug-R, FP7 grant 222895), the Belgian Science Policy Office (TRIT, contract P7/41), The Flemish Fund for Scientific research (contracts G.0B81.12 and G.0552.10), the Flemish Ministry of Sciences (ITMA SOFI-B GeMInI) and the National Institute of General Medical Sciences (R01GM089932). F.V.D.B. is a doctoral fellow of the Flemish Inter-University Council (VLADOC). The authors have no conflict of interest to declare.

Doenhoff MJ & Pica-Mattoccia L (2006) Praziquantel for the treatment of schistosomiasis: its use for control in areas with endemic disease and prospects for drug resistance. Expert Rev Anti Infect Ther 4: 199–210.

Rijal S, Ostyn B, Uranw S et al. (2013) Increasing failure of miltefosine in the treatment of kala-azar in Nepal and the potential role of parasite drug resistance, re-infection or non-compliance. Clin Infect Dis 56: 1530–1538.