Within-Host Competition Slows Evolution of Drug Resistance in Malaria

Most cases of malaria can be cured with a simple course of antimalarial drugs, but many of these drugs have lost their effectiveness because the malaria parasite (Plasmodium falciparum) has evolved resistance to them. Only one class of drugs (artemisinin-based combination therapies, or ACTs) remains broadly effective against malaria.

However, resistance to ACTs has already developed in Southeast Asia, and if history is any guide, it is only a matter of time before the same thing happens in Africa. Once that happens, it is likely that malaria cases – and malaria deaths – will increase unless a replacement drug is found.

Drug resistance may be unavoidable, but a great deal of research has been undertaken to learn how to delay its emergence, slow its spread, and work around it. Part of this work endeavors to understand what conditions favor the spread of resistance and why. Sometimes the conditions are obvious: for instance, resistance evolves faster when drugs are used more frequently, which is why efforts have been made to (moderately) restrict the availability of antimalarial drugs and ensure their appropriate use.

Other conditions that influence the spread of resistance are harder to understand. Over the last 60 or so years, antimalarial drug resistance has independently emerged numerous times, but almost always in low-transmission settings, most often in Southeast Asia and South America. Despite the fact that 90% of P. falciparum cases occur in sub-Saharan Africa, drug resistance has seldom emerged there. The predominant drug-resistant lineages in Africa actually originated in Southeast Asia and eventually made their way across the African continent via gene flow.

It remains something of a mystery why drug resistance is less likely to evolve in high-transmission settings. In our recent paper, “Within-host competition can delay evolution of drug resistance in malaria,” published in PLOS Biology, we explored how competition with existing drug-sensitive strains of malaria can inhibit the emergence of new, drug-resistant strains in high-transmission areas.

In low-transmission settings, the spread of the malaria parasite is limited by vector availability, such that much of the population remains uninfected despite not being immune to the parasite. In high-transmission settings, however, malaria vectors are abundant, such that most of the population is infected most of the time. In this latter scenario, the parasite is limited by host availability.
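For a rough sense of the difference, a generic SIS-type calculation (not the model from our paper; the values of beta and gamma below are illustrative assumptions) shows how transmission intensity determines the supply of uninfected hosts:

```python
# Rough SIS-style illustration (not the model from the paper) of how
# transmission intensity sets the supply of uninfected hosts. The values of
# beta (transmission rate, a proxy for vector abundance) and gamma (recovery
# rate) are illustrative assumptions.

def equilibrium_prevalence(beta: float, gamma: float) -> float:
    """Endemic equilibrium of the SIS model dI/dt = beta*I*(1 - I) - gamma*I."""
    return max(0.0, 1.0 - gamma / beta)

for label, beta in [("low transmission", 0.15), ("high transmission", 2.0)]:
    prevalence = equilibrium_prevalence(beta, gamma=0.1)
    print(f"{label}: infected ~{prevalence:.0%}, uninfected ~{1 - prevalence:.0%}")
```

With these example numbers, roughly two-thirds of hosts remain uninfected in the low-transmission setting, versus about one in twenty in the high-transmission setting.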

Intuition suggests that high-transmission settings might be unfavorable for the emergence of drug-resistant strains because, like an overcrowded garden, they offer little "fertile ground" in which a new strain can take root. Most of the population is already occupied by drug-sensitive strains, which have a significant advantage in numbers. Unless high levels of antimalarial drug use clear out enough hosts for resistant parasites to colonize, emerging drug-resistant strains may go extinct before they can spread.

In our study, we used an individual-based model to simulate the introduction of drug-resistant strains into different transmission settings. The model backed up our intuition: unless rates of antimalarial drug use were very high, drug-resistant strains went extinct more often, and more quickly, in high-transmission settings than in low-transmission settings, suggesting that they failed to gain a foothold in populations where the majority of hosts were already infected.
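For readers curious about the structure of such an experiment, here is a deliberately simplified, self-contained sketch. It is not the published model, which tracks parasite dynamics within each host; here, within-host competition is caricatured by a single rule (a strain can take hold only in an uninfected host), and all parameter values are illustrative assumptions:

```python
# Toy individual-based simulation: introduce one resistant mutant into host
# populations at different transmission intensities and count how often its
# lineage survives. A crude sketch, not the published model; the colonization
# rule and all parameter values are illustrative assumptions.
import random

def resistant_lineage_survives(beta, n_hosts=300, treat_prob=0.1,
                               clear_prob=0.05, steps=400, seed=0):
    """Introduce one resistant mutant after burn-in; report lineage survival."""
    rng = random.Random(seed)
    # Host states: 'S' uninfected, 'W' drug-sensitive infection, 'R' resistant
    hosts = ['W'] * (n_hosts // 2) + ['S'] * (n_hosts - n_hosts // 2)
    for t in range(steps):
        if t == 100:
            # The mutant appears in a random host, but takes hold only if that
            # host is uninfected -- a crude stand-in for within-host competition.
            j = rng.randrange(n_hosts)
            if hosts[j] == 'S':
                hosts[j] = 'R'
        n_w, n_r = hosts.count('W'), hosts.count('R')
        if t > 100 and n_r == 0:
            return False  # resistant lineage went extinct (or never took hold)
        for i, state in enumerate(hosts):
            if state == 'S':
                # Only uninfected hosts can be colonized (same toy rule).
                if rng.random() < beta * n_w / n_hosts:
                    hosts[i] = 'W'
                elif rng.random() < beta * n_r / n_hosts:
                    hosts[i] = 'R'
            elif state == 'W':
                # Sensitive infections end by treatment or natural clearance.
                if rng.random() < treat_prob + clear_prob:
                    hosts[i] = 'S'
            elif rng.random() < clear_prob:
                # Resistant infections end by natural clearance only.
                hosts[i] = 'S'
    return True

for label, beta in [("low transmission", 0.3), ("high transmission", 1.5)]:
    survived = sum(resistant_lineage_survives(beta, seed=s) for s in range(100))
    print(f"{label}: resistant lineage established in {survived}/100 runs")
```

Even in this toy version, most introductions fail outright in the high-transmission setting, because the resistant mutant usually appears in a host already occupied by a sensitive strain; the published model adds the within-host dynamics and treatment effects that shape this outcome in full.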

Our results suggest that it may take longer for drug resistance to emerge (become established in the population) in high-transmission settings, because many drug-resistant mutants may go extinct before one finally takes root. But once it does emerge, our simulations show that the spread of resistance (the rate at which drug-resistant strains take over the population) may actually be more rapid in high-transmission areas than in low-transmission ones.

These predictions – of delayed emergence but rapid spread of drug resistance in high-transmission settings – are consistent with the history of resistance to chloroquine (a synthetic antimalarial that was widely used in the second half of the twentieth century). Chloroquine was introduced in the 1950s, and chloroquine-resistant strains emerged in Asia and South America by the 1960s. Chloroquine remained effective throughout Africa, however, until resistant strains appeared in eastern Africa circa 1978; a decade later, chloroquine resistance had reached virtually every part of the continent.

Our results suggest that the evolution of drug resistance in high-transmission settings is characterized by a "tipping point" that is not present in low-transmission settings. Our model indicates that drug resistance struggles to establish in high-transmission settings, but spreads easily in those same settings once it overcomes the establishment hurdle. Our hope is that these insights may lead to new approaches for prevention and containment of drug resistance in high-transmission settings, either by exploiting existing mechanisms that inhibit the emergence of resistance or by strategically targeting factors that facilitate the spread of resistant strains. With no new antimalarial drugs available, preserving the efficacy of the drugs we have now is a vital part of ongoing efforts to reduce the global burden of malaria.

These findings are described in the article entitled "Within-host competition can delay evolution of drug resistance in malaria," recently published in the journal PLOS Biology. This work was conducted by Mary Bushman, Rustom Antia, and Jacobus C. de Roode from Emory University, and Venkatachalam Udhayakumar from the Centers for Disease Control and Prevention.