The Forever Enemy
August 27, 2009 |  by Michael Anft


The term “malaria” comes from Italy, where four popes, several dozen cardinals, the painter Caravaggio, and millions of others had died from it by the 17th century. Around that time, Jesuit priests returning from missions in South America first brought word of a potential cure: the bark of the Andes Mountains cinchona tree, which relieved the disease’s telltale chills, sweats, and high fevers. Mal’aria is Italian for “bad airs”—a reflection of the medieval belief that wafts from fetid swamps brought the disease with them. The mosquitoes that bred there and in brackish waters elsewhere wouldn’t become suspects for another couple of centuries. And “Jesuit bark,” as it became known later in England, wouldn’t remain effective for much longer than that—a foreshadowing of malaria’s ability to outpace any remedies humankind has devised to combat it.

By the time quinine— the name of the crystalline alkaloid in cinchona bark— was sold on the Italian street for a premium, humanity was well acquainted with malaria’s ceaseless march around the world. Rome had been ravaged. Egyptian royalty had suffered from it (or so their mummified remains tell us). Alexander the Great had succumbed to the disease, and the armies of Genghis Khan had been stopped by it on their westward march. Malaria forced Christopher Columbus to cancel one of his New World explorations. While tropical areas harbored malaria, so did temperate zones and sub-Arctic realms—anyplace where mosquitoes could follow sizable populations of humans as they felled forest trees and turned to organized, labor-intensive agriculture, or opened up new mining colonies. The insects proved especially fond of reproducing and lurking on the edges of irrigation channels and in standing pools of water that pockmarked primitive farms.

In the insectary, caged mosquitoes feed on Plasmodium-laden blood.

But for all of the mosquito’s climatic adaptability, scientists peg malaria’s origins to the tropics, specifically to sub-Saharan Africa. As people built farming settlements in tropical forests sometime between 4,000 and 10,000 years ago, Plasmodium falciparum emerged full bore. Higher concentrations of humans and A. gambiae were a boon to the evolution of the parasite, which needs large populations of both to continue its reproduction cycle. The parasite has developed a relatively complex genome due to its talent for rapid reproduction—it replicates itself into thousands of new parasites every 14 days. While the virus that causes polio features 11 genes, Plasmodium has nearly 5,000, the consequence of being a hardy survivor.

In subsequent millennia, as malaria mosquitoes spread across the globe, the world’s great civilizations and tiniest island outposts suffered. Eventually, European settlers and slave ships brought malaria to the New World. Over the centuries, a handful of U.S. presidents— George Washington, Abraham Lincoln, Ulysses S. Grant, Theodore Roosevelt, and John F. Kennedy— would become infected. During the 1800s, Washington, D.C., was so stricken with malaria that a physician proposed ringing the area near the Potomac River with a screen of wire mesh. The disease would afflict one-quarter of all soldiers during the Civil War.

By 1897, when Sir Ronald Ross had proved that the disease was misnamed and that mosquitoes provided malaria with its delivery system, improvements in agricultural practices and mechanization in the northern United States had largely wiped out the disease. The South, home to human-intensive labor on unimproved tracts of land, would have to wait years for DDT to arrive and continued to battle malaria through the 1940s.

As would Americans abroad. When soldiers were infected en masse during the war, stateside scientists led an effort to increase the potency of anti-malarial drugs worldwide—and Johns Hopkins played a major role. A team of malaria researchers formed at the university in 1938 had blossomed four years later into the lead office of the national Survey of Anti-Malarial Drugs. Financed by the National Academy of Sciences, Hopkins scientists tested new compounds and cataloged thousands of potential remedies sent in by researchers from around the country. Survey researchers recorded 13,000 compounds before reaching the conclusion that a drug concocted by a German scientist in 1934 worked on Plasmodium-infected ducklings. The remedy, called chloroquine, had been obscured by the confusion and secrecy of wartime but was uncovered soon enough to aid in post-war reconstruction efforts and beyond.

Drug improvements, the spraying of DDT, innovations in swamp drainage, better overall health, and better community and farm planning led to the eradication of malaria throughout the United States and Europe shortly following the war’s end.

But history teaches us that ridding the world of malaria isn’t as easy as all that. If, as Dickens once posited, the poor will always be with us, then so will malaria. Subsistence farming practices, poor and often unscreened housing, and a lack of health care and money with which to buy the best available drugs benefit Plasmodium falciparum at the expense of its hosts. People with poor nutrition or anemia suffer disproportionately, as do pregnant women and their babies. Money from the rich West to fight it has always been hard to come by. Pharmaceutical companies, driven more by profit than altruism, traditionally haven’t exactly made curing malaria a priority. Foreign aid too often has been inconsistent or paltry. One of the worst things that ever happened with the disease, some researchers complain, is that industrialized countries had beaten it back on their own turf.

In the 1960s and 1970s, however, international aid groups and health organizations had garnered enough Western support to send chloroquine treatments by the millions to Africa. Then, Plasmodium falciparum and a second form, Plasmodium vivax, predictably and inexorably mutated yet again. Drug-resistant strains of the parasites spread from village to village, country by country, reinfecting swaths of land not only in Africa but Asia as well. As soldiers fighting the war in Vietnam fell prey, governments on all sides began programs to discover new ways to stop malaria’s spread. The U.S. Army enlisted a group from Walter Reed Army Hospital in Washington, D.C., to develop new, synthetic anti-malarial drugs. On the other side of the ideological spectrum, Ho Chi Minh, the Communist leader of North Vietnam—and a malaria sufferer—appealed to Mao Tse-Tung for help. In what would become known as Project 523, the Chinese government enlisted scientists to focus their research on an age-old herb called Qing Hao (sweet wormwood) that Taoist philosopher Ge Hong had first recommended in A.D. 300 as a cure for malarial fevers.

Artemisinin, the active ingredient in Qing Hao, proved effective against even the most virulent strains of malaria. Unlike chloroquine, which often caused severe stomach cramps and bone pain, and quinine, whose users frequently suffered irreparable hearing loss, the drug is gentle and long-lasting. It came to the attention of doctors from the World Health Organization (WHO) in the 1980s, when Khmer Rouge soldiers in Cambodia who had taken it remained malaria-free despite long jungle stints.

The rest of the world caught on. But because of Plasmodium falciparum’s chameleon-like ability to change itself, scientists at WHO and elsewhere decided it would be best to use the drug in combination with other anti-malarial medications. These so-called “partner drugs” proved effective—until recently. The hardiness of the malaria parasite has struck again—this time at the Cambodian/Thai border, the place where ACTs (artemisinin-combination treatments) were first used, and where resistance is beginning to show itself again.

ACT resistance could cause a worsening of an already untenable situation in Africa, where the disease continues to rage. Johns Hopkins scientists believe they’ve made a breakthrough in making a future generation of those drugs as strong as possible (see “Bringing More Firepower to the Fight,” page 46). Other researchers are targeting the parasite’s reproductive cycle with a vaccine that, in early trials, has stopped the disease from spreading in Kenyan baboons (see “Taking a Shot at a Cure,” page 45). And a third group is mounting an attack from a sharply different angle: If the parasite’s ever-mutating genes present a compelling scientific problem, could changing the genes of one of its hosts keep it from reproducing? Could the answer to the misery wrought by malaria be found in the genes of the mosquito?