Abstract

As part of a mid-1940s malaria research program, U.S. Public Health Service researchers working in South Carolina chose to withhold treatment from a group of subjects while testing the efficacy of a new insecticide. Research during World War II had generated new tools to fight malaria, including the insecticide DDT and the medication chloroquine. The choices made about how to conduct research in one of the last pockets of endemic malaria in the United States reveal much about prevailing attitudes and assumptions with regard to malaria control. We describe this research and explore the ethical choices inherent in the tension between environmentally based interventions and the individual health needs of the population living within the study area. The singular focus on the mosquito and its life cycle led some researchers to view the humans in their study area as little more than parasite reservoirs, an attitude fueled by the frustrating disappearance of malaria just when the scientists were on the verge of establishing the efficacy of a powerful new weapon against the disease. This analysis of their choices has relevance to broader questions in public health ethics.
