CHAPTER 13
Conceptual and Normative Dimensions of Toxicogenomics
ANDREA O. SMITH AND JASON SCOTT ROBERT

Toxicogenetics and toxicogenomics are widely seen to offer considerable promise for environmental risk assessment and rational, knowledge-based environmental policy. As the proponents of such research clearly understand, “it is only through the development of a profound knowledge base that toxicology and environmental health can rapidly advance” (Waters, Olden, and Tennant, 2003, p. 349). But what will constitute this “profound knowledge base”? For some, it will consist of molecular knowledge: knowledge about genetic polymorphisms, the molecular signatures of chemicals, and the chemical interactions between genes and toxicants. This would appear to be the perspective of the U.S. National Institute of Environmental Health Sciences, as evidenced by the Environmental Genome Project, the National Center for Toxicogenomics, and related efforts.

But without an appropriate context through which to interpret this molecular knowledge, research in toxicogenetics and toxicogenomics may not be able to make good on its promise. From our perspective, this appropriate context includes knowledge about organismal development, ecology, and politics, among other things. In part, this is because the molecular level does not force itself on us in risk assessment and environmental protection. Scientists and regulators make choices as to whether and how to use molecular data. These are choices with multiple normative dimensions, opportunity costs, and social and scientific consequences.

In this chapter, we explore these issues, beginning with a discussion of the recent history of the use of genomic techniques and data in toxicology. Drawing on the tools of the philosophy of science, particularly regarding the identification of conceptual assumptions and the justification of research methods, we discuss both plainly valid though underdeveloped uses of toxicogenomics (such as DNA microarrays for detecting toxins in biosamples) and some less plainly valid though more common uses. For the authors of many chapters in this volume, their cup runneth over. Ours, by contrast, will seem half-empty.

A caveat: though we are skeptical that toxicogenomics will be able to deliver on all the manifold promises made on its behalf, we offer our skepticism not to be “antigene” or technophobic. Rather, as the information produced by toxicogenomics increases in volume and detail, scientists are wrestling with how to understand the meaning of the data; the data far outstrip our ability to interpret them. Should toxicogenomics proceed without resolving issues of validity and inference, for instance, it will promote a false sense of our understanding of environmentally induced disease, with important normative implications. Our claim is a modest one: critical conceptual analysis, and so the philosophy of science, is important to sound progress in environmental science and policy (Robert and Smith, 2004).

Why Toxicogenomics?

Some of the greatest threats to our health arise from exposure to environmental agents. Epidemiologists and environmental health researchers have tracked incidences of exposure, explored the nature of toxic chemicals and their effects on humans and other organisms, and, in some instances, offered remedies, including regulation of suspect (and guilty) toxins.
But with the rise of a molecular worldview in biology since the 1940s, and especially in the past two decades, our understanding of environmental influences on human health has come to be dominated by a focus on genes. At first, environmentally induced mutations were explored, as in the Department of Energy’s involvement in establishing the Human Genome Project (Beatty, 2000; Maienschein, 2003). More recently, attention has turned to environmentally sensitive polymorphisms, as in the Environmental Genome Project at the U.S. National Institute of Environmental Health Sciences (Olden and Wilson, 2000; Christiani et al., 2001; Robert and Smith, 2004).

The integration of genetics, and now of genomics, into toxicology has been encouraged in the pursuit of a more mechanistic toxicology. In the 1990s, the National Toxicology Program endorsed this goal and envisioned a future in which toxicology evolved from a descriptive to a predictive scientific enterprise (Goodman, 1994; Bucher and Portier, 2004). By and large, this has yet to be achieved, as toxicologists still grapple with how best to identify detailed mechanisms of toxicity and assess risk posed by chemical agents. As it is usually defined, toxicogenetics focuses on the identi...
