
13. Blame and Shame? How Can We Reduce Unproductive Animal Experimentation?
Anne Innis Dagg

Biomedical scientists continue to insist that animal experimentation is essential to progress in combating illness.1 Yet each year, as many millions of animals suffer and die around the world in biomedical experimentation,2 there are only a few important medical discoveries: most experiments do little to improve human health. It is therefore worthwhile to examine how the number of animals used can be reduced without reducing important biomedical findings.

We should note first that thousands of experiments involving myriad animals will certainly be of no use to science, because their results have never been published (nor will anyone beyond the small research community even know that such animals died in the name of science), for the following reasons:

a. Some experiments go so badly that they are never written up: the animals escape, the chemicals are mislabelled, the equipment malfunctions, et cetera.
b. Some experiments produce negative results (such as animals being given cancer lesions only to be treated with chemicals that do not affect the lesions), and for this reason the researcher cannot find a publisher.
c. Some small journals, especially new ones that have not yet “proved” themselves, may not be indexed in the Web of Science, so few scientists know they exist.
d. Many articles are deemed unacceptable for publication; some biological journals have rejected half of all submitted articles.3

Of published research, some papers receive many citations from other scientists and influence future research, while others garner few or no citations and have little influence. This latter group is the focus of this essay. Whether a research paper is valuable can be roughly measured by the number of citations it receives in the years following its publication, since citations indicate that subsequent scientists found the work useful for their own research.
The electronic database Web of Science, available at large universities, indexes bibliographic references from articles published in over 8,700 academic journals and can be searched online. Many papers are judged to be of no scientific worth because no subsequent scientist has cited them, but how does one make scientists in general care about this? They can argue that no one knows in advance how an experiment will turn out, so each one is justified just in case it produces something worthwhile. Some research is carried out without any real theory to guide it: for example, simply injecting carcinogens into mice to produce cancerous lesions and then injecting other chemicals to see whether any of them reduce the size of the lesions.

My weapon of choice to fight such waste of animal lives has been to carry out four research projects on citation analysis to try to shame scientists, as this article will detail. My first two papers examined experiments and papers published in psychological and behavioural/neurological journals, because I felt it was especially bad to harm and kill animals in a discipline that was often peripheral to human health.4 My third study concerned cancer research; cancer experiments tend to be exceptionally invasive and painful, involving giving animals cancer before trying to eradicate it by various invasive means, so I hoped that scientists might be willing to act on my findings on compassionate grounds. As well, huge amounts of money are collected from the public and spent on cancer research: $2 billion a year in the United States and many millions in Canada. My fourth study examined researchers at a large research hospital, the Hospital for Sick Children in Toronto, in the hope that the institution would be too embarrassed to continue supporting the research of scientists with little competence.
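The citation analysis these studies rely on is, at bottom, a counting exercise: for each paper, tally the citing articles that appear within a fixed window after publication and flag the papers with none. A minimal sketch in Python follows; the function name and the sample records are invented for illustration, since an actual study would draw citing years from a Web of Science export rather than hand-typed data.

```python
# Sketch of a citation-window analysis of the kind described above.
# All data below are hypothetical; a real study would use exported
# Web of Science records.

def citations_in_window(pub_year, citing_years, window=7):
    """Count citations falling within `window` years after publication."""
    return sum(1 for y in citing_years if pub_year < y <= pub_year + window)

# Hypothetical records: (paper id, publication year, years of citing articles)
papers = [
    ("A", 1989, [1990, 1991, 1991, 1993]),
    ("B", 1989, []),            # never cited at all
    ("C", 1990, [1999, 2001]),  # cited only outside the seven-year window
]

# Papers with no citations in the seven years after publication
uncited = [pid for pid, year, cites in papers
           if citations_in_window(year, cites) == 0]
print(uncited)  # prints ['B', 'C']
```

The window matters: as the essay notes, citations typically peak two to three years after publication, so a seven-year window is a generous test of whether a paper influenced anyone.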
In the first study, completed in 1998, I read and analyzed 115 articles in five psychological or neurological journals published in 1989 and 1990.5 First, I checked the Web of Science for the seven years following each paper's date of publication to determine the number of subsequent articles that cited it. (This database is now online, but at the time I had to pay a librarian a dollar for each datum.) Citations tend to peak in the second and third years following publication, although they may continue to be notable for six years6 and in a few rare cases may continue strongly for decades.7 (Citation numbers are important but not perfect. Some citations for...

