
2 An Evolution of Risk

Why Social Science Is Needed to Understand Risk

State a moral case to a ploughman and a professor. The former will decide it as well and often better than the latter because he has not been led astray by artificial rules.
—Thomas Jefferson, letter to Peter Carr, August 10, 1787

(This chapter is a revised and updated version of Renn 2008c: 53–66.)

The Risk Crises of 1986

The year 2012 marked the twenty-fifth anniversary of three major technical disasters: the Chernobyl catastrophe, the Challenger accident, and the pollution of the Rhine River after a fire destroyed a chemical storage building in Basel, Switzerland. These three events had lasting repercussions on public opinion. Even before 1986, many surveys in the United States, Canada, and most of Europe had shown that the public held an ambivalent view of the opportunities and the risks of large technological systems (Covello 1983; Gould et al. 1988; Lee 1998; Slovic 1987). Risk perception studies and investigations of popular attitudes toward technologies showed that people were concerned about the environmental and health-related impacts of large-scale technology but, at the same time, assigned a fair degree of trustworthiness to the technical and political elite. Although trust had been eroding since the nuclear accident at Three Mile Island in 1979 and the continuing debate over nuclear waste, at least in the United States (Bella, Mosher, and Calvo 1988; Kasperson, Golding, and Kasperson 1999; Rosa and Clark 1999), most Americans and Europeans were convinced that large-scale technology such as nuclear power or waste incinerators was a necessary, but highly unwanted, manifestation of modernity. Furthermore, opinion polls provided evidence that the “culture of experts” was credited with technological competence and know-how but fared less well on human concerns and moral issues (Barke and Jenkins-Smith 1993; Otway and von Winterfeldt 1982).
Ecologists and technology critics, by contrast, were seen as sincere and brave underdogs with convincing arguments, even if they lacked real technical knowledge. The lasting public image pitted the rationality of the science and technology expert against the morality of the ecologist. Technological experts seemed to have the stronger public support, and the technical elite certainly seemed to dominate official policy. Their risk assessments provided sufficient “objective” reassurance that the intuitive perception of immanent threats by critics was unwarranted. The technical elite not only were able to reassure the public that design criteria and risk-management practices would be sufficient to contain the catastrophic potential of large-scale technology; they also were successful in convincing governments and public management agencies that the technology had a legitimate role to play in modern society. Nuclear power and similar large-scale technology offered many benefits to society. The official line was that, as long as the risk of a major catastrophe was small, society had to accept the risk. Despite a large number of initiatives against the highly unpopular nuclear industry, persistent protests against the building of new chemical plants and the expansion of airports, and new alternative movements springing up across many landscapes, the movers and shakers in the technological elite were able to influence Conservative, Liberal, and Social Democratic parties in all Western countries. In Germany, more and more nuclear power stations came into operation; in Switzerland, all referenda before 1986 decided in favor of keeping nuclear power stations in operation; and in Sweden, a referendum decided in favor of a limited-term operation of the existing nuclear power stations. Other European countries slowed the pace of developing this unpopular technology, but on the whole there was no sign of a moratorium, still less of any political U-turn.
This picture changed dramatically after the three disasters in 1986. Supporters of large-scale technology were on the defensive, while skeptics began to define a new way to think about risk. The new thinking pointed out that “objective” estimates of risk were not so objective after all; many of the facts underpinning risk assessments did not speak for themselves, and qualitative features of risk important to citizens escaped the reductionistic calculus of objective risk. Now the experts were taken to task not only for lacking morality, but also for lacking rationality. An immediate consequence of this was that virtually all European countries, with the exception of France, deferred the development of nuclear energy. In Germany, after long and acrimonious arguments, the project of reprocessing nuclear waste was completely abandoned. Later, the new government of 1998...
