
Apocalyptic AI: Visions of Heaven in Robotics, Artificial Intelligence, and Virtual Reality by Robert M. Geraci
Reviewed by Michael Graziano
Geraci, Robert M. Apocalyptic AI: Visions of Heaven in Robotics, Artificial Intelligence, and Virtual Reality. New York: Oxford University Press, 2010. x + 248 pp. $27.95 (USD). Hardback. ISBN: 9780195393026.

“To study intelligent robots is to study our culture,” writes Robert M. Geraci in Apocalyptic AI: Visions of Heaven in Robotics, Artificial Intelligence, and Virtual Reality (7). Investigating the porous boundary between science fiction and popular science, Geraci seeks to show how the categories of Christian and Jewish apocalypticism have been used in techno-scientific contexts, particularly by those studying or writing about intelligent robots. The result is what Geraci terms “Apocalyptic AI,” defined as “the presence of apocalyptic theology in popular science books on robotics and artificial intelligence” (1). Geraci argues that the category of Apocalyptic AI is “almost identical” to Christian and Jewish apocalyptic traditions (9). In his quest to understand the importance of Apocalyptic AI in American culture, Geraci engaged in ethnographic work at locales such as the Robotics Institute of Carnegie Mellon University, where the author interviewed roboticists, as well as the online game world Second Life, where the virtual inhabitants see their digital home as a “potential realm” for the realization of a “virtual future” (79).

Apocalyptic AI, according to Geraci, is a “particularly American” phenomenon (21). He draws on Paul Boyer’s work on the apocalypse to show how scientists, particularly since World War II, have come to “imagine the world in apocalyptic terms” (21). Geraci also uses Catherine Albanese’s work to situate apocalyptic robots within a genealogy of nineteenth-century American apocalyptic traditions. He traces this genealogy into the early twenty-first century, reviewing how Christian theologians and computer scientists grapple with the moral, ethical, and legal implications of robotics research.

Geraci is keen to make a methodological intervention in the study of religion and science. In one of the stronger passages of the book, he chides the sociologists, anthropologists, and historians who study science without considering religion as part of the broader social context in which scientific research occurs. Religious studies, in Geraci’s view, is likewise remiss for not engaging more productively with science and technology studies, which he sees as an ample source of data. To Geraci’s credit, Apocalyptic AI takes great pains to approach its subject from both vantage points, in service of his larger point that doing so “simply reiterates the powerful ways in which techno-scientific culture remains, first and foremost, human culture” (44).

The author is at his best when he demonstrates how popular science and science fiction act as mediating forces between scientific research and laypeople. This act of mediation helps to explain the strategic appeals to apocalypticism and Christian theology that Geraci painstakingly documents. This attention allows the author to reinforce one of his central arguments: that “pop science in general, and Apocalyptic AI in particular, is a—sometimes conscious, sometimes unconscious—strategy for the acquisition of cultural prestige, especially as such prestige is measured in financial support” (3). Scholars interested in the economic underpinnings of these fields will find Apocalyptic AI worth their time, particularly the appendix “In Defense of Robotics,” which situates Apocalyptic AI within the larger context of the contemporary defense industry.

Geraci’s argument would have been strengthened by closer attention to the construction of the category “religion” in the techno-scientific contexts he investigates. For example, his use of Bruce Lincoln’s work on authority could have been more fully developed. Lincoln does not argue that the sacred “grounds” ultimate authority, as Geraci suggests. Rather, Lincoln argues that authority structures often rely on strategic appeals to transcendence, which are then used to create and maintain categories such as the “sacred” in specific historical moments. The book would also have benefited from attention to which components of religion Apocalyptic AI enthusiasts take as self-evidently sacred, as this would allow Geraci to bolster his argument that Apocalyptic AI is an extension of long-running...
