Grounding Digital History in the History of Computing
The term “digital history” refers to any use of computers in the creation, enhancement, or presentation of historical scholarship, whether academic or not.1–4 Some of its practitioners believe that it constitutes an independent subdiscipline; others believe it will eventually be subsumed into the whole, as historians of every stripe are forced to confront the fact that the sources that they depend on have become digital. Whatever its eventual fate, digital history currently lacks an established historiography. In this brief article, we use the relatively recent publication of a number of books on the subject as a launching point to argue that digital history should be grounded in the history of computing. This, then, is not meant to be a comprehensive review. Nor can it be objective because we count many of the authors discussed here among our friends and have collaborated with a number of them on various research projects. Instead, our hope is to bring together two endeavors in which we already feel we have some stake, for their mutual benefit and our own.
Explosion of Content
One of the most compelling arguments for the adoption of digital methods in historical research and teaching is the shift identified by the late Roy Rosenzweig from a “culture of scarcity to a culture of abundance.”5 Over the last few decades we’ve seen exponential growth in the number of digitized sources that can be readily accessed. Setting aside for a moment the vexing issue of content behind paywalls, these include published books and serials, unpublished archival materials, and textual and nontextual sources. In addition, the vast majority of new information is now born digital, and social media multiplies the potential connections by which things can come to our attention, or overwhelm it. Even the most cursory search on any topic yields more than anyone could be expected to absorb in a reasonable amount of time. As Rosenzweig put it, we now have “to write history … faced by an essentially complete historical record.” Other digital historians agree with his assessment. T. Mills Kelly, for example, suggests that history students need to be taught to “mak[e] sense of a million sources.”6 Dan Cohen goes so far as to define digital history “as the theory and practice of bringing technology to bear on the abundance we now confront.”2
To the historian of computing, this glut, this mind-boggling excess, is a straightforward corollary of Moore’s law. Since the 1960s, the number of transistors that can be placed on an integrated circuit has been doubling regularly. The speed and power of computers, the density of digital storage, and the number of devices and their interconnections have increased apace, while device size and cost have fallen precipitously. Paul Ceruzzi has argued that this is an instance of “raw technological determinism,” a self-fulfilling prophecy on the part of the semiconductor manufacturers, and a trend that has seemed “impervious to social, economic, or political contexts.”7,8 Population explosion and global acceleration of change followed in the wake of these electronic innovations. This necessarily affects the way that we work. As Cohen noted in a 2011 talk, a historian of the Johnson administration may be able to read and analyze 40,000 White House memos, but no historian of the Clinton administration will ever do the same with 4 million emails.9 Increasingly, text mining (and image, audio, and video mining) will be the only way to make sense of the recent past and our contemporary involvement with its consequences.
The changing source base will also expand the ways that we think about the past. “Despite the apparently revolutionary nature of the tool,” David J. Staley wrote in a book about visualization, “most historians use computers conservatively: to laterally transfer textual culture from paper to screen.”10 However, the ubiquity of smartphones ensures that the bulk of born-digital cultural materials are not textual: every few days, a billion images are uploaded to Facebook alone.11 Staley argues that in engaging with visual sources, historians will come to value “simultaneity...