  • Rethinking Why and How Organizations Acquire Information Technologies, Part 1: Conventional Explanations and Their Limits
  • James W. Cortada

For over a half-century, those observing how computing spread generally accepted a common set of explanations for why this happened. These reasons also proved useful in explaining why the use of information technologies (IT) diffused so quickly. However, as growing numbers of historians, economists, management experts, and political scientists witnessed the unfolding of this enormous worldwide diffusion of IT in recent years, cracks began to appear in these consensus understandings. The cracks did not reflect a rejection of conventional views but rather a recognition that these views were only partially right and in need of refinement. In the narrower space of historical research on the evolution of computing, we have gone from "computing pioneers," journalists, and a handful of academic historians studying IT in the 1970s and 1980s to several hundred historians today, joined by an equal number of journalists and other interested parties. As the results of their research accumulate, their work raises two questions: Are our basic assumptions about the diffusion and appropriation of IT still valid? Or are we due for a revision reflecting the results of a new generation of historical research? The answer to both questions is yes. A further question follows: What does the experience with computing teach us about potential developments in related technologies, particularly those with embedded computing, such as manufacturing robotics, entertainment systems, and vehicles? Are they subject to the same changing circumstances as computing?

The purpose of this two-part essay is to examine how a comfortable paradigm for explaining the spread of a major technology began cracking, shaken by new findings. I do this, first, by summarizing in Part 1 the familiar explanations for the diffusion of computing, since much recent historical research still depends on them. I then describe the problems—the cracks—appearing in these assumptions. In Part 2, I propose a modification of how to view IT's diffusion—the why and how—an update more in line with current historiography. The conclusion of this essay identifies implications for research on other technologies, especially those that emerged during the second industrial revolution and are still unfolding, many of which are morphing into quasi-computers themselves.1

CONVENTIONAL EXPLANATIONS

For a half-century, the explanation for why people, organizations, industries, indeed whole societies appropriated IT remained fairly constant, involving three lines of thinking: the declining cost of computing, improvements in the capabilities (functions) of the technology, and its increased reliability. Historians integrated these three explanations into their own work. The first—declining costs of computing—was the handiwork of economists; the second—improving technologies—largely that of computer pioneers; and the third—reliability—the least studied, resting on the reflections of users and experts on business management. General histories of computing, from Michael Williams, through the early work of Paul Ceruzzi and James W. Cortada, to Ceruzzi's later work and the recent edition of Computer (2014) by Martin Campbell-Kelly, William Aspray, Nathan Ensmenger, and Jeffrey R. Yost, have continued to honor these three lines of argumentation to one degree or another.2 Proponents of these arguments who were not historians have been relatively consistent over time as well, which may help explain the lack of sharp differences of opinion one might expect as a new general technology emerged, morphed, and spread rapidly, essentially in one generation.

Authors of the economic perspective held that computing dropped in cost so dramatically that organizations and individuals could increasingly afford to acquire it, and that it proved less expensive than earlier technologies (e.g., desktop calculators) or than the salaries of the people it replaced. Prices, economists argued in the case of the United States, "have fallen greatly, declining at an average annual pace of 15.1% between 1970 and 1994, compared with an average increase in the overall price level of 5.2% a year during the same period."3 That statement was published in 1997. But price data reach back to 1953, and between then and the early 2000s economists calculated that computing enjoyed annual price declines of 15% to 30%, in some estimates by as much as 25% to 45% a year.4...
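To give a rough sense of what such rates imply when compounded (an illustrative back-of-the-envelope calculation of mine, not a figure drawn from the sources cited above), an average annual decline of 15.1% sustained over the twenty-four years from 1970 to 1994 would leave only about 2% of the starting price:

$$(1 - 0.151)^{24} \approx 0.02,$$

that is, a cumulative price decline of roughly 98%.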
