Charles Babbage and the Politics of Computer Memory

As we have seen, the dialectic of technological determination is both enabling and disempowering. It clears space to imagine wild visions of the future. But it closes off our ability to question our present options, since the future is presumed to be the inevitable result of inexorable technological progress. And it impoverishes our understanding of the past, robbing it of any sense of contingency. What happened had to happen, since it did happen. This bargain has been good enough for technotopians from Edward Bellamy to Isaac Asimov (author of the Foundation series, which introduced the fictional predictive science of “psychohistory”) to Gene Roddenberry (creator of Star Trek) to Louis Rossetto (founder of Wired magazine). But for some thinkers, the trade-off isn’t worth it. Looking more closely at the history of computing, these skeptics notice the odd turns and dead ends that give the lie to the grand narrative of technological determinism. This chapter will look at the struggle between the determinist mainstream and the critical margins to define the historical memory of computing. It will focus on the contested legacy of the so-called “father of computing,” Charles Babbage.

Before we get to Babbage, though, we’ll need a little background. Let’s start with a question: what is a “computer”? The answer depends on how you define “computer.” The term was originally used to label not machines, but people. For most of the past three centuries, a computer meant “one who computes,” according to the Oxford English Dictionary, which traces this usage as far back as 1646.1 Scientists engaged in large-scale projects involving many calculations, such as the computation of navigation tables, would hire rooms full of human “computers”—usually women—to crunch their numbers.2 It was not until the 1940s, when new kinds of flexible calculating machines began to replace people for these large-scale projects, that the connotations of the word began to shift, as engineers labeled their new devices “computers.” Even so, through the 1940s and 1950s, popular discourse more often referred to the machines as “giant brains,” “electronic brains,” or “mechanical brains.” It wasn’t until the 1960s that “computer” became standard usage. While a term such as “giant brains” may strike us today as a rather garish anthropomorphism, note that the seemingly more neutral term “computer” itself has its origins in anthropomorphism.

One version of the history of computing, then, is the story of computing as a process for the large-scale production and organization of information—a process performed sometimes by people, sometimes by machines. A second, more familiar version is the story of the computer as a mechanical calculating device. This chronology takes us from the abacus and other counting devices of the ancient world to the mechanical adding machines first developed in the seventeenth century, which used gears and levers to perform arithmetic. These two strands of computing history—computer as large-scale information processor, and computer as mechanical device—first came together in the work of a nineteenth-century British inventor named Charles Babbage.

Babbage’s Engines

Babbage began his first project, the “difference engine,” in the 1820s.
A massive, steam-powered calculating machine and printer, it was designed to mechanize the process of computation and table-making, just as other inventions of the Industrial Revolution were mechanizing other labor processes. The British government invested a total of 17,000 pounds in his research; Babbage is estimated to have spent an equal amount of his own money. In 1833 Babbage produced a small-scale prototype that clearly demonstrated that the completed machine could work. But before Babbage could finish his machine, he was distracted by a new, more complex project. He never completed his difference engine.

Babbage’s new idea was an “analytical engine.” Rather than being hard-wired to perform specific tasks, it was designed to be “a machine of the most general nature.” Inspired by the Jacquard loom, Babbage came up with the idea of using a series of punched cards to input information into his machine. The cards would contain not only the raw numbers to be processed, but also logically coded instructions on how to process them. Input numbers could be held in the “store,” a series of 1,000 registers, each capable of storing one 50-digit...
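For readers who want a concrete sense of what mechanized table-making involved, the difference engine rested on the method of finite differences, which reduces the tabulation of a polynomial to repeated addition. The excerpt itself does not spell this out, so the short Python sketch below is an illustrative aside rather than anything from the book; the function name and example polynomial are invented for the demonstration.

```python
# Minimal sketch of the method of finite differences, the principle behind
# Babbage's difference engine: once the initial differences of a polynomial
# are set, every further table entry is produced by addition alone.
# (Illustrative only; names and the example polynomial are not from the book.)

def tabulate(initial_differences, steps):
    """Return successive polynomial values from their initial differences.

    initial_differences[0] is the starting value; each later entry is the
    next-order difference, the last one constant. Only addition is used,
    which is what made the scheme suitable for gears and levers.
    """
    diffs = list(initial_differences)
    table = []
    for _ in range(steps):
        table.append(diffs[0])
        # Cascade the additions: each difference absorbs the one below it.
        for i in range(len(diffs) - 1):
            diffs[i] += diffs[i + 1]
    return table

# Example: f(x) = x^2 for x = 0, 1, 2, ...
# Values 0, 1, 4 give first difference 1 and constant second difference 2.
print(tabulate([0, 1, 2], 10))  # [0, 1, 4, 9, 16, 25, 36, 49, 64, 81]
```

The printed table of squares is produced without a single multiplication, which is why a machine capable only of adding and carrying digits could, in principle, print reliable mathematical tables.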
