
Reviewed by Paul E. Ceruzzi
To the Digital Age: Research Labs, Start-Up Companies, and the Rise of MOS Technology. By Ross Knox Bassett. Baltimore: Johns Hopkins University Press, 2002. Pp. xii+421. $44.95.

A glance at recent nonfiction best-seller lists reveals a number of popular histories of technology, which follow the conceit that some relatively obscure invention (such as the Harrison chronometer) was responsible for changing the course of world history. After reading three or four of these, one begins to wonder if this is really the right approach, even if it does sell books and therefore ought to be good for our field. There is no need here to criticize these popular works, other than to state the obvious, namely, that an invention by itself, stripped of its social and cultural components, cannot drive history.

Ross Bassett does not make such an inflated claim in To the Digital Age, but in this thoroughly researched and detailed book he does reveal that much of the digitized world we inhabit today is in fact enabled by a specific type of silicon device, the metal-oxide-semiconductor, or MOS, transistor. It had its drawbacks, especially its slow switching speed, but it also carried real advantages, notably its ability to be fabricated at high densities and its parsimonious power requirements. To his credit, Bassett devotes as much attention to the social dimensions of this story as to the technical details of the invention and production of the MOS transistor itself.

We learn of the transistor's champions at various laboratories, including Bell Labs, RCA, and IBM, and of how they struggled against those who favored the existing paradigm, the bipolar transistor, which switched at faster speeds. We learn of the culture of Silicon Valley and how it differed from its East Coast counterparts, another topic about which much has been written, but never with the reliance on interviews and primary sources that Bassett's book features. Some of the information about Silicon Valley, such as the close relationship that Valley firms cultivated with marketing, I already knew about. Other information is new to me, especially that Stanford University was not much of an enabler of the Valley's initial success.

What is most provocative about this book is that the story of the MOS transistor challenges scholars' assumptions about the nature of research and development since the Second World War. That view, articulated in Vannevar Bush's Science, the Endless Frontier, gives primacy to basic research, whose practitioners need not always concern themselves with practical applications. The assumption is that the applications will come in due course, but to try to influence that trajectory is both unnecessary and counterproductive. In the story of MOS, Bassett portrays the IBM Corporation, and to a lesser extent RCA and Bell Labs, as subscribing to this belief to their own detriment, even though they accomplished significant work on MOS. For it was not at IBM, RCA, or Bell Labs that MOS was perfected and pushed out to the world; rather, it was at the Intel Corporation in Mountain View, California. At Intel there was no acceptance of the old assumption about research and development; indeed, there was no separate research laboratory at all.

The result was the empirical observation made by Intel cofounder Gordon Moore, and known after him as Moore's Law: since about 1960, the number of transistors that one can place on a chip of silicon has doubled about every eighteen months. By implication, the world of digital circuits is advancing at a rate unmatched by any other technology in history. Such a doubling simply could not have been possible with bipolar circuits, IBM's technology. It was MOS's scalability that made it so attractive to its early champions, even if few at first recognized that scalability, not switching speed, was the critical feature.

Moore's Law makes historians of technology uncomfortable, especially its implication that an increase in chip density, a technical criterion, equals "progress" in general. Acceptance of this premise implies a technological determinism that people seem to accept with little...
