Since the mid-1960s, American science has undergone significant changes in the way it is organized, funded, and practiced. These changes include the decline of basic research by corporations; a new orientation toward the short-term and the commercial, with pressure on universities and government labs to participate in the market; and the promotion of interdisciplinarity. In this book, Cyrus Mody argues that the changes in American science that began in the 1960s co-evolved with and were shaped by the needs of the "civilianized" US semiconductor industry. In 1965, Gordon Moore declared that the most profitable number of circuit components that can be crammed onto a single silicon chip doubles every year. Mody views "Moore's Law" less as prediction than as self-fulfilling prophecy, pointing to the enormous investments of capital, people, and institutions the semiconductor industry required -- the "long arm" of Moore's Law that helped shape all of science. Mody offers a series of case studies in microelectronics that illustrate the reach of Moore's Law. He describes the pressures on Stanford University's electrical engineers during the Vietnam era, IBM's exploration of alternatives to semiconductor technology, the emergence of consortia to integrate research across disciplines and universities, and the interwoven development of the molecular electronics community and associated academic institutions as the vision of a molecular computer informed the restructuring of research programs.