
Logic, Code, and the History of Programming
Mark Priestley

A striking feature of the debates around the perceived software crisis in the 1960s and 1970s is the frank contempt expressed by some elite computer scientists for much work in the fields of programming and programming language design. The writings of computer scientist Edsger Dijkstra are a familiar source of such material: in his Turing Award lecture, he opined that "the sooner we can forget that FORTRAN ever existed the better" and likened an advocate of the PL/I language to a drug addict [1]. In a slightly more restrained register, John Backus (another Turing Award winner) used his acceptance speech to denounce existing languages as "fat and flabby" [2]. Dijkstra's contempt for the tools of his trade easily slipped into contempt for their users. For example, he described software engineering as the "doomed discipline" whose charter is "how to program if you cannot," and BASIC programmers as "mentally mutilated beyond hope of regeneration" [3].

Such comments often frame visions of different styles of language and approaches to programming. Backus' critique was a prologue to a presentation of a new system of functional programming, and Dijkstra was a career-long advocate of small languages and a rigorous approach to program development. Both represent a tradition within computer science that sees programming as an unruly and uncontrollable activity that requires disciplining [4]. In the broadest terms, this tradition aims to subordinate programming to the logico-mathematical activities of axiomatization and proof. Ideally, one should program by writing a formal specification of a problem and then formally deriving code from this specification. This is a perfectly reasonable research program within computer science, of course, but also an ideal that characterizes only a tiny fragment of the programming activity that has taken place since the computer was invented.
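The flavor of this ideal can be conveyed by a minimal illustration in the guarded-command notation that Dijkstra favored (a standard textbook example, not one drawn from the works cited here). Suppose the specification asks for a program that stores in x the maximum of a and b, expressed as the postcondition (x = a ∨ x = b) ∧ x ≥ a ∧ x ≥ b. Working backwards from this condition, one calculates the guards under which each candidate assignment establishes it, arriving at the program

   if a ≥ b → x := a
    □ b ≥ a → x := b
   fi

Here proof and code are developed together: the program is correct by construction rather than by after-the-fact testing.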

In this essay, I argue that, rather than seeing coding as subordinate to logic, we should understand both as instances of the more general activity of working with formal symbolic notations [5]. From this perspective, programming appears as an autonomous activity, and I conclude by arguing that an appreciation of this autonomy is necessary for writing an adequate history of programming.

THE SURPRISING DIFFICULTY OF PROGRAMMING

At the moment of the emergence of the automatic high-speed general-purpose digital computer, both Alan Turing and John von Neumann characterized programming as a new form of logic [6]. This highlighted the distinction between the parts of the machine that carried out arithmetical operations and the parts that dealt with the sequencing of those operations, often referred to as the "logical control." The usage also appeared to situate coding within the intellectual space opened up by the development of symbolic logic in the early twentieth century. However, as computer pioneer Arthur Burks pointed out in 1950, traditional logic dealt only with declarative sentences that could be true or false, whereas machine-language programs were made up of imperatives [7]. To Burks, the relevance of logic to programming lay not in its account of validity and proof, but rather in its detailed analysis of the syntax of formal languages.
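The distinction is easily illustrated (with examples of my own rather than Burks's). A declarative formula such as ∀x (x + 1 > x) asserts something that is either true or false; an order such as "add the number in storage location 100 to the accumulator" asserts nothing and can only be obeyed or disobeyed. What the two share is that each is built from a fixed vocabulary according to precise formation rules, and it was in this syntactic regularity that Burks saw logic's contribution to the new activity of coding.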

Turing and von Neumann also saw a need to develop a technique of programming, at a time when the idea that programming might be difficult came as a surprise to many [8]. The ENIAC developers recognized that new numerical methods would be needed for high-speed calculation, but they defined the sequences of operations that ENIAC should execute in a format very similar to that used by Charles Babbage and Ada Lovelace a century before [9]. In common with other workers, such as Howard Aiken's group at Harvard, they do not seem to have considered that writing instructions for an automatic machine would be a problem. For many years, large-scale manual computation had utilized a division of labor between those who planned the work and those who carried it out by performing simple arithmetical operations and filling in boxes in highly structured computation sheets [10]. It seemed a straightforward task to replace the latter group by the new machines.

However, when the renowned mathematician and expert calculator Douglas Hartree set up the solution to...
