
# Information Entropy in Pynchon’s Fiction

Under “Courses in Radio and Communications,” the 1953–54 catalog of the Cornell College of Engineering described an advanced topic, “Transmission of Information,” that might have interested a bright engineering physics student named Thomas Pynchon, matriculating that autumn. “Transmission of Information,” course 4564 in the electrical engineering department, anticipated concerns about communications raised by the kind of fiction that Pynchon began to write before he graduated from Cornell, as an English major, in 1959. The course description contains a word much examined in Pynchon studies, “entropy,” used here not in its more familiar thermodynamic context:

> This course [4564] deals with the general aspects of a transmission system, which consist of the source of information, the transmitter, the channel, the receiver, and the final destination of the message. The definition of information and a quantitative measure of information are given. The statistical properties of the source, its entropy, and the rate at which information is produced by the source are discussed. The transmission of primary signal functions into secondary signal functions at the transmitter, the capacity of the channel to transmit the secondary signal function in the presence of channel noise, and the possibilities of recovering the primary signal function at the receiver are studied. The over-all performance of transmission is discussed as to fidelity considerations and the effective rate of transmission. These principles are applied to pulse-code modulation as an example of modern transmission of information. 1

In this essay I shall argue that “the general aspects of a transmission system” apply to our reading of Pynchon as well as to our listening to an electronic system like a radio. Central to designing a transmission system that communicates information efficiently is the concept of entropy as developed in information theory, especially by Claude Shannon. Entropy, as Shannon defined it, is closely connected to many issues about the communication of information—especially “fidelity considerations and the effective rate of transmission”—raised both by trying to understand information theory and by trying to read Pynchon. 2
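For readers who want the “quantitative measure of information” the course description alludes to, Shannon’s standard definition of source entropy may be stated briefly; it is supplied here for reference and is not quoted from the essay or the Cornell catalog. For a source emitting symbols with probabilities $p_1, \ldots, p_n$:

```latex
H = -\sum_{i=1}^{n} p_i \log_2 p_i \qquad \text{(bits per symbol)}
```

The less predictable the source, the higher its entropy, and the more information, on average, each symbol carries: a point with obvious resonance for a novelist whose fictions dwell on noise, redundancy, and the limits of decoding.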

It is most unlikely that Pynchon actually took course 4564, “Transmission of Information.” As a first-year student, he had a tightly prescribed curriculum in physics, mathematics, English, and drafting, and his one elective was in astronomy. A four-thousand-level course would have been reserved for advanced students, and Pynchon switched majors from engineering physics to literature after his first year. 3

Clearly it would be perilous to argue that taking such a course as “Transmission of Information” was the only possible mode of learning about entropy available to Pynchon. His own omnivorous curiosity and reading, as well as his sense of play in using recondite concepts, doubtless provided ample stimulation for the exploration of entropy in the various guises discernible in his texts. My analysis in this essay attempts to point to similitudes and congruities between entropy as used in communications theory and the narrative lines and devices in Pynchon’s fictions that explicitly draw attention to the act of communicating. Whether Pynchon’s impetus for adumbrating concepts about information entropy derived from recalling formal sources like Shannon, from just having fun, or from a combination of both, Pynchon’s readers will doubtless agree that his years at Cornell (including the composition of his first stories) prepared him well to succeed in the “modern transmission of information.”

## I

The program that attracted the sixteen-year-old scholarship student to Cornell was in a new topic, “Engineering Physics.” This conjunction of physics with engineering responded to the post-World War II recognition that progress in engineering depended on fundamental research in physics and the sophisticated mathematical treatment of such results, as contrasted to building empirically on conventional rule-of-thumb practice. One of the new sciences born of the rich research funding of the war was cybernetics, a discipline heralded by Norbert Wiener (who recoined the name) to designate a field soon in turn to split into a myriad of new applications and specialties in the postwar science boom. From the work of Wiener, Claude Shannon, John von Neumann, and others came the development of modern information...

### Additional Information

- ISSN: 1080-6520
- Print ISSN: 1063-1801
- Pages: 185–214
- Launched on MUSE: 1996-05-01
- Open Access: No
