Peter J. Denning

[Photo of Peter J. Denning, courtesy of Louis Fabian Bachrach.]

A leading scientist in computing since his graduation from the Massachusetts Institute of Technology in 1968, Peter J. Denning is best known for his pioneering work in virtual memory, especially for inventing the working-set model of program behavior, which eliminated thrashing in operating systems and became the reference standard for memory management policies. He is also known for his work on the principles of operating systems, operational analysis of queueing network systems, the design and implementation of the Computer Science Network (CSNET) and the ACM Digital Library, and the codification of the principles of computing. A primary goal of Denning's career has always been promoting the science in computer science through education, research, and attention to the general health of the field.
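The working-set idea mentioned above can be illustrated with a small sketch. A program's working set W(t, τ) is the set of pages it referenced during the last τ references before time t; a system avoids thrashing by running a program only when its working set fits in memory. The reference trace, window size, and function below are illustrative assumptions, not Denning's original notation:

```python
def working_set(refs, t, tau):
    """Return W(t, tau): the set of distinct pages referenced in the
    window of the last tau references ending at virtual time t."""
    start = max(0, t - tau)
    return set(refs[start:t])

# Hypothetical page-reference trace
trace = ["A", "B", "A", "C", "B", "D", "A"]

# At time 5 with window 3, the last three references are A, C, B,
# so the working set is {"A", "B", "C"}
ws = working_set(trace, 5, 3)
print(ws)
```

The window τ trades off memory use against fault rate: a larger τ keeps more pages resident and lowers the fault rate, at the cost of admitting fewer programs into memory at once.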

David Walden:

Please tell me a bit about your early life.

Peter J. Denning:

I had interests in math, science, and nature from a young age. At school I was too small to be any good at athletics, which were socially popular, so I devoted myself completely to academics, which were not.

By age 12 I had developed an interest in magicians' performances, especially those that depended on mathematical tricks. By age 13 I had discovered a deep fascination with electricity and electronics, which seemed to have a magic all their own.

My parents sent me to Fairfield Prep in 1956 to get me into an intellectual community and out of the athletics-infatuated public school culture. Under the wing of a gifted science teacher, I entered three science fairs with computers made of pinball parts and vacuum tubes—one to compute sums, one to solve linear equations, and the last to solve cubic equations. The second computer won the science fair. The third computer worked perfectly but fared poorly at the fair because I paid no attention to marketing and presentation—a valuable life lesson.

From Fairfield Prep, I went to Manhattan College in 1960 to study electrical engineering. Although short on computing, its curriculum gave me a solid grounding in practical engineering: the building and testing of things people could use.

I graduated at the top of my class at Manhattan in 1964 and won a National Science Foundation fellowship good at any graduate school. I applied to MIT, following my father's advice (he had wanted me to attend MIT rather than Manhattan).

Walden:

Say a bit about MIT.

Denning:

MIT had a completely different philosophy from Manhattan about the principles and organization of electrical engineering. To prepare for the PhD exams at the end of the first year, I took all the MIT EE core courses in addition to my required master's courses. That intense preparation was barely enough: with the help of my master's thesis advisor, Jack Dennis, who took me under his wing, I passed the PhD qualifiers on the second try. He and I have had a long and productive friendship for almost 50 years.

My master's thesis was about scheduling requests for a rotating disk or drum memory so as to minimize mean access time, a critical issue for an experimental time-sharing system Jack Dennis had been developing. During that year, I worked closely with Allan Scherr, who taught me about systems programming, language design, compiling, data collection in an OS kernel, discrete simulation, and queueing theory. Through the thesis, Jack and I showed that shortest-latency-time disk scheduling was optimal for time-sharing systems.
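The shortest-latency-time idea can be sketched for a fixed-head drum: among the pending requests, always serve the one the drum will rotate past soonest. The angular positions, revolution length, and function name below are hypothetical illustrations, not the thesis's actual formulation:

```python
def sltf_order(current_angle, requests, revolution=1.0):
    """Order drum requests by shortest-latency-time-first.

    Each request is an angular position in [0, revolution); the drum
    rotates in one direction, so the latency to reach a request is the
    forward angular distance from the current head position."""
    pending = list(requests)
    order = []
    angle = current_angle
    while pending:
        # Forward rotational distance from the current angle, wrapping
        # around at one full revolution.
        nxt = min(pending, key=lambda r: (r - angle) % revolution)
        order.append(nxt)
        angle = nxt
        pending.remove(nxt)
    return order

# From angle 0.0, requests at 0.5, 0.2, and 0.9 of a revolution are
# served in the order they pass under the head: [0.2, 0.5, 0.9]
print(sltf_order(0.0, [0.5, 0.2, 0.9]))
```

The greedy choice works here because rotational latency, unlike seek distance on a moving-arm disk, wraps around uniformly: serving the nearest request never increases the rotational distance to any other pending request.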

On passing my PhD qualifiers in the spring of 1966, I decided to tackle a much tougher resource allocation problem, which was looming in the design of Multics. The problem was how to build a stable computing system from multiprocess computations, which could have large variations in their processor and memory demands. I had to learn how to measure the demands of multiprocess computations, configure a system with appropriate capacity for the demand, and manage the allocation of CPU and memory dynamically. Jerry Saltzer told me of thrashing, a major instability they were encountering with multiprogrammed virtual memory systems, and challenged me to find a solution. That solution turned out to be much harder than either of us imagined. My quest...
