Networked Computers

The advent of networked computers signaled a remarkable expansion of the cryptographic field, as the financial and governmental sectors began to assess their needs for secure electronic transactions. In the early 1970s, IBM researcher Horst Feistel—at the time one of the few private-sector cryptographic researchers—began work on an algorithm that would, he argued, safeguard individual privacy in the coming computer society. Feistel proposed that ciphers could be used to encrypt databanks in order to guard their contents from anyone but authorized individuals. Feistel presented these ideas and the design principles of his Lucifer cipher in a 1973 Scientific American article, at the time one of the most explicit discussions of modern cryptographic principles ever presented to the American public.47

The design of Lucifer was remarkably simple: like the Vernam cipher, it worked directly with the binary representation of messages: “We shall tacitly assume that the messages we wish to handle cryptographically have first been translated into a sequence of binary digits. Any kind of information, be it letters, musical sounds, or television signals, can be represented by binary coding. We shall no longer worry about the original meaning of the messages. We shall deal only with their binary representation.”48

It relied on Shannon’s suggestion in Communication Theory of Secrecy Systems that strong and practical ciphers could be constructed from two simple methods—diffusion and confusion—which, when combined, made any statistical (i.e., frequency) analysis of the ciphertext difficult. Diffusion dissipates the redundancy found in natural languages by spreading it throughout the ciphertext.49 A simple permutation of the letters of the plaintext, for example “hello” to “lohel,” breaks the frequencies of bigrams and trigrams. Confusion makes the relation between the key and the ciphertext as involved as possible, so that statistical analysis of the ciphertext will not yield much information about the key. Mono- and polyalphabetic substitution are both methods of achieving confusion. Permutation and substitution are inadequate methods of encryption by themselves, but Shannon showed that their combination yields practical ciphers of great speed and resilience.

The Lucifer ciphering process consisted precisely in the interleaving of successive operations of diffusion and confusion. The algorithm performs sixteen rounds of such operations on the plaintext, sixty-four bits at a time. Because they are performed directly at the binary level, each operation requires only the simplest hardware to implement, yet their composition yields a cipher of remarkable strength while achieving speeds yet to be matched by any cipher based on algebraic properties.50

If Feistel’s goal of protecting individuals’ privacy did not seem to be associated with any specific market, protecting banking transactions clearly was. IBM quickly realized the business potential of integrating cryptographic technologies into the data processing infrastructure undergirding the deployment of networked ATMs (introduced in the United States in 1968). Aware of the increasing need for a cryptographic algorithm suitable for industry purposes, the National Bureau of Standards (NBS) solicited proposals for a standard in 1973.
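The interleaving described above (confusion from small substitution tables, diffusion from bit permutations, repeated over sixteen rounds on a sixty-four-bit block) can be sketched in a few lines. The S-box, the rotation, and the round-key schedule below are invented placeholders rather than the published Lucifer or DES components; the sketch is only meant to illustrate the round structure and its invertibility.

```python
# Minimal sketch of a Feistel-style cipher: 16 rounds over a 64-bit block,
# each round interleaving confusion (a 4-bit substitution table) with
# diffusion (a bit rotation). All constants are illustrative placeholders,
# not the tables of Lucifer or DES.

TOY_SBOX = [0x3, 0xE, 0x1, 0xA, 0x4, 0x9, 0x5, 0x6,
            0x8, 0xB, 0xF, 0x2, 0xD, 0xC, 0x0, 0x7]

def round_function(half: int, round_key: int) -> int:
    """Confusion then diffusion on a 32-bit half-block."""
    mixed = half ^ round_key                      # key mixing
    subbed = 0
    for i in range(8):                            # confusion: substitute each nibble
        nibble = (mixed >> (4 * i)) & 0xF
        subbed |= TOY_SBOX[nibble] << (4 * i)
    # diffusion: rotate left by 11 so changes spread across the half-block
    return ((subbed << 11) | (subbed >> 21)) & 0xFFFFFFFF

def feistel(block: int, round_keys, decrypt: bool = False) -> int:
    """Run 16 Feistel rounds on a 64-bit block; reversed keys invert it."""
    left, right = block >> 32, block & 0xFFFFFFFF
    keys = reversed(round_keys) if decrypt else round_keys
    for k in keys:
        left, right = right, left ^ round_function(right, k)
    return (right << 32) | left                   # final swap of the halves

if __name__ == "__main__":
    keys = [(0x9E3779B9 * (i + 1)) & 0xFFFFFFFF for i in range(16)]  # toy schedule
    plaintext = 0x0123456789ABCDEF
    ciphertext = feistel(plaintext, keys)
    assert feistel(ciphertext, keys, decrypt=True) == plaintext
    print(f"plaintext  {plaintext:016x}")
    print(f"ciphertext {ciphertext:016x}")
```

Running the script encrypts a single block and checks that the same network, fed the round keys in reverse order, recovers the plaintext. That the round function need not itself be invertible, only cheap to compute, is what made the construction so well suited to simple hardware.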
Selection as a federal standard would be premised on meeting two distinct conditions: the algorithm would be forwarded to the National Security Agency (NSA) for evaluation, and the winner would grant nonexclusive royalty-free licenses to design, build, use, and sell hardware and software implementing the cipher.51 The only game in town at the time, Lucifer was selected, and the details of the algorithm were published in March 1975 in the Federal Register as the Data Encryption Standard (DES, pronounced dez), with requests for comments from industry and the public.52

Almost immediately, critics pointed to the suspicious reduction of the original 64-bit key length to 56 bits, with the assignment of 8 bits to error correction. The move seemed to suggest an attempt “to intentionally reduce the cost of exhaustive key search by a factor of 256.”53 Martin Hellman, a soon-to-be-famous Stanford professor of electrical engineering, wrote to the NBS that “Whit Diffie and I have become concerned that the proposed data encryption standard, while probably secure against commercial assault, may be extremely vulnerable to attack by an intelligence organization.”54 Diffie and Hellman subsequently published an extensive analysis of the standard, claiming that “using the simplest cryptanalytic attack [exhaustive key search], a $20 million machine can be built to break the proposed standard in about 12 hours of computation time . . . major intelligence agencies possess the financial...
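The arithmetic behind both complaints can be made explicit. The key lengths and the 12-hour figure are those quoted above; the implied key-testing rate is a back-of-the-envelope derivation, not a number from the source.

```python
# Unpacking the key-length criticism of DES. The 56-bit and 64-bit key
# lengths and the 12-hour figure are quoted in the text; the implied
# testing rate is derived here, not sourced.

KEY_BITS_PROPOSED = 56          # DES as published in the Federal Register
KEY_BITS_ORIGINAL = 64          # Lucifer's full key width

# Dropping 8 key bits shrinks the exhaustive-search space by 2^8 = 256.
reduction = 2 ** (KEY_BITS_ORIGINAL - KEY_BITS_PROPOSED)
print(f"speedup of exhaustive search from the shorter key: {reduction}x")

# What a 12-hour exhaustive search over 2^56 keys would imply:
keyspace = 2 ** KEY_BITS_PROPOSED    # about 7.2e16 possible keys
expected_trials = keyspace // 2      # on average, half the keys are tried
seconds = 12 * 60 * 60
print(f"implied rate: about {expected_trials / seconds:.1e} key trials per second")
```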
