History of Nuclear Power in the United States and Worldwide

Written by Karen W. Lowrie, with comments by Thomas Cotton

Background

The concept of the atom has existed for many centuries, going back to the Greek philosophers who surmised that all matter is made up of tiny particles (atomos is Greek for indivisible). By the turn of the 20th century, physicists such as Ernest Rutherford, called the “father of nuclear science,” suspected that the atom, if disintegrated into smaller particles, could release a large amount of energy. Following Rutherford’s early experiments with radioactivity, other scientists discovered that radioactive elements have a number of isotopes (different forms of an element having the same number of protons in the nucleus but different numbers of neutrons) and that artificial radionuclides could be produced by bombarding atoms with protons, and more effectively with neutrons. In 1934, the Italian physicist Enrico Fermi showed that neutrons can split many kinds of atoms. His experiments with splitting uranium produced elements much lighter than uranium. In similar experiments, the German scientists Otto Hahn and Fritz Strassmann split uranium (atomic number 92) with neutrons to create elements with only about half the atomic mass. Another scientist, Lise Meitner, further showed that the splitting of the nucleus converted some of the original mass into energy, confirming Albert Einstein’s theory about the relationship of mass to energy.

A group of scientists from Europe and the United States began to believe that a self-sustaining chain reaction, one that would create large amounts of energy, might be possible. Their belief was based on the discovery that fission releases not only energy but also additional neutrons that cause fission in other uranium nuclei, which would happen with enough uranium under the proper conditions. The amount of uranium needed to make a chain reaction self-sustaining is called a critical mass. The chain reaction concept was important to the development of atomic bombs. Power stations would have to introduce neutron-absorbing material to control the chain of nuclear reactions, even though the uranium used in nuclear power stations does not contain enough of the readily fissionable isotope of uranium to allow a nuclear explosion under any conditions.

By November of 1942, scientists gathered at the University of Chicago had begun construction of the world’s first nuclear reactor, which became known as Chicago Pile-1. The pile, consisting of uranium and cadmium control rods (to absorb neutrons) placed in a stack of graphite, was erected on the floor of a squash court beneath the University of Chicago’s athletic stadium. When the control rods were pulled out on December 2, 1942, more neutrons were available to split atoms, and the chain reaction sped up and became self-sustaining. With the first nuclear reactor, Fermi and his team ushered the world into the nuclear age.

Most of the early atomic research focused on developing an effective weapon for use in World War II, under the code name Manhattan Project. But after the war was over, the U.S. Congress created the Atomic Energy Commission (AEC) to explore peaceful uses of nuclear technology, and researchers worldwide began to focus on ways to harness the tremendous power of nuclear reactions for the generation of electricity. Immediately after the war, reactor research was kept under very strict government control.
The AEC oversaw the construction of the first experimental breeder reactor in Idaho, at a site selected in 1949 as the National Reactor Testing Station. A breeder reactor produces both energy and additional fissionable material in the chain reaction. This reactor was completed in 1951 and became the first to produce electricity from nuclear energy. The BORAX III experimental reactor began producing power for Arco, Idaho, on July 17, 1955. A major goal of nuclear research in the mid-1950s was to show that nuclear energy could produce large amounts of electricity for commercial use. The government emphasized the beneficial uses of the atom and distanced it from the vision of the destructive mushroom cloud. After delivering his “Atoms for Peace” speech at the United Nations in 1953, which called for greater international partnership in developing nuclear energy, President Eisenhower signed the Atomic Energy Act of 1954, spurring the expansion of a civilian nuclear power program. The AEC sponsored the world’s first large-scale commercial electricity plant powered by nuclear energy, located in Shippingport, Pennsylvania. It became operational in 1957. Private industry became more involved in development, and the nuclear power industry in the United States...