Nuclear Energy, Historical Evolution of the Use of

The history of nuclear energy is a story of technical prowess, global politics, unfulfilled visions, and cultural anxiety. The technology's evolution in the second half of the twentieth century progressed through several stages: theoretical development by physicists; military application as atomic weapons in World War II; commercialization by the electrical industry in several industrialized nations; proliferation (for military and nonmilitary uses) among less developed nations; crises spawned by power plant accidents, cost overruns, and public protests; and retrenchment and slowdown in the last few decades of the twentieth century. Although it is by far the most potent form of energy yet harnessed by humankind, nuclear power has not become the dominant form of energy because of the great economic costs and social risks associated with its use.

MILITARY ORIGINS

The concept of "atoms" dates back to the ancient Greeks, who speculated that the material world was composed of tiny elemental particles, and for centuries thereafter alchemists attempted to unlock the secrets of the elements. But modern atomic science did not emerge until the turn of the twentieth century. In 1896 Henri Becquerel of France discovered radioactivity, and Albert Einstein derived the mass-energy relationship (E = mc²) in 1905. By the 1930s, scientists in several countries were making progress toward understanding nuclear reactions, including Ernest Rutherford and James Chadwick in Great Britain, Enrico Fermi in Italy, Niels Bohr in Denmark, and Ernest O. Lawrence in the United States. The key breakthrough came in December 1938, when the German chemists Otto Hahn and Fritz Strassmann achieved the first atomic fission, splitting atoms of uranium into lighter elements by bombarding them with neutrons and releasing enormous amounts of energy in the process.
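The scale of that release follows directly from Einstein's formula. As a rough, illustrative calculation (using standard nuclear data rather than figures from this article), each fission of a uranium-235 nucleus converts roughly 0.2 atomic mass units (u) of matter into energy, on the order of 200 MeV, millions of times more than a typical chemical reaction:

\[
E = \Delta m \, c^{2} \approx 0.2\ \mathrm{u} \times 931.5\ \mathrm{MeV/u} \approx 190\ \mathrm{MeV\ per\ fission}
\]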

The news spread quickly, and became charged with implications as Hitler's armies began their march through Europe. Two days before the outbreak of World War II in Europe in 1939, Bohr and John Wheeler of Princeton published an academic paper on fission. Several leading physicists fled Germany and Stalin's Soviet Union for the United States, including the Hungarian refugee Leo Szilard. Fearful that the Nazis might build a powerful atomic bomb, Szilard and fellow Hungarian émigré Eugene Wigner convinced Einstein (then at the Institute for Advanced Study in Princeton) to write President Franklin D. Roosevelt to warn of the possibilities of atomic weaponry and to suggest U.S. action.

Einstein's letter (dated August 2, 1939) had little immediate impact, however. What stirred the U.S. government into action were reports out of Great Britain in the early 1940s: one from the refugee physicists Rudolf Peierls and Otto Frisch in 1940, which discussed the possibility of making a "super-bomb" from a uranium fission chain reaction; and a study from the top-secret British "MAUD Committee" in 1941, which deemed a uranium bomb "practicable and likely to lead to decisive results in the war," and which urged the United States to make development of an atomic bomb its "highest priority."

At the time, the scarcity of fissionable material—whether natural uranium or man-made plutonium—seemed the greatest barrier to atomic bomb production. Now determined to win the atomic bomb race, President Roosevelt approached Colonel (soon-to-be Brigadier General) Leslie Groves of the U.S. Army Corps of Engineers to head what was code-named the "Manhattan Engineer District" (later popularly known as the Manhattan Project), America's atomic bomb program. It was a massive, sprawling effort that ultimately encompassed several leading university laboratories, three giant manufacturing sites, tens of thousands of construction workers, the world's best scientific talent, several major corporations, and some $2.2 billion in federal funds.

The project's two key scientific advisers were Vannevar Bush, an electrical engineer who headed the Office of Scientific Research and Development and a former dean of engineering at the Massachusetts Institute of Technology; and James Conant, a chemist, chair of the National Defense Research Committee, and president of Harvard University. Groves also recruited the Nobel laureate Arthur Compton, who in turn recruited Wigner and Szilard as well as Nobel laureates Fermi and James Franck to his burgeoning research laboratory at the University of Chicago (code-named the "Metallurgical Laboratory"). Meanwhile, the army brought in Boston-based Stone & Webster as principal engineering contractor, along with the giant Du Pont chemical firm, which had no experience in plutonium production but took on the work for cost plus $1. After Fermi—working in a rackets court under the stands at the University of Chicago's Stagg Field—achieved the first self-sustaining nuclear chain reaction in December 1942, the Manhattan Project settled on water as a reactor coolant and forged ahead with construction plans.

At Hanford, Washington, along the Columbia River, the project built three production piles (reactors) and four plants for separating plutonium from other elements. At Oak Ridge, Tennessee (near Knoxville), it built uranium-235 facilities utilizing three processes—thermal diffusion, gaseous diffusion, and electromagnetic separation; the last, under the direction of Lawrence, proved to be the most successful. And at Los Alamos, New Mexico (near Santa Fe), the Manhattan Project in 1943 began building a facility for making both plutonium and uranium bombs under the direction of the brilliant Harvard- and Göttingen-educated physicist J. Robert Oppenheimer. Like the Chicago-based scientists before them, Oppenheimer and his researchers often clashed with Groves and the project engineers, who preferred to compartmentalize and control information about the project rather than let the scientists exchange it freely. At Los Alamos, Oppenheimer's approach prevailed.

Confident that the uranium bomb would detonate with a gun-type assembly but less sure of the plutonium bomb's "implosion" detonation device (invented by physicist Seth Neddermeyer), the scientists tested a plutonium bomb—dubbed "Fat Man" in honor of Winston Churchill—at Alamogordo, New Mexico, on July 16, 1945. The awesome power of the "Trinity" test immediately inspired doubts and fears about a nuclear future, even among several Manhattan Project scientists. Szilard, Franck, and others urged the U.S. military to demonstrate the atomic bomb to the Japanese in an uninhabited area, but the momentum of the project, and political pressure for a rapid end to the war, were too great. On August 6, the U.S. B-29 bomber Enola Gay dropped a uranium atomic bomb ("Little Boy") that detonated over Hiroshima, Japan, at 8:15 A.M., killing an estimated 75,000 to 100,000 people instantly and another 200,000 from radiation over the next five years. Three days later, the United States destroyed Nagasaki with a plutonium bomb, and the Japanese surrendered shortly thereafter. By building the largest government-business-university collaboration in its history, the United States had harnessed atomic energy and brought World War II to a rapid end.

Over the next several years, the atomic bomb helped usher in a new kind of geopolitical conflict: the Cold War. Rather than clashing with conventional weapons, the two postwar superpowers—the democratic United States and the Communist Soviet Union—relied increasingly on their growing arsenals of nuclear weapons to protect their spheres of influence and to deter encroachment with the threat of mutual destruction. Tensions in the early Cold War were heightened when the United States refused to share atomic bomb technology with its erstwhile ally the Soviet Union. When the USSR successfully tested its own atomic bomb in 1949, the "arms race" between the superpowers for nuclear superiority was on. In October 1952 the United States tested its first hydrogen bomb, at Eniwetok Atoll in the Pacific. By fusing light forms of hydrogen to create helium—the reaction responsible for the Sun's energy—the "H-bomb" exploded with 1,000 times the power of its fission predecessor, the equivalent of some 10 million tons of TNT. The USSR had the H-bomb by 1953. The United States then began developing new, solid-fuel intercontinental ballistic missiles (ICBMs) to deliver warheads from the U.S. western plains deep into the Soviet Union.

The post–World War II generation was the first to live under the shadow of total and instantaneous annihilation, a reality that—coming on the heels of the Holocaust and the massive military horrors of World War II—gave rise to a new nihilism among some philosophers and social thinkers. Psychologists began to probe the possible mental health consequences of life in the atomic age, although many concluded that nuclear apocalypse was too vast and horrible to comprehend. And, not surprisingly, the atomic bomb began to find its way into everyday life and popular culture, sometimes in bizarre and even lighthearted ways.

Communities began to refashion their civil defense procedures to accommodate the bomb. In the United States, this meant designating bomb shelters—usually in the basements of schools and other public buildings, but also involving construction of individual shelters beneath the yards of single-family homes in the nation's burgeoning postwar suburbs. Stocked with canned foods and other provisions, these underground chambers typically were small and spartan, although some realtors seized the opportunity to offer luxury models with modern conveniences. While never ubiquitous, individual bomb shelters nevertheless became normalized, as reflected in a 1959 Life magazine story about a newlywed couple who spent two weeks of "unbroken togetherness" honeymooning in their bomb shelter.

The atomic bomb also became a key subject in American film of the 1950s and early 1960s, particularly in the science fiction genre. While some films spun out scenarios about how the nuclear powers might accidentally bring on Armageddon—as in Fail-Safe and Stanley Kubrick's classic tragicomedy Dr. Strangelove—most focused on the insidious and little-understood effects of radiation on human and animal life. However implausible the premises of The H-Man, Attack of the Crab Monsters, The Incredible Shrinking Man, Attack of the Fifty-Foot Woman, and their ilk, such fantasies resonated with widespread anxieties about genetic damage from atomic fallout.

Although government officials attempted to educate the public and military personnel about atomic civil defense, in retrospect these efforts seem hopelessly naive if not intentionally misleading. Army training films advised soldiers to keep their mouths closed while observing atomic test blasts so as not to inhale radioactive flying dirt. Civil defense films used a friendly animated turtle to teach schoolchildren to "duck and cover" during a nuclear attack—that is, to duck under their desks and cover their heads. Such measures, of course, would have offered pitiful protection to anyone in the blast zone.

Opinion polls showed that American anxiety about the atomic bomb ebbed and flowed in response to geopolitical events. Concerns ran high in the late 1940s in the wake of the atomic bombings of Japan (many wondered whether the weapon would be used in all military conflicts), and as the Soviet Union undertook a crash program in rocketry and atomic-bomb development. These fears cooled temporarily in the early 1950s, particularly after the death of Soviet dictator Joseph Stalin in 1953 raised hopes for U.S.-Soviet rapprochement. But ICBM development and a wave of H-bomb testing in the Pacific in 1954 stirred up renewed public fears about fallout, especially after milk in heartland cities such as Chicago was found to contain elevated levels of radioactive isotopes. The national heart rate spiked during the tense days of the Cuban Missile Crisis in 1962, then slowed after the United States and the Soviet Union signed the Partial Test-Ban Treaty in 1963. Whereas 64 percent of Americans identified the threat of nuclear war as their leading concern in 1959, only 16 percent put the same concern first in 1964.

"ATOMS FOR PEACE": THE ORIGINS OF THE NUCLEAR POWER INDUSTRY

The Atomic Energy Act of 1946 represented the interests of American scientists who wished to see nuclear energy developed for nonmilitary purposes. It called for the establishment of a five-member civilian Atomic Energy Commission (AEC), which could deliver weapons to the military only on presidential order. But the military tensions of the early Cold War delayed civilian nuclear power development; by 1948, 80 percent of the AEC's budget went to military ends. As late as 1951, U.S. civilian nuclear power development consisted of only a small experimental government (liquid-metal) reactor in Idaho.

Through the efforts of Captain Hyman G. Rickover, a naval engineering officer, the U.S. Navy made rapid strides in nuclear ship and submarine development. Garnering support from Edward Teller and other key figures outside the navy, Rickover brought in Westinghouse, General Electric, and the Electric Boat Company to construct the Nautilus, the world's first nuclear submarine. First tested in 1955, the Nautilus ran faster, and ten times farther without surfacing, than conventional submarines. Nuclear-powered aircraft carriers and other surface ships followed, and by the 1960s U.S. nuclear submarines were equipped with solid-fuel nuclear missiles. In contrast, the U.S. merchant marine did not emphasize nuclear power. Its demonstration vessel, the Savannah, operated by the U.S. Marine Corporation beginning in 1959, used a pressurized-water reactor instead of a boiling-water reactor and required a heavy government operating subsidy.

In 1953 the AEC began planning a full-scale nuclear plant at Shippingport, Pennsylvania. The plant was to be owned and operated by Duquesne Light Company, managed by Rickover, and equipped with a version of the navy's pressurized-water reactor. But rapid U.S. nuclear power development came only in the wake of the new Atomic Energy Act of 1954, which permitted private power companies to own reactors and to patent nuclear innovations, and the Price-Anderson Act of 1957, which limited liability for individual companies to $560 million and provided government subsidies for liability insurance. Shippingport went on line in 1957, and the first privately financed American plants followed in the early 1960s.

The USSR operated the first nuclear power plant to supply a national grid, at Obninsk, south of Moscow, in 1954. This modest (5,000-kW) plant was the opening wedge in an aggressive Soviet drive for nuclear energy, as reflected in Lenin's slogan "Communism equals Soviet power plus electrification of the entire country." Two years later, the United Kingdom opened a full-scale commercial nuclear plant at Calder Hall, a facility designed to produce plutonium for defense with electricity as a by-product. France ran an experimental reactor at Marcoule that year, but began its commercial program with the Électricité de France reactor at Chinon in 1962. By 1965 the United Kingdom led in nuclear generating capacity with 3.4 million kW, followed by the United States (1.2 million kW), the USSR (895,700 kW), Italy (622,000 kW), and five other nations with at least 500,000 kW each. Within a few years the United States took the lead in technology and total output, its pressurized-water and boiling-water reactors supplied with enriched fuel from Manhattan Project plants.

These were heady times for the nuclear industry. Well-informed experts predicted that electricity soon would become "too cheap to meter." At the 1964–1965 New York World's Fair, General Electric's "Progressland" exhibit featured "the wonders of atomic energy," and the company's Medallion City claimed to produce fusion energy (in a 0.000006-second, 100-million-degree flash). Fusion energy, said General Electric, was going to supply a billion years of electrical energy.

Nuclear power developed unevenly across the globe. In 1987 the United States operated 110 of the world's 418 nuclear plants, the USSR 57, France 49, the United Kingdom 38, Japan 37, Canada 19, and Sweden 12, while some regions—the Arab states, Africa (except South Africa), and most of Central America—had few or none. Although the United States produced a third of the world's electricity, it derived only about 15 percent of it from nuclear energy.

In each country, the pattern of nuclear development reflected national technical and economic resources, politics, and culture. In France, for example, national economic planners ensured the rapid growth of nuclear energy by limiting citizen participation and by standardizing reactor design. The Federal Republic of Germany followed a style closer to that of the United States, and thus sustained greater political challenges from its environmentally sensitive Green Party.

In balancing development with social safety, the Soviet Union was a grim outlier. Recent research has revealed a series of Soviet nuclear accidents, many of them concealed from the outside world, beginning with a dramatic waste dump explosion in 1957 near Kyshtym in the Urals that spread more than 2 million curies of radioactivity over 20,000 square miles. The explosion of Unit 4 at Chernobyl, north of Kiev, on April 26, 1986, was the worst nuclear accident on record. It occurred while operators were testing a voltage-regulating scheme on turbogenerator 8 with the coolant pumps slowed. Vigorous boiling produced excess steam, which reduced neutron absorption and increased heat in a "positive feedback" cycle that ended in meltdown and explosion. By early May, airborne contamination had spread across much of Europe. Thirty-one people were killed, 200 suffered radiation sickness, hundreds of thousands were confined indoors or evacuated, foods suspected of contamination were banned, and scientists projected some 1,000 extra cancer deaths over the next fifty years. Perhaps most troubling, follow-up investigation faulted Soviet plant design as much as or more than operator error. Following the accident, public opinion shifted sharply against nuclear energy in Europe; in the United States, the tide already had turned.

NUCLEAR POWER UNDER SIEGE

Widespread anxieties about the atomic bomb in the 1950s spurred few people into action. One notable exception in the United States was the Greater St. Louis Committee for Nuclear Information, founded by biologist Barry Commoner and other scientists at Washington University in 1958. This watchdog group published the newsletter Nuclear Information, renamed Scientist and Citizen in 1964. For the most part, however, citizens of the two superpowers and their satellites or allies saw nuclear weapons as a necessary tool in the global contest between democracy and communism. Indeed, for many, atomic supremacy was the most accurate measure of national prowess.

This changed dramatically in the 1960s and 1970s, as antinuclear opinion shifted its focus from weapons to nuclear power and spread from activist groups into a large segment of the middle class. Antinuclear activism was strongest in the two nations with large nuclear power programs and the most open political systems—the United States and West Germany—although there were notable institutional and social differences between the two. And while antinuclear activism surely slowed the progress of nuclear power development in many countries, it was only one of a cluster of economic, technological, political, and social forces constraining the industry.

Nuclear energy opponents initially targeted thermal pollution—the harm done to local aquatic life when nuclear power plants raised water temperatures by several degrees. Attention soon shifted to the question of radioactive contamination of cooling water. When two scientists at Lawrence Livermore Laboratory in California—John Gofman and Arthur Tamplin—argued that the AEC's acceptable exposure level of 170 millirems per year would cause some 16,000 additional deaths (they later revised the figure to 74,000), an intense debate ensued that eventually led to a U.S. Senate inquiry and to a new standard of 25 millirems. Meanwhile, news of nuclear plant accidents in foreign countries reached the United States, which endured an accident of its own at Idaho Falls in 1961. New guidelines for remote nuclear plant siting followed, which resulted in scuttled plans for a Consolidated Edison plant in Queens, New York, and another in Los Angeles. Antinuclear activism was strongest in southern California, where protesters managed to stop the construction of a Pacific Gas & Electric plant at Bodega Bay in 1963 because it was near an earthquake fault.

The 1970s were hard times for the nuclear industry. The decade opened with the first Earth Day (April 22, 1970), which featured thousands of teach-ins, many of them aimed at halting further nuclear power development, and ended with the accident at the Three Mile Island nuclear plant in Pennsylvania. In between, the nuclear industry sustained increasingly sophisticated attacks from increasingly well-organized opponents, such as the Environmental Defense Fund (founded in 1967), Ralph Nader's Critical Mass Project in Washington, D.C., and the Union of Concerned Scientists, a consortium of scientists and engineers founded at MIT in 1969. In 1976 California enacted legislation that halted new nuclear plant construction until the federal government found a satisfactory way to dispose of radioactive wastes. In New Hampshire, the Clamshell Alliance was formed to oppose the construction of a nuclear plant at Seabrook; the fight dragged on for years, involved tens of thousands of demonstrators, and helped force the plant's owner, Public Service Company of New Hampshire, to cancel plans for one of its two reactors and to declare bankruptcy in 1988—the first major American utility to go bankrupt since the Great Depression.

In the Seabrook case, protests caused delays that contributed to large cost overruns. But Seabrook was an exception; most nuclear utilities got into financial trouble with little help from protesters. Although oil prices rose dramatically in the 1970s—ordinarily a spur to nuclear development—the "stagflation" of the era cut the growth of electricity demand from 7 percent to 2 percent per year and drove interest rates into double digits. Between 1971 and 1978, nuclear capital costs rose 142 percent, making nuclear plants more expensive to build, per kilowatt of capacity, than new fossil fuel plants.

No case better illustrates the travails suffered by the nuclear power industry and its customers than the saga of the Washington Public Power Supply System (WPPSS). Thanks to abundant Columbia River hydropower, the Pacific Northwest enjoyed the nation's lowest electricity rates in the 1950s. But the prime hydropower sites had been exploited by the 1960s, and demand for electricity continued to rise. In 1968 some one hundred utilities in the region, working through WPPSS, financed a ten-year, $7 billion Bonneville Power Administration plan to improve the region's hydropower, transmission, and distribution assets and to build seven new thermal power plants, most of them nuclear. Construction costs skyrocketed, however; the second power plant (at Hanford, Washington), projected to cost $352 million in 1977, had consumed nearly $2 billion by 1980. Struggling to avoid bankruptcy, Bonneville boosted its wholesale rates 700 percent between 1979 and 1984. Irate ratepayers dubbed the project "WHOOPS."

Viewed in this context, the Three Mile Island (TMI) accident was the coup de grâce for an already foundering industry. Although the hydrogen gas bubble that accumulated in the Unit 2 reactor did not explode (some contaminated gas did escape), and although the commissions that investigated the accident faulted human error rather than equipment failure, TMI caused what the New York Times called "a credibility meltdown" for the nuclear industry. (The release of a major motion picture with an eerily similar scenario, The China Syndrome, a few weeks before the real accident amplified the public relations crisis.) The TMI cleanup cost roughly $1 billion, less than a third of it covered by insurance. The Nuclear Regulatory Commission, a federal agency, enforced an informal moratorium on new plant licenses pending further investigation of TMI and soon demanded expensive retrofitting of similar plants. Yet the industry's stagnation predated the accident: no new reactors have been ordered in the United States since 1978.

NUCLEAR ENERGY AND NUCLEAR DEFENSE FROM THE LATE 1970S TO THE PRESENT

As they did at the dawn of the atomic age, nuclear weapons have overshadowed nuclear energy since the late 1970s. The shift began when the Carter administration moved its Soviet policy from détente to rearmament and ordered 200 new MX missiles. By then the nonprofit Union of Concerned Scientists—perhaps seeing civilian nuclear energy as moribund—had shifted its attention to weapons. The Cambridge, Massachusetts–based Physicians for Social Responsibility, led by Helen Caldicott, began to publicize the horrors of nuclear war, garnered a large grant from the Rockefeller Foundation, and saw its membership surge from 500 in 1978 to 16,000 in 1980. At the same time, a movement favoring a freeze on nuclear weapons began to take shape, with 1980 presidential hopefuls Governor Jerry Brown of California and Senator Edward Kennedy of Massachusetts joining the cause.

The election of Ronald Reagan to the White House in 1980 heightened nuclear tensions. Reagan dubbed the Soviet Union an "evil empire" and spoke of first-strike capabilities and strategies of limited nuclear war. Supporters of the 1981 Nuclear Weapons Freeze Campaign, in contrast, called for a verifiable treaty with the Soviet Union to freeze the testing, production, and deployment of nuclear weapons. In 1983 Reagan announced the extravagant Strategic Defense Initiative, a plan to shield the United States from nuclear attack using space-based missile interceptors and other yet-to-be-developed technologies. Endorsed by atomic bomb pioneer Edward Teller, the project was so expensive and technically speculative that it became popularly known as "Star Wars," after the Hollywood space fantasy. Although the president called it a purely defensive measure, many foreign-policy experts saw it as a form of destabilizing escalation. Others considered it a ploy to force the USSR into heavy military spending (to achieve parity with Star Wars) at a time when the Soviet empire was undergoing political and economic stress under the reformist regime of Mikhail Gorbachev.

The question of how much Star Wars may have contributed to the 1991 collapse of the Soviet Union continues to generate debate. But the breakup of one of the postwar nuclear superpowers clearly ushered in a new era of nuclear quiescence. To be sure, the fate of the former Soviet Union's tens of thousands of warheads remains a vital concern, especially in light of the region's growing economic pressures. For example, will unpaid military officers be tempted to sell nuclear weapons to well-funded terrorist groups? In the post-Soviet world, the threat of nuclear warfare is no longer seen as a matter of global annihilation but rather as a risk in political hot spots such as the Middle East, Korea, or—if Iraq's Saddam Hussein should acquire nuclear technology—the United States as a possible target. And dreams of a Star Wars–like defense live on; late in his final term, President Bill Clinton announced plans to consider the development of a national missile defense system reminiscent of, though less elaborate than, the Reagan plan.

Although reactor building continues steadily in France and Japan, and may take hold in parts of the less developed world, nuclear power will need a major technological breakthrough (or a fossil fuel energy crisis) to make a comeback in the United States and Germany. One possibility is the fusion reactor, which produces nuclear energy by combining two light nuclei (such as deuterium) into a single heavier one that weighs less than the sum of its parts, the difference being released as energy. Fusion produces less radioactive waste than fission and faces no fuel constraints (deuterium is found in ordinary water). The problem is that scientists have not been able to combine light atoms simply by collision—they bounce off each other—but must instead accelerate them with heat so intense that it either consumes as much energy as the reaction produces or (at much higher levels) cannot be contained safely.
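To make the mass-to-energy bookkeeping concrete, consider a rough worked example (using standard isotope masses, not figures from this article) for the deuterium-tritium reaction most fusion programs pursue:

\[
{}^{2}\mathrm{H} + {}^{3}\mathrm{H} \rightarrow {}^{4}\mathrm{He} + n
\]
\[
\Delta m \approx (2.01410 + 3.01605) - (4.00260 + 1.00867) = 0.01888\ \mathrm{u}
\]
\[
E = \Delta m \, c^{2} \approx 0.01888\ \mathrm{u} \times 931.5\ \mathrm{MeV/u} \approx 17.6\ \mathrm{MeV}
\]

The energy per reaction is enormous for the mass involved, but harvesting it requires sustaining such collisions at temperatures of roughly 100 million degrees, which is the containment and break-even problem described above.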

Since the earliest days of the atomic age, physicists and engineers have predicted the coming of practicable nuclear fusion within "ten years" or "a generation." History therefore offers many reasons to be skeptical about the promise of nuclear energy. At the same time, this unparalleled form of energy is not going to return to the Pandora's box pried open by the Manhattan Project more than a half century ago.

David B. Sicilia

See also: Einstein, Albert; Emission Control, Power Plant; Energy Management Control Systems; Environmental Problems and Energy Use; Ethical and Moral Aspects of Energy Use; Explosives and Propellants; Historical Perspectives and Social Consequences; Matter and Energy; Military Energy Use, Historical Aspects of; Nuclear Energy; Nuclear Fission; Nuclear Fission Fuel; Nuclear Fusion; Nuclear Waste.
