Nuclear Weaponry


Overview

Humans have warred against one another since the beginning of time. Until recently, this tendency to wage war endangered hundreds, thousands, or even millions of people, but never our existence as a species. The advent of nuclear weapons, however, has changed the potential outcome of war. First envisioned simply as larger bombs, nuclear weapons ended World War II, precipitated a decades-long Cold War, and functioned as a precarious deterrent to a Third World War, while at the same time making all humanity fear the consequences of their use. For decades the nuclear club was limited to a handful of nations, but it expanded in the late 1990s and threatens to grow again. Paradoxically, while the end of the Cold War reduced the chance of global nuclear annihilation, it may have increased the risk of smaller-scale nuclear war between nations with recently acquired nuclear weapons.

Background

In the final days of World War II, while America and her allies were readying for a costly invasion of Japan, two nuclear weapons were dropped on Japanese cities, promptly ending the war. The enormous research and development effort that went into the design and construction of these weapons may well represent the greatest scientific and technical effort in human history. The Manhattan Project scientists, 15 of whom were Nobel laureates, formed arguably the greatest collection of intellect ever assembled. The irony is that this tremendous effort was made to produce the first weapon powerful enough to threaten human existence and that of most other species on Earth.

Nuclear fission had an inauspicious beginning. In fact, Italian-born American physicist Enrico Fermi (1901-1954) was thought to have shown in the early 1930s that fission was impossible. He received a Nobel Prize for creating what he believed were transuranic (heavier than uranium) elements by bombarding uranium with neutrons. Soon afterwards, however, Lise Meitner (1878-1968) and Otto Hahn (1879-1968) showed that Fermi's transuranics were actually fission products, created by splitting uranium atoms. Frédéric Joliot-Curie (1900-1958), who with his wife Irène Joliot-Curie (1897-1956) had been awarded the 1935 Nobel Prize for producing new radioactive elements, then showed with his colleagues that more neutrons are released from each nucleus during fission than are needed to cause fission. This sparked the idea of a self-sustaining chain reaction if the process were properly controlled, or of a nuclear weapon if it were deliberately left uncontrolled.
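The arithmetic behind that idea can be sketched simply. The short Python sketch below is illustrative only: the constants (average neutrons per fission, fraction that go on to cause new fissions) are assumed round numbers, not measured weapon or reactor values, and the function name is invented for this example. It shows why releasing more than one usable neutron per fission leads to geometric growth when the reaction is left uncontrolled, and to a steady rate when the multiplication is held at exactly one.

```python
# Illustrative sketch only: an idealized, generation-by-generation count of
# fissions in a chain reaction. The numbers below are assumed for illustration
# and are not measured weapon or reactor values.

NEUTRONS_PER_FISSION = 2.5      # assumed average neutrons released per fission
FRACTION_CAUSING_FISSION = 0.8  # assumed fraction that split another nucleus

# Effective multiplication factor: fissions in one generation per fission in the previous one.
k = NEUTRONS_PER_FISSION * FRACTION_CAUSING_FISSION

def fissions_in_generation(n: int, start: float = 1.0) -> float:
    """Fissions occurring in the nth generation of this idealized chain reaction."""
    return start * k ** n

if __name__ == "__main__":
    for n in (1, 10, 40, 80):
        print(f"generation {n:3d}: about {fissions_in_generation(n):.3e} fissions")
    # With k > 1 the count grows geometrically (an uncontrolled chain reaction);
    # holding k at exactly 1, as a reactor does, keeps the rate constant.
```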

The impetus for developing American nuclear weapons was a carefully worded letter, drafted with the help of a number of prominent scientists and signed by Albert Einstein (1879-1955), that was delivered to President Franklin D. Roosevelt. This letter outlined the possibility that a nuclear weapon could be developed—and what could happen if Nazi Germany developed it first. A premier research team, the Manhattan Project, was subsequently assembled. Charged with creating the first atomic weapon, its members worked in strict secrecy. The German atomic weapons program, later revealed to have been misguided and misdirected, never reached its goal, and the European war ended before the Americans' new weapon was completed. The war with Japan, however, lasted long enough, and promised so bloody an end, that atomic bombs were called into use, bringing that war to a rapid conclusion with devastating effect on Japan.

The immediate postwar period saw the American monopoly on nuclear weapons vanish. The Soviet Union, followed by Britain and France, developed nuclear (and then thermonuclear) weapons. Later, China, South Africa, and a number of other nations followed suit (although South Africa abandoned its nuclear weapons program in 1989). A generally successful international program to halt nuclear proliferation helped limit the spread of these weapons, although India, Pakistan, and (many suspect) Israel did successfully design and build their own nuclear arsenals. Other so-called rogue states have attempted to design, purchase, or steal nuclear weapons or the technology to make them. Such attempts by Iraq under Saddam Hussein raised the stakes of the 1991 Persian Gulf War and resulted in the imposition of contentious United Nations inspections after that war concluded.

Impact

The development and fear of nuclear weapons was a defining aspect of the second half of the twentieth century. For many years, every scientific advance, athletic victory, family vacation, or school class existed in the shadow of imminent nuclear war. The Cold War spurred the space program, drove technological development, financed scientific research, invaded popular culture, and more. For nearly 50 years, people worldwide wondered if, or when, a nuclear war might occur, and many seriously debated whether it would be better to survive such a war or to be among the hundreds of millions killed.

As first conceived, nuclear weapons were simply a more efficient form of artillery. Little thought was given to fallout, cancer, genetic damage, or nuclear winter because these issues were beyond the experience of anyone on Earth. Gradually, awareness grew that nuclear war would bring with it all of these things. This growing awareness led to the atmospheric test-ban treaty; American chemist Linus Pauling (1901-1994) was awarded the Nobel Peace Prize for his campaign against nuclear testing. Massive demonstrations against nuclear weapons in Europe and the United States also led eventually to several treaties that first limited and later reduced the size of the superpowers' nuclear arsenals.

At the same time, research by Luis Alvarez (1911-1988) and Walter Alvarez showed that a major asteroid impact had likely caused the mass extinction of the dinosaurs. Additional research by Carl Sagan (1934-1996) and others showed that such impacts could create a global winter by raising clouds of dust that would block out the sun for several years. When people realized that nuclear war could do the same thing, nuclear winter became something more to worry about.

In many ways, the Cold War and nuclear weapons are inseparable. Only the fear of nuclear devastation kept the Cold War from turning hot, and even so, nuclear war nearly broke out on several occasions. At the same time, the prospect of mutually assured destruction (MAD) worked to prevent a seemingly inevitable conflict between the two antagonistic ideologies of democracy and communism. This standoff between the two opposed camps helped keep wars smaller in scale and regional in scope.

The amount of scientific research and technological advance wrought by the Cold War is remarkable. Both the American and Soviet space programs were products of national pride and fear of the other side's missiles. When the Soviet Union launched Sputnik, the U.S. military saw, instead of a small satellite in orbit, a potential bomb flying over the pole toward New York. The Cold War and the space program brought the world miniaturized electronics, home computers, wireless telephones, fuel cells, communications and weather satellites, and more. In addition, scientists in any number of specialties benefited, even if their research was only peripherally related to war. Astronomers, for example, benefited from advances in optics spurred by the Strategic Defense Initiative (SDI), popularly known as "Star Wars," while the Human Genome Project, initially conceived as a way to better understand the genetic effects of radiation, was initiated by the Department of Energy.

The SDI program warrants special mention. Begun under the Reagan administration in the early 1980s, this ambitious program aimed to design a space-based system that could detect and destroy the missiles of a full-scale Soviet nuclear attack. It is widely thought that Soviet fear of American technological prowess, and the rapid development of Star Wars, induced the Soviet government to spend more than it could afford on its military. This may well have hastened the collapse of the Soviet Union and the overthrow of Communist governments throughout Eastern Europe at the Cold War's close. Though SDI's initial goals may have been overly ambitious, subsequent efforts have turned to intercepting small-scale launches by rogue nations rather than stopping a full-scale attack.

The cultural aspects of nuclear weapons were important, too. Movies like the original Godzilla and Them! featured monsters created by exposure to radiation from nuclear-weapons testing. American schoolchildren were taught to "duck and cover" in case of a nuclear attack, and people dug bomb shelters beneath or beside their homes. Other movies, such as On the Beach, Dr. Strangelove, and Fail-Safe, looked at nuclear war from a number of angles. In some parts of the world, particularly Europe, these fears may have been even greater, as those nations sat between the two nuclear-armed superpowers.

The end of the Cold War removed many of these fears. Although the United States and Russia still possess enough nuclear weapons to destroy human civilization, both nations have agreed to reduce their stockpiles. That and other political events have since made an all-out nuclear war less likely. Replacing the fear of global nuclear catastrophe, however, is the fear of local nuclear war or of nuclear terrorism. The collapse and impoverishment of the former Soviet Union raised many fears that a combination of desperation, hunger, and poverty could lead former Soviet nuclear scientists to work for rogue governments, helping them obtain the materials and expertise needed to build their own weapons. Where Cold War movies explored total nuclear war, post-Cold War movies imagine nuclear terrorism.

We now find ourselves in a world dominated not by fears of global catastrophe, but, rather, by fears of nuclear terrorism or small-scale nuclear war between minor nuclear powers. It remains to be seen how and to what extent nuclear weapons will shape the geopolitical balance and social consciousness around the world.

P. ANDREW KARAM

Further Reading

Books

McPhee, John. The Curve of Binding Energy. New York: Noonday Press, 1994.

Rhodes, Richard. Dark Sun: The Making of the Hydrogen Bomb. New York: Simon and Schuster, 1995.

Rhodes, Richard. The Making of the Atomic Bomb. New York: Simon and Schuster, 1986.

Serber, Robert. The Los Alamos Primer: The First Lectures on How to Build an Atomic Bomb. Berkeley: University of California Press, 1992.

Other

Trinity and Beyond. 1997. Goldhill Videos. Videocassette.


THE DISCOVERY OF GAMMA RAY BURSTS

In an effort to better monitor nuclear weapons testing by the Soviet Union in the early 1960s, the United States began sending satellites into orbit that could detect gamma rays given off by nuclear reactions in atomic bombs. In conjunction with seismic readings obtained from stations built to detect atomic explosions, the satellites would let the United States know if the Soviet Union was testing weapons larger than allowed by treaty.

Shortly after being activated, these satellites, code-named Vela, began detecting gamma ray bursts (GRBs). However, the bursts were not associated with any seismic events resembling nuclear detonations, and their characteristics were far different from those expected of nuclear weapons tests. Scientists finally concluded that these bursts of gamma rays had to come from space, since there was no reasonable terrestrial source. Because the Vela program was top secret, this information was not released to the scientific community for many years.

Eventually, gamma ray bursts were detected by other scientists, who launched their own studies to investigate them. Although a number of theories were floated, the riddle began to be solved only in the late 1990s. Astronomers now believe that GRBs are produced by an extreme class of supernovae in galaxies far beyond our own. Because they can be detected across most of the observable universe, they may also be the most energetic events known to occur.

