Nuclear Ethics: Industrial Perspectives


Powerful undercurrents are in motion that seek to change the way people work with and think about the nuclear industry. The nuclear energy industry is capable of transforming terrestrial life for better or worse. Never has an industry possessed such awesome forces, and never has there been a greater need for an ethics to guide the way it develops. To this end it is useful to review the history of the industry and the highly diverse influences that have produced it. Two influences stand out. One is associated with military policy and focuses on geostrategic decisions related to nuclear war, whether offensive or defensive. The second lies within the civilian arena, includes both nuclear medicine and nuclear power generation, and touches on issues of safety, environmental pollution, and economics. The focus here will be on the civilian aspects of the industry.


The Discovery of Radioactivity

At the end of the nineteenth century scientists were examining the properties of cathode-ray tubes: enclosed glass vessels with two electrodes set into the glass at opposite ends of the chamber. When almost all the air in the chamber had been removed, one electrode (the cathode) was heated, and the other (the anode) was given a positive charge, rays were observed streaming from the cathode. In 1895 in Würzburg, Germany, Wilhelm Conrad Röntgen (1845–1923) noted that a plate coated with barium platinocyanide fluoresced and emitted light when held in front of a functioning cathode-ray tube. What was more, when he placed a light-opaque material between the plate and the tube, the fluorescence did not cease. Clearly the rays coming from the tube, which he called "X rays," had passed through an opaque material, something that visible light rays could not do.

The next year in Paris, Antoine-Henri Becquerel (1852–1908) noted that certain minerals fluoresced when they were exposed to ultraviolet light and that they were capable of fogging an adjacent photographic plate even when that plate was covered by a double layer of light-opaque paper. One such mineral was a crystal of uranyl potassium sulfate. He later showed that the effect was largely due to the metal component, uranium. While most of the interest at the time focused on the X rays, Marie Curie (1867–1934) and Pierre Curie (1859–1906) showed that other elements were capable of emitting penetrating radiations and in the process discovered the elements radium and polonium.

Becquerel, however, made one further vital discovery. After carrying a sample vial of the Curies' radium in his vest pocket, he noted some time later that the skin under the pocket had become burned. He had thus discovered the biological effects of radiation, a phenomenon that was soon put to medical use for a wide variety of ailments, although most such treatments led to a worsening of the condition being treated. (Both the Curies and Becquerel received Nobel Prizes for their discoveries; Marie Curie became the first person to receive two such prizes, for her discoveries in the chemistry of radioactive elements.)

Types of Radiation

From these beginnings it became clear that the radiation could be divided into several distinct types. X rays and later γ-rays (gamma rays) were shown to behave like light rays, being part of the electromagnetic spectrum, whereas β-rays (beta rays) were shown to be streams of negatively charged electrons and α-rays (alpha rays) to be helium atoms stripped of their electrons (that is, helium nuclei consisting of two protons and two neutrons). Each of these radiations can be made to generate a point flash of light for each energetic emission, which allows individual emissions to be counted. On this basis it has been observed that each gram of radium produces some 3.7 × 10¹⁰ emissions per second, a quantity defined as 1 curie of radioactivity and used as a baseline unit. By comparison, all humans are exposed to cosmic rays and to radioactivity from the rocks and gases of the earth at a level that varies from about 20 detectable emissions per second up to about 200 in particular areas of such countries as India, Iran, and Brazil. In terms of dose, such radiations give normal background levels of 3–600 millisieverts (mSv) per year.
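
The baseline figure quoted above can be checked against the radioactive decay law, A = λN, where λ = ln 2 / t½. A minimal sketch in Python, assuming the standard half-life of radium-226 (about 1,600 years, a value not given in the text):

```python
import math

AVOGADRO = 6.022e23                              # atoms per mole
RA226_MOLAR_MASS = 226.0                         # grams per mole
RA226_HALF_LIFE_S = 1600 * 365.25 * 24 * 3600    # ~1,600 years, in seconds (assumed)

# Decay constant: lambda = ln(2) / half-life
decay_constant = math.log(2) / RA226_HALF_LIFE_S

# Number of Ra-226 atoms in one gram
atoms_per_gram = AVOGADRO / RA226_MOLAR_MASS

# Activity A = lambda * N, in decays per second (becquerels)
activity_bq = decay_constant * atoms_per_gram
print(f"1 g of Ra-226: {activity_bq:.2e} Bq")    # ~3.7e10 Bq
```

The result, roughly 3.7 × 10¹⁰ decays per second, is the historical definition of the curie.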

To gain a sense of the properties of such radiations, it is useful to note the following (some of these figures are put on a common footing in the sketch after the list):


  • In terms of emissions, exposure to X rays, β-rays, or γ-rays is less damaging than the equivalent amount of α-rays by a factor of about 20.
  • Exposure to 10 sieverts (Sv) in one day is normally lethal to one human.
  • Exposure to 10 Sv over one year would have a chronic effect on one human, such as cancer.
  • Workers or sailors involved in the nuclear industry or the nuclear-powered navy are allowed to be exposed to 2.2 mSv/day.
  • The 541 atmospheric tests of nuclear weapons set off between 1945 and 1980, which exploded the equivalent of 440 megatons of TNT, have increased the normal background radiation by 0.04 mSv/year.
  • The additional radiation from all the world's nuclear power stations amounts to 0.002 mSv/year.
  • A medical or dental X ray delivers, in seconds, 0.4 to 10 mSv.
  • A modern CAT scan exposes a person to some 10 mSv.
  • To achieve a biological effect, the amount of radiation delivered has to exceed a certain "threshold" level.
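
Some of the figures above can be put on a common footing by expressing one-off doses as multiples of the annual background dose. A minimal sketch, using only numbers quoted in this article (taking 3 mSv/year, the lower end of the background range given earlier):

```python
# One-off doses quoted in the list above, in millisieverts (mSv)
background_msv_per_year = 3.0  # lower end of the background range quoted earlier

one_off_doses_msv = {
    "dental/medical X ray (upper end)": 10,
    "modern CAT scan": 10,
    "acutely lethal whole-body dose (10 Sv)": 10_000,
}

for label, dose in one_off_doses_msv.items():
    years = dose / background_msv_per_year
    print(f"{label}: {dose} mSv, roughly {years:,.1f} years of background")
```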

Radioactivity in the Laboratory and Medicine

The civilian nuclear industry has, as a by-product, made available many radioactive materials that find uses in laboratories and medical diagnostic facilities. Isotopes such as tritium (³H, hydrogen with one proton and two neutrons), carbon-14, sulfur-35, and phosphorus-32 are all β-particle emitters, while iodine-131 emits γ-rays as well as β-particles. People who work with chemical compounds containing such isotopes need not be unduly worried about the effects of radioactivity on their persons, because these β-rays travel only a few millimeters and do not penetrate the walls of glass tubes or containers. By contrast, 3 million γ-rays and 250,000 β-particles emanating from natural sources pass through an individual human every minute.
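
One practical consequence is that the activity of these isotopes dies away on very different timescales, which governs how long stocks remain usable and how long waste must be stored. A minimal sketch of the decay law; the half-lives used are standard values assumed here, not figures from the text:

```python
import math

# Approximate half-lives in days (assumed standard values, not from the text)
half_lives_days = {
    "H-3 (tritium)": 12.3 * 365.25,
    "C-14": 5730 * 365.25,
    "S-35": 87.4,
    "P-32": 14.3,
    "I-131": 8.0,
}

def fraction_remaining(half_life_days: float, elapsed_days: float) -> float:
    """Radioactive decay law: N(t)/N0 = exp(-ln(2) * t / t_half)."""
    return math.exp(-math.log(2) * elapsed_days / half_life_days)

# Fraction of the initial activity left after one month on the shelf
for isotope, t_half in half_lives_days.items():
    print(f"{isotope:15s} {fraction_remaining(t_half, 30):6.1%} left after 30 days")
```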

These radioactive isotopes have enabled scientists to map the routes taken during the chemical transformation of food materials into cellular components and wastes, and they have unlocked the mysteries surrounding photosynthesis, the process on which advanced life depends. In the medical area, the use of X rays for diagnosis is widespread, and the use of radioactive iodine in immunoassays for the detection of microgram quantities of material per milliliter of sample is a powerful tool for measuring hormones and other metabolites of interest in medical and veterinary applications. A more recent use of radioactive isotopes has been in assay systems for determining the sequence of the bases in molecules of nucleic acids. Such assays have been used to acquire knowledge of the full sequence of the human genome and to identify particular genes that cause inherited defects.

Most ethical debate on the use of genetic engineering techniques for the correction of single-gene disorders (typically cystic fibrosis, or immune disorders caused by a faulty enzyme, adenosine deaminase) has taken the view that such efforts are worthy and should be encouraged. It is also held, however, that only the body's somatic cells should be affected and that efforts to correct the defect in gametes should not be allowed. When it comes to the use of genetic engineering to effect enhancements of individuals (eye, hair, and skin color, intelligence, musical and athletic abilities, etc.), ethical arguments are adduced to prevent such efforts, although the growth hormone gene may be applied to correct a pathological condition, dwarfism, but not to produce basketball players.

Cancer treatments based on radiation (X rays, γ-rays, and β-rays) are many and varied. Whole-body radiation of 10 Sv (10,000 times the annual background exposure) will cause the development of bone marrow to cease. Cancer treatment is based on the need to kill cells whose replication control mechanisms have become ineffective. There is, however, the risk of killing other (collateral) cells and also of causing a new cancer by damaging nucleic acid molecules (genes) in neighboring tissues. The basis of successful therapy is therefore to engineer treatments that maximize the therapeutic effect while minimizing the chance of coincident damage.
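
One standard way to formalize this trade-off, though the article does not spell it out, is the linear-quadratic model of cell survival; the parameters α and β below are tissue-dependent constants assumed by that model, not values from the text:

```latex
% Linear-quadratic model: surviving fraction S of cells after a dose D,
% with tissue-dependent parameters \alpha and \beta (assumptions of the model)
S(D) = e^{-\alpha D - \beta D^{2}}

% Delivering the same total dose D in n well-separated fractions of D/n
% reduces the quadratic (repairable) term by a factor of n:
S_{\mathrm{fractionated}}(D, n)
  = \left( e^{-\alpha D/n \, - \, \beta (D/n)^{2}} \right)^{n}
  = e^{-\alpha D - \beta D^{2}/n}
```

Fractionated treatment schedules exploit this effect: tissues that repair well between sessions are spared relative to the tumor even though both receive the same total dose.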


From Nuclear Energy to Electrical Power via the Atomic Bomb

The route from radiation to the atomic bomb began with the demonstration of the fission of atomic nuclei in 1938 by the German chemists Otto Hahn and Fritz Strassmann, followed by the separate investigations of Niels Bohr and Enrico Fermi into the fission of uranium nuclei. The work of all four led to an understanding of the crucial role of the uranium-235 isotope, as opposed to the more abundant form of the element, uranium-238. The separation of these isotopes occupied the scientific and engineering acumen of many in both the United Kingdom and the United States.


In 1941 the work done in the United Kingdom persuaded Vannevar Bush in the United States to authorize the construction of a subcritical experimental nuclear reactor, or "pile." President Franklin D. Roosevelt backed the program in October of that year. In April of the next year Fermi relocated to the University of Chicago, where he built a larger pile in the Stagg Field squash courts; this reactor achieved the first self-sustaining chain reaction in December 1942, and calculations regarding the amount of material that would be needed to make a bomb were set in motion. Drawing on the understanding and skills of tens of thousands of scientists and engineers, who were given an effectively unlimited budget, the outlines of an atomic bomb emerged. In January 1945, after much empirical experimentation and theoretical calculation, the scientists concluded that some 10 kilograms of plutonium or 40 kilograms of uranium-235 would be the minimum amounts of material necessary to set off an atomic explosion. The first such explosion took place on July 16, 1945, at the Alamogordo bombing range in New Mexico; the second was over the city of Hiroshima, Japan, twenty-one days later. In 1952 the first deuterium- (²H-) based fusion bomb, in which light nuclei (here isotopes of hydrogen) fuse to form nuclei of higher atomic weight such as helium, was exploded at Enewetak Atoll in the Pacific Ocean, releasing power 100 times greater than that of the fission bombs: the equivalent of some 10 million tons of TNT.

Now that the genie was out of the bottle, the way was open for both the peaceful and the military use of nuclear energy by any country that could afford the time, expertise, and money. An early use of a nuclear reactor as a practical power source occurred onboard a submarine, the USS Nautilus, completed in January 1954. As of 2005 there were over 150 ships (mainly submarines, aircraft carriers, and icebreakers) powered by more than 220 small nuclear reactors.

Land-based nuclear reactors designed to generate usable power in the form of electricity had the dual function of also producing plutonium through the nuclear reactions that occur as the fissile uranium generates heat. The uranium provided the electricity for national power grids, while the plutonium was added to the stock of material available for the production of bombs. The first station with this dual function was built in the United Kingdom at Calder Hall; it went commercial in October 1956. Since then, some 440 commercial nuclear power reactors and 284 research reactors have been built. They operate in 56 countries and supply some 16 percent of the world's total electricity base load. In Lithuania and France over 70 percent of the electricity supply is derived from nuclear reactors.


Assessment

Despite the large number of facilities that contain a nuclear reactor, the number of casualties that have resulted is relatively small. From the late 1950s to the early 2000s, casualties directly associated with nuclear reactors numbered fewer than fifty, far fewer than the fatalities caused by other methods of generating electrical energy over the same period. There have been six serious events in which radioactivity has spilled over into the environment, the most damaging being the Chernobyl explosion in 1986 near the city of Kiev in Ukraine (then part of the USSR). Thirty-one people died in the immediate aftermath, and some 1,800 children in the affected region subsequently developed thyroid cancer. Several hundred thousand people were evacuated, and 10,000 square kilometers of land were designated as unfit for use. There was no evidence of other radiation-induced illnesses in the local population, which began moving back into the vacated area in the late 1990s.

There have been many studies examining the relationship between the incidence of leukemia and other cancers and proximity to a power-generating nuclear reactor. Thorough examination of the data leads to the conclusion that, although some radioactive material may from time to time have leaked from such establishments, there has been no noticeable and definitive increase in cases of cancer in the vicinity of such power stations.

Nevertheless, because nuclear reactors are associated with bombs, the fear of this technology has been disproportionate to its actual lethality. Paul Slovic's book on the perception of risk (2000) provides data showing that while nuclear energy is perceived as generating the greatest risk, the actual risk that a person who lives within five miles of a nuclear reactor for fifty years will die in a reactor-related accident is less than one in a million (a risk equivalent to that of smoking 1.4 cigarettes). Additionally, much has been made of the costs and dangers of decommissioning nuclear power reactors, of handling the radioactive materials from that operation, and of dealing with the waste from the processing of spent fuel rods. The technology of radioactive waste storage has progressed, yet it remains necessary each year to remove from circulation relatively small quantities (several tons) of highly radioactive material that retains its radioactivity for tens of thousands of years or longer. Were such material buried, as has been suggested, there would remain a danger that the containers might rupture, allowing radioactive material to seep into the local groundwater. Nevertheless, sites for the indefinite storage of such materials, held in a glass matrix within metal containers, may be found in deep abandoned mines located in geologically stable areas.

The real terrors of the nuclear industry lie in the area of bombs, a complex issue in itself. On the one hand, the end of the cold war (1945–1989) led to an overall decrease in the total number of nuclear weapons and to agreements concerning the disposition of the remainder. On the other hand, China, India, Pakistan, and other countries have developed their own nuclear weapon capabilities. The expansion of trade may, at least in some instances, promote nonbelligerent conditions. And regardless of the connection between the nuclear power industry and nuclear weapons, one day oil and gas supplies will run out, and energy will still be needed.

At that time both the worldwide population and its average rate of energy consumption are likely to have increased considerably. Although the energy of winds, rivers, tides, waves, and solar photons is likely to be increasingly captured and converted to distributed electrical power, it is unlikely that such supplies will satisfy human needs. The nuclear power option will therefore increase in importance as conventional sources of energy are used up. It would be prudent to create the conditions for such an eventuality while the opportunity still exists to experiment without the pressure of urgent need.

If in fact humanity turns to the nuclear power option, the issue of safety will need to be addressed. Modern societies have developed extensive systems of rules and regulations to protect the health and safety of those working with dangerous procedures, chemicals, or physical conditions. It may be expected that a parallel suite of regulations already in use in the nuclear industry will be extended and refined for a future, enlarged nuclear industry.

A related issue is that of global warming (or climate change). It is widely believed that the anthropogenic (human-caused) production of carbon dioxide is at least partly responsible for the increase in temperatures that has been observed around the planet, and that this derives largely from the combustion of fossil fuels (coal, methane gas, and oil) for generating electricity and powering vehicles. One approach to mitigating further increases in carbon dioxide, proposed by James Lovelock, the initiator of the Gaia hypothesis, among others, is to use more nuclear reactors for the production of electricity. This electricity could in turn be used to generate hydrogen by the electrolysis of water, providing fuel for vehicles fitted with hydrogen-based fuel cells that generate electricity for onboard motors. This approach does not add to the carbon dioxide in the atmosphere and is safe, clean, and cost-effective; it is possible to obtain some 2.5 million times more energy from a gram of uranium than from the same amount of coal. A nuclear power program could be used in conjunction with other environmentally friendly approaches to energy generation, including wind, wave, biomass, and solar power.
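
The quoted factor of some 2.5 million can be checked from first principles. A worked sketch, assuming the standard figures of about 200 MeV released per fission and a heating value for coal of about 30 MJ/kg (neither number is given in the text):

```latex
% Energy from fissioning one gram of U-235, at ~200 MeV per fission:
E_{\mathrm{U}} \approx \frac{6.022 \times 10^{23}}{235}\,\mathrm{g^{-1}}
    \times 200\,\mathrm{MeV} \times 1.602 \times 10^{-13}\,\mathrm{J/MeV}
    \approx 8.2 \times 10^{10}\ \mathrm{J/g}

% Energy from burning one gram of coal, at ~30 MJ/kg:
E_{\mathrm{coal}} \approx 3 \times 10^{4}\ \mathrm{J/g}

% Ratio of the two energy densities:
E_{\mathrm{U}} / E_{\mathrm{coal}} \approx 2.7 \times 10^{6}
```

The result, a few million, is consistent with the factor of 2.5 million quoted above.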


Conclusion

The history of the development of the nuclear industry provides a paradigm of the emergence of a powerful technology from the observation of natural phenomena at the level of the individual scientist. At each stage the emerging knowledge, coupled with the development of techniques and equipment, brought humanity to a more reliable understanding of the way nature worked and of how humans operate. When the survival of the nation-state was threatened as never before (after the devastating attack on Pearl Harbor, Hawaii, on December 7, 1941), America poured unlimited resources into the building of the atomic bomb. Could the scientists and engineers have decided not to develop atomic weapons at that time, on the grounds that demonstrating the capability to build such weapons could jeopardize the future survival of humanity? The question remains how humanity would respond to a similar challenge if it occurred again. In the end, humans have acquired awesome capabilities. It is perhaps thanks to the ethical strictures that humans have built up over the ages that, for the most part, the use of this new and powerful technology has been restrained to beneficial ends. Such ethics are predicated on the bending of all human efforts toward enhancing the survival of humans on this planet, and they are perhaps encompassed in the following ethical statement by Hans Jonas: "Act so that the effects of your action are compatible with the permanence of genuine human life" (Jonas 1984, p. 11).

It might also be noted that prominent scientists (Albert Einstein and Robert Oppenheimer in particular), having surveyed the results of decisions made in the heat of wartime, later regretted their enthusiasm for the project on which they had worked so hard. Such retroactive evaluations may serve as a teaching device, but they do not solve the problems that humans face in the early twenty-first century.

Energy released from nuclear reactions has the potential to provide almost unlimited amounts of virtually clean power into the indefinite future. It may also power spaceships, enable humans to colonize other planets of the solar system, and help treat medical pathologies. If it ever becomes feasible to harness fusion power, as demonstrated in the hydrogen bomb, issues of power generation would no longer distract humanity from efforts to enhance the personal and social lives of all human beings. Yet, as with all the tools developed by humankind over the last 2.5 million years, it must be recognized that nuclear energy may be used to cause harm as well as to provide benefits. Humanity's efforts, therefore, have to be directed at developing and practicing those ethics and morals that prevent harmful uses while enabling and encouraging beneficial deployments. The future of the human species depends upon the success of this endeavor.


RAYMOND E. SPIER

SEE ALSO Chernobyl; Nuclear Waste; Three-Mile Island.

BIBLIOGRAPHY

Jonas, Hans. (1984). The Imperative of Responsibility: In Search of an Ethics for the Technological Age, trans. Hans Jonas and David Herr. Chicago: University of Chicago Press.

Slovic, Paul. (2000). The Perception of Risk. London: Earthscan Publications.

Spier, Raymond E. (2001). Ethics, Tools, and the Engineer. Boca Raton, FL: CRC Press.

Spier, Raymond E. (2002). "Ethical Issues Engendered by Engineering with Atomic Nuclei." In Science and Technology Ethics, ed. Raymond E. Spier. London: Routledge.

INTERNET RESOURCES

Lovelock, James. "Nuclear Power Is the Only Green Solution." Available from http://www.perfect.co.uk/2004/05/james-lovelock-nuclear-power-is-the-only-green-solution. Lovelock's comment on the need for the activation of the nuclear power option.

Uranium Information Centre. "Nuclear Power in the World Today." Available from http://www.uic.com.au/nip07.htm. Provides information on commercial nuclear reactors that provide electricity to national grids.

Washington and Lee University. "Alsos Digital Library for Nuclear Issues." Available from http://alsos.wlu.edu. Provides a wide range of annotated references for the study of nuclear issues.
