Physics: Radioactivity


Introduction

Radioactivity is the spontaneous breakup of the nuclei of unstable atoms, which releases radiation in the form of fast-moving particles or high-energy electromagnetic waves (gamma rays). Since the discovery of x rays in 1895 and of radioactivity in 1896, radiation from radioactive substances and other sources has been used for medical, military, and technological purposes. Radioactive materials are used to image the inside of the human body, to treat cancer, to fuel nuclear weapons and nuclear power plants, to trace chemical reactions and drug metabolism, and to determine the ages of ancient organic materials and of Earth itself (about 4.5 billion years). Radiation, regardless of its source, has harmful health effects (though in medical applications these may be outweighed by positive health effects): at low doses it may cause cancer or heritable mutations, and at high doses it sickens or kills through direct damage to body chemistry. This article considers both radioactive elements and radiation from other sources, such as x rays.

Historical Background and Scientific Foundations

The Discovery of X Rays

The study of radioactivity began with the accidental discovery of x rays by German physicist Wilhelm Conrad Röntgen (1845–1923) in 1895. For decades, physicists had experimented with current flow between electrodes (charged pieces of metal) inside partially evacuated glass tubes (cathode ray tubes, named for the rays emitted from their negatively charged electrodes, cathodes). Röntgen wanted to investigate the cathode rays emitted when the pressure in the glass tube was very low. Much to his surprise, he discovered during an experiment with a cardboard-shrouded tube that a fluorescent-coated screen used to detect cathode rays, lying on the other side of his laboratory, began to glow. The screen was so far away that Röntgen doubted the fluorescence was caused by cathode rays (now known to be electrons in flight). Instead, he suspected that it was caused by a new kind of ray.

As he investigated further, Röntgen discovered that the mysterious new rays penetrated most materials, but to different degrees. Sheets of paper were transparent to them; so were a thousand-page book and a piece of wood; aluminum, less so. The rays also blackened unexposed light-sensitive photographic plates, but thin plates of lead stopped them completely. Flesh was mostly transparent to the rays, bones mostly opaque. In his first communication about the rays, Röntgen included various photographs, including one showing the bones in his wife's hand. Röntgen called the new rays “X-rays.” It took the research of many physicists over the next two decades to verify that x rays are electromagnetic waves of high energy and short wavelength—between 0.01 and 10 nanometers (nm, a billionth of a meter).

Scientists around the world immediately began to explore Röntgen's startling discovery. In February 1896, only two months after Röntgen published his first findings, research was so intense that the prestigious journal Nature had to declare that it could not keep pace with all the communications it was receiving on the topic.

X Rays in Medicine

Röntgen's photograph of the bones in a living human hand was splashed across the front pages of newspapers around the world, creating a public sensation. The medical profession quickly recognized x rays' potential. Because Röntgen's photograph depicted a hand clearly wearing a ring on the fourth finger, doctors saw the potential for locating metal objects (such as bullets or shrapnel fragments) in wounds; x rays were first used for this purpose within a few weeks of their discovery. Industrial and commercial applications were also quickly found. The shoe-fitting fluoroscope, for example, which used x rays to show the bones and soft tissues of a foot inside a shoe, was widely used in America, Europe, and Australia from the early 1920s until the end of the 1950s. This was an early example of the tragic misapplication of a poorly understood technology: shoe-fitting x-ray machines exposed many thousands of people to cancer-causing x rays at doses that would today be considered dangerously high, while providing no medical benefit. Indeed, they did not even improve service in shoe stores significantly, but served primarily as a gimmick to impress customers. Children using the machines were exposed in a few seconds to as much radiation as a present-day industrial worker is allowed to receive in a year—and children are more vulnerable to radiation than adults. Moreover, the machines remained in use long after the harmful effects of radiation began to be understood, not being phased out in the United Kingdom until the mid-1970s.

Doctors soon discovered that when x rays were aimed at the human body, the skin often reddened in response, as if it had been sunburned. Since this was the same effect produced by ultraviolet light, which was used to treat various skin conditions, they assumed that x rays would have the same beneficial effects. X ray treatments were quickly introduced. When hair loss was noticed after treatment with radiation, many physicians began to use x rays as a depilatory (hair-removing) treatment. Doctors also experimented with x ray therapy on cancer patients, and early reports of successful healing led to a veritable boom in x ray therapy in the first years of the twentieth century. Unfortunately, the indiscriminate use of high-dose x rays probably caused far more cancer than it cured during this period.

X Rays' Harmful Effects

X rays' harmful effects quickly became apparent, however. At first the burns, rashes, dermatitis, and ulceration associated with x ray use were ascribed to apparatus malfunction, but Elihu Thomson (1853–1937), an engineer working for an x-ray machine manufacturer, doubted this. To prove that the x rays, and not his products, were responsible, he conducted a series of experiments on himself. By irradiating one of his fingers, he showed that x rays could produce severe, painful burns. He concluded that exposure to x rays beyond a certain limit would cause harm and warned his colleagues not to prolong exposure. U.S. inventor Thomas Edison (1847–1931) abandoned x ray research in 1903 after one of his assistants developed fatal cancer in the hands he had routinely used to test x ray tubes. By experimenting on guinea pigs, other scientists showed that x rays could blind, burn, cause miscarriages, or even kill; in light of these findings, shields began to be used to protect both patients and x ray operators. The growing recognition of x rays' harmful effects showed the need for safety guidelines.

The Discovery of Radioactive Elements

When Röntgen's discovery was first discussed at the French Academy of Sciences, members suggested that since the tubes emitting x rays were fluorescent, other fluorescent bodies might also emit the new rays. To test this hypothesis, the French physicist Henri Becquerel (1852–1908) conducted an experiment in which he placed sheets of uranium salt on a photographic plate that was wrapped in heavy black paper. This was placed in the sun for several hours, until the uranium salt became fluorescent. Upon developing the photographic plate afterward, Becquerel discovered silhouettes of the mineral crystals and concluded that the fluorescent uranium salt emitted radiation that penetrated paper.

In the days following his initial experiment the sun appeared only intermittently. While waiting for better weather, he placed his wrapped photographic plates and uranium crystals in a drawer. After a few days he developed the photographic plates, expecting to find only very weak images. But much to his surprise the images were quite intense. After experimenting with various uranium salts and other fluorescent minerals for several weeks, he concluded that the rays were emitted by the uranium. We now know that it was merely a coincidence that uranium salts emit visible light rays (fluoresce) under some conditions as well as emitting penetrating radiation: The fluorescence has nothing to do with the penetrating radiation.

Expanding on Becquerel's work, Marie Curie (1867–1934) discovered in 1898 that thorium also emitted penetrating rays like those from uranium. Curie and her husband, Pierre Curie (1859–1906), also found that the mineral pitchblende was much more radioactive than its uranium content would indicate. They discovered that it contained another element even more radioactive than uranium, which they named polonium in honor of Poland, Marie Curie's native country. A few months later they found that pitchblende contained yet another highly radioactive element, a substance they named radium. Marie Curie eventually died from a blood disease almost certainly caused by radiation exposure. In 1899, the French physicist and chemist André-Louis Debierne (1874–1949) discovered a third radioactive substance, actinium.

Early Investigations

HENRI BECQUEREL (1852–1908)

Henri Becquerel (1852–1908) was born into a scientific family. Both his father, Alexandre Edmond Becquerel (1820–1891), and his grandfather, Antoine César Becquerel (1788–1878), were professors of physics at the Museum of Natural History in Paris. Each also served as president of the French Academy of Sciences.

Henri was educated at the École Polytechnique and the École des Ponts et Chaussées (Bridges and Highways School). Even before graduating he was teaching at the École Polytechnique; after graduation he became a government engineer in the Bridges and Highways Department. He also became an assistant at the Museum of Natural History, dividing his professional life among three institutions.

When the possibility of radiation from fluorescent bodies became a topic of discussion in the French Academy, Henri was 43 years old, settled and established, beyond the dogged pursuit and hard work of basic research. The study of fluorescence in uranium compounds fascinated both him and his father, however. Working with an inherited collection of uranium salts, he immediately took up this new field of investigation.

His discovery that uranium salts emitted penetrating rays did not create the same excitement as Röntgen's discovery. During this time scientists studied numerous kinds of radiation—cathode rays, canal rays, x rays, and others; those discovered by Becquerel did not seem especially important at first. This changed when scientists learned that such rays were emitted not only by uranium but by several other elements as well. In 1903 Becquerel shared the Nobel Prize for physics with Marie and Pierre Curie for the discovery and study of radioactivity. The International System (SI) unit of radioactivity, the becquerel, is named in his honor.

The early study of radioactivity was experimental, focusing on the collection and classification of data as scientists tried to answer many questions: What were these new rays? Were they emitted by all elements, or only by some? Was their activity affected by chemical processes, or by physical changes such as temperature? How did radioactivity fit into the periodic table of the elements?

During his first experiments, Becquerel noticed that the new rays would cause nearby electrically charged materials to lose their electrical charges. Unlike photographic plates, which provided only qualitative measurements, the discharge of an electroscope could quantify the radiation's intensity. He discovered that although the rays could penetrate paper, sheets of aluminum or copper decreased their intensity. In a series of similar absorption experiments, the New Zealand-born British physicist Ernest Rutherford (1871–1937) showed that the rays emitted from uranium contained at least two distinct types of radiation. The first, which was readily absorbed, he termed alpha radiation; the other, more penetrating type he termed beta radiation. In 1900, the French physicist Paul Villard (1860–1934) found a third type that was even more penetrating than beta radiation and was termed gamma radiation. Alpha (α), beta (β), and gamma (γ) are the first three letters of the Greek alphabet and are often used in science to denote quantities of interest.

Rutherford conducted a series of experiments in which he measured the decay rate of various radioactive substances. He found that they all decayed according to an exponential law, I = I₀e^(−λt), where I₀ is the initial intensity and λ is a decay constant characteristic of the substance—that is, the radioactivity of a sample always decreases by the same percentage over equal intervals of time. The amount of time it takes a sample's radioactivity to decrease by half (50%) Rutherford dubbed the “half-life” (after one half-life, half of the sample's radioactive atoms have decayed). He also found that different substances have different half-lives, ranging from many thousands of years to a few seconds. These differences could be used to distinguish radioactive substances.
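
The decay law lends itself to a short worked example. The sketch below (Python) converts a half-life into the decay constant λ = ln 2 ÷ half-life and evaluates the surviving fraction I/I₀ after a given time. It is only an illustration; the 1,600-year half-life used for radium-226 is a standard reference value rather than a figure from this article.

```python
import math

def decay_constant(half_life):
    """Decay constant (lambda) from a half-life, in the same time units."""
    return math.log(2) / half_life

def remaining_fraction(t, half_life):
    """Fraction of the original activity left after time t: I/I0 = e^(-lambda*t)."""
    return math.exp(-decay_constant(half_life) * t)

# Illustrative value: radium-226 has a half-life of about 1,600 years.
half_life_years = 1600.0
for t in (1600, 3200, 8000):
    print(f"after {t:5d} years: {remaining_fraction(t, half_life_years):.3f} of the activity remains")
# After one half-life 0.500 remains, after two 0.250, after five about 0.031.
```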

Discovering Transmutation

In 1900, Rutherford discovered that the element thorium, in addition to being radioactive, also gave off a highly radioactive substance that drifted about the laboratory like a gas. Not knowing what it was, Rutherford called it “thorium emanation.” With the help of a young British chemist and future Nobel laureate, Frederick Soddy (1877–1956), Rutherford hypothesized that thorium was transmuting into a chemically inert gas of the argon family. This hypothesis was strengthened in 1902 when Rutherford showed that the emission of alpha rays changed the elemental identity of the source atom. It now seemed certain that one chemical element could, in fact, transmute into another.

Radiation's various components emerged gradually. By 1902 it had become clear that beta rays are, in fact, fast-moving electrons. In 1905, British physical chemist William Ramsay (1852–1916) and Soddy discovered that helium was produced by the radioactive decay of radium, and in 1908 Rutherford and the German physicist Hans Geiger (1882–1945) concluded that alpha rays consist of helium atoms that have gained a positive charge (i.e., lost their electrons: thus, an alpha particle is a fast-moving helium nucleus). Gamma rays were finally established as the same type of radiation as x rays, only with even higher energy. Despite this progress in establishing experimental data, scientists found it difficult to create an atomic model that could explain radioactive decay.

Harmful and Therapeutic Effects

It soon became clear that radioactive substances and x rays affected the skin in similar ways. In June 1901 Becquerel found that after carrying a tube of radium in his shirt pocket for a couple of hours, he developed a skin burn that took almost two months to heal. Pierre Curie attached a piece of radioactive material to his arm for 10 hours, resulting in a wound that took about four months to heal and left a heavy scar.

IN CONTEXT: N RAYS

As new kinds of radiation were discovered, claims also emerged for rays that turned out not to exist. In 1903 the noted French physicist René-Prosper Blondlot (1849–1930), working at the University of Nancy, France, thought he had discovered a new kind of ray that was emitted not only from discharge tubes, but also from gas burners, metals in states of strain, and from stretched muscles and the human nervous system! He called them n rays, for the University of Nancy.

After his discovery, several other scientists claimed also to have observed n rays from various animal and vegetable substances. These results could not be reproduced, however, and many in the scientific community began to doubt their existence. When American physicist Robert W. Wood (1868–1955) visited Blondlot's lab to investigate, he secretly removed and replaced key parts of the apparatus, which should have made the observations impossible. Blondlot and his colleagues nevertheless still claimed to observe n rays—solely because they expected to. The episode became a classic warning against matching results to expectations. Wood published an account of his visit in the journal Nature in 1904, and n rays were soon recognized as an illusion.

When the doctor who treated Becquerel's burn noticed that it resembled an x-ray burn, he suggested that radium might have therapeutic effects similar to those of x rays. Radium treatments for cancer were quickly introduced, and a variety of applicators were developed to use the element in body cavities where x-ray treatment was difficult.

Popular hope for miraculous effects from radioactivity soared during the 1920s and 1930s. Inhaled radon gas, for example, was thought to have stimulating and restorative powers. So was the drinking of water from decorated urns in which radioactive substances were submerged. Likewise, since it was believed that low levels of radiation would kill germs and stimulate growth, radioactive materials were often used in products such as beauty creams. An unknown but certainly large number of cancers were caused by these useless exposures to radiation.

Induced Radioactivity

A new realm opened up in the years prior to World War II (1939–1945) when researchers discovered how to induce radioactivity in heavy elements by bombarding them with neutrons. Within just a few years, research on induced radioactivity led to the discovery of nuclear fission, quickly followed by speculation about how the energy released during fission could be put to use.

When research on induced radioactivity began, however, no one considered the possibility that a nucleus could be split. In 1934 the husband-and-wife team of Irène Joliot-Curie (1897–1956) and Frédéric Joliot-Curie (1900–1958) discovered that when light elements were bombarded with alpha particles, they transmuted into isotopes of known elements; unlike the familiar, stable isotopes, however, these new isotopes were radioactive.

IN CONTEXT: THE CURIE FAMILY

Marie Curie (1867–1934) was born Maria Sklodowska in Poland. Her father taught mathematics and physics in a secondary school, and her mother, who died when Marie was 11, managed a boarding school for girls. At 18, Marie worked as a governess to support her younger sister Bronia's medical education in France. In 1891 Marie moved to Paris to study physics and mathematics at the Sorbonne. She graduated in 1894, and in the same year she met Pierre Curie (1859–1906), then completing his doctorate in physics. They married in 1895 and had two daughters, Irène (1897–1956) and Ève. Pierre was appointed professor of physics at the Sorbonne in 1904, and when he died two years later, the chair was bestowed on Marie, who became the first woman to teach at the Sorbonne.

With Henri Becquerel (1852–1908), Marie and Pierre Curie shared the 1903 Nobel Prize for physics for the discovery and study of radioactivity. Marie Curie received a second Nobel Prize, for chemistry, in 1911 for her discovery of the elements radium and polonium. Her daughter Irène Joliot-Curie and son-in-law Frédéric Joliot-Curie (1900–1958) won the 1935 Nobel Prize for chemistry for their synthesis of new radioactive elements. Pierre Curie died in 1906 in a traffic accident; Marie died in 1934 of aplastic anemia, a disease almost certainly caused by her exposure to radiation.

Because of alpha particles' positive charge, which causes them to be repelled by atomic nuclei (all of which are positively charged, the heavier the nucleus the greater the charge), the Joliot-Curies were able to induce radioactivity only in relatively light elements. With the discovery of the electrically neutral neutron, however, Italian physicist Enrico Fermi (1901–1954) hypothesized that neutron bombardment might activate heavy elements as well. To test this hypothesis, Fermi and his collaborators began a series of experiments in which various elements were bombarded with neutrons. They found that for a large number of elements across the range of atomic weights, neutron bombardment produced unstable isotopes that disintegrated through the emission of beta particles.

Creating Elements Heavier than Uranium

In beta decay, a neutron in the parent nucleus transforms into a proton while the nucleus emits an electron and an antineutrino. The daughter nucleus therefore has essentially the same mass as the parent but a higher atomic number. (The atomic number of an element is the number of protons in its nucleus.) During their neutron-bombardment experiments, Fermi's team turned their attention to heavy nuclei, especially uranium, which was element number 92 and the last element in the periodic table as it was then known. They wondered whether new elements with higher atomic numbers than uranium could be produced. This question opened the possibility that the list of chemical elements was not exhaustive—that new elements could be produced artificially.
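
The bookkeeping behind Fermi's reasoning can be sketched in a few lines of code. The example below is illustrative only: the uranium-238 starting values and the neutron-capture step are assumptions chosen to show how beta decay raises the atomic number by one while leaving the mass number unchanged, which is exactly the kind of step that would carry a nucleus past element 92.

```python
# Beta-minus decay: a neutron becomes a proton, so the mass number A is
# unchanged while the atomic number Z increases by one.
def beta_minus_decay(z, a):
    return z + 1, a

# Sketch of the reasoning: if uranium (Z = 92) captures a neutron and the
# product then beta-decays, the daughter would be element 93 -- a
# "transuranium" element beyond the then-known end of the periodic table.
z, a = 92, 238           # uranium-238, used here only as an illustration
z, a = z, a + 1          # neutron capture adds one to the mass number
z, a = beta_minus_decay(z, a)
print(f"daughter nucleus: Z = {z}, A = {a}")   # Z = 93, A = 239
```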

Fermi's team bombarded uranium with neutrons and identified several beta-emitting products, one of which appeared distinct from all known elements heavier than lead. They concluded that it might be a new element with a higher atomic number than uranium, lying beyond it in the periodic table; Fermi's team called it a “transuranium” element.

This was a daring hypothesis, since the last known gaps in the periodic table were just being filled. Many scientists accepted Fermi's contention that this was a new element, but the German chemist Ida Noddack (1896–1978) objected, saying that no conclusion could be drawn until the substance had been compared to all known elements, not only to those between lead and uranium. Fermi's team had not done this because they expected that parent and daughter nuclei would be close to each other in the periodic table. Noddack argued, however, that since nothing was known about neutron-induced transmutations, it was possible that the nucleus would simply fragment into much lighter elements than the original uranium. Her objections received scant attention, partly because her idea of the nucleus breaking apart seemed merely speculative and partly because she had made controversial claims of her own that had undermined her scientific credibility. However, on this point her guesses were approaching an important truth: atomic nuclei can fission (split apart).

Fermi's 1934 announcement piqued the interest of other scientists. The Joliot-Curies began similar experiments in Paris. In Berlin the physicist Lise Meitner (1878–1968) and the chemists Otto Hahn (1879–1968) and Fritz Strassmann (1902–1980) competed to find new transuranic elements. Both groups conducted numerous experiments, and each produced results that were interpreted as transuranic, but most were complex products that required a variety of new hypotheses to be understood.

In two papers published in 1938, the Paris group claimed to have produced yet another transuranic element by neutron bombardment of uranium. They had difficulty identifying it, however, as it behaved more like a lanthanide (rare earth element) than any of the transuranics. The Berlin team immediately began to test the French results, but Meitner's participation was compromised when she had to flee Nazi Germany. Meitner was Jewish and had been protected up to that point by her Austrian citizenship. After Germany annexed Austria in 1938, however, she was subject to persecution by the Nuremberg Laws.

The Discovery of Nuclear Fission

Hahn and Strassmann stayed in Berlin, but corresponded with Meitner about their ongoing research. They found that the element produced by the Paris group was a mixture of several isotopes. In a famous experiment conducted on December 17, 1938, they were able to confirm the presence of barium, a much lighter element than uranium. In a series of letters to Meitner, Hahn wrote that although it seemed impossible according to the laws of physics, as a chemist he had to conclude that their results indicated that the uranium nucleus had been divided. He asked Meitner whether she could find any other explanation.

Over Christmas, Meitner discussed Hahn and Strassmann's results with her nephew, the physicist Otto Frisch (1904–1979). Together they deduced that violent oscillations of the nucleus could indeed split it; within weeks of Hahn and Strassmann's experiment, they published their explanation, naming the process nuclear fission.

This appeared to invalidate all previous findings of transuranic elements. As they continued to correspond in January and February 1939, Hahn was reluctant to relinquish the possibility of transuranic elements; Meitner, on the other hand, wanted to reinterpret all previous results in light of the new discovery. Meitner's logic prevailed, and in the course of 1939 they retracted their earlier claims.

ENRICO FERMI (1901–1954)

Enrico Fermi (1901–1954) was born in Rome. His aptitude for mathematics and physics, recognized and encouraged even in grammar school, earned him a university scholarship. After earning his doctorate in physics from the University of Pisa in 1922, he went first to Göttingen to work with Max Born (1882–1970) and later to Leiden, in the Netherlands, to work with Paul Ehrenfest (1880–1933).

In 1927, Fermi became professor of theoretical physics at the University of Rome. When the fascist Mussolini government enacted anti-Jewish laws similar to Germany's Nuremberg Laws, Fermi, his Jewish wife Laura, and their two children emigrated to the United States in 1938. There Fermi joined the group of immigrant physicists, including Albert Einstein (1879–1955), who urged the Roosevelt administration to begin developing an atomic bomb to counter the ominous possibility that Germany would do so first.

Fermi went to Washington to present the idea in more detail to a group of officers. The initial meeting created little interest, but the growing Nazi threat eventually moved even Einstein, antiwar by conviction, to encourage work on a nuclear weapon. When the Manhattan Project was launched in 1942, Fermi became one of the lead physicists on the project.

With the discovery of fission, a number of questions surfaced. Splitting a heavy nucleus into two lighter nuclei releases both neutrons and energy. If the released neutrons could cause further nuclei to split, a continuous chain reaction might occur, releasing an enormous amount of energy. Within a year of the discovery of fission, research papers were discussing how the energy content of atomic nuclei could be made technically useful.

Nuclear Fission in Wartime

These speculations took on a new dimension with the outbreak of World War II (1939–1945), as scientists realized that a nuclear chain reaction could be used in an immensely powerful new weapon. Physicists who had fled from Europe to the United States were so worried about the prospect of a German atomic bomb that they urged that all uranium research be kept secret. From 1940 onward, British and American physicists agreed to stop publishing their work on nuclear energy. They could still submit papers to the journals, but the papers would not be published until it was considered safe.

A number of physicists were so fearful that Germany would develop a nuclear weapon that they urged Albert Einstein (1879–1955) to share their concerns with President Roosevelt in the summer of 1939. In a series of letters between 1939 and 1940, Einstein warned the president that secret German nuclear research had ominous implications and encouraged the United States to act so that Germany could not develop such a bomb first. Although American physicists, chemists, and engineers began to work on uranium chain reactions, not until the Japanese attack on Pearl Harbor in December 1941 did their work expand into the large-scale program needed to design and build an atomic bomb. Roosevelt authorized the massive funds necessary to launch the Manhattan Project in 1942. (Neither Germany nor Japan, as it turned out, ever mounted a serious atomic-bomb program.)

In the spring of 1943, a huge research laboratory was set up for the Manhattan Project at Los Alamos, New Mexico. One problem the team faced was how to assemble a critical mass of fissionable material so that a chain reaction would occur at the desired moment. A critical mass is a properly shaped sample of fissionable material containing enough atoms, close enough together, to sustain a chain reaction: the neutrons released by fissioning nuclei trigger fissions in other nuclei at least as fast as neutrons are lost from the sample. In a bomb, the result is the fissioning of so many nuclei so rapidly as to constitute an explosion.
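
A toy multiplication model (not from the original article) makes the arithmetic of a chain reaction concrete: if each fission triggers, on average, k further fissions, the number of fissions per generation grows geometrically for k greater than 1 and dies away for k less than 1. The sketch below uses made-up values of k purely for illustration.

```python
def fissions_per_generation(k, generations, start=1):
    """Toy model: each fission triggers k further fissions on average."""
    counts = [start]
    for _ in range(generations):
        counts.append(counts[-1] * k)
    return counts

# Subcritical assembly (k < 1): the chain dies out.
print(fissions_per_generation(k=0.9, generations=5))

# Supercritical assembly (k > 1): the chain grows geometrically. With k = 2,
# the 80th generation alone involves about 1.2e24 fissions, which is why the
# energy release is explosive.
print(f"{2.0 ** 80:.2e}")
```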

In the end, two methods of producing a critical mass at the desired moment were pursued. In the gun-type (uranium) bomb, one subcritical mass was shot into another to suddenly produce a critical mass and thus an explosion. In the implosion-type (plutonium) bomb, a spherical subcritical mass was surrounded by a chemical explosive; when detonated, the explosive compressed the plutonium into a critical mass. The United States dropped “Little Boy,” the gun-type uranium bomb, on the Japanese city of Hiroshima on August 6, 1945, killing approximately 140,000 people; it dropped the implosion-type “Fat Man” bomb on the city of Nagasaki on August 9, 1945, killing approximately 74,000 people.

After the end of World War II, nuclear reactors were developed to generate nuclear energy for civilian purposes, especially the generation of electricity. The nuclear power industry grew rapidly in the late 1950s as commercial power plants were built in the United States and other countries, subsidized by governments. Unexpectedly high costs, slower-than-forecast increases in demand for electricity, and (to a lesser extent) concerns about reactor safety and radioactive waste disposal later slowed this growth in the United States and most other countries; some, like Austria and Denmark, prohibited the establishment of nuclear power plants altogether.

Radioactive Elements as Biological Tracers

Irène and Frédéric Joliot-Curie won the 1935 Nobel Prize for their synthesis of new radioactive isotopes of light elements. In his Nobel lecture, Frédéric Joliot-Curie noted that these isotopes could be used to follow physical and chemical processes in the body. The use of radioactive isotopes as tracers was pioneered by the Hungarian-born physicist Georg von Hevesy (1885–1966), who later worked in Sweden. An early, informal demonstration came while he was living in a boarding house: suspicious that leftovers from the lodgers' plates were being recycled at later meals, he deposited a tiny amount of a radioactive isotope on a piece of meat that he left on his plate. The next day he brought an electroscope to the table and found that a supposedly fresh meat dish was indeed radioactive.

By the same token, radioactive isotopes can be used to trace an element's progress through the body and its various physiological processes. With the use of radioactive tracers, Hevesy revealed the principle of metabolic turnover: a substance administered to a living organism will eventually leave the organism again, and this turnover can be expressed by a decaying exponential function with a characteristic biological half-life for each element.

An important part of this discovery was the realization that some elements deposit selectively in different parts of the body, leading to searches for radioactive “magic bullets” to treat cancer and other diseases localized in particular organs. One of the most useful elements for this kind of treatment is iodine, which, when ingested, travels through the bloodstream to the thyroid gland. Patients with hyperthyroidism and some types of thyroid cancer could now be treated by administering a dose of radioactive iodine, which concentrates in the thyroid gland, where its radiation destroys the targeted cells. Most diseases, however, are not treatable by this means.
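
Hevesy's turnover principle has the same mathematical form as radioactive decay, and in practice the two effects act together. The sketch below is a hedged illustration, not a clinical calculation: it uses the standard relation 1/T_eff = 1/T_phys + 1/T_bio for two independent removal processes, the well-known 8-day physical half-life of iodine-131, and an assumed biological half-life of roughly 80 days for iodine in the thyroid.

```python
import math

def effective_half_life(physical, biological):
    """Combine two independent removal processes: 1/T_eff = 1/T_phys + 1/T_bio."""
    return 1.0 / (1.0 / physical + 1.0 / biological)

def fraction_retained(t, half_life):
    """Fraction of the administered substance still present after time t."""
    return math.exp(-math.log(2) * t / half_life)

# Illustrative values only: iodine-131 has a physical half-life of about
# 8 days; the biological half-life of roughly 80 days is an assumption.
t_eff = effective_half_life(physical=8.0, biological=80.0)
print(f"effective half-life: {t_eff:.1f} days")              # about 7.3 days
print(f"retained after 30 days: {fraction_retained(30, t_eff):.3f}")
```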

Modern Cultural Connections

The application of radioactivity to the creation of nuclear weapons transformed the physical and psychological terms of modern life. With the proliferation of nuclear weapons in the 1950s and beyond—over 20,000 still remain in national arsenals, about 95% of them in the United States and Russia—the possibility that most of the human race could be extinguished in a sudden, self-inflicted disaster became real for the first time in history and has remained so. For the early nuclear nations, security seemed to depend inescapably on the possession of many thousands of nuclear weapons, carried at first by airplanes and later mounted on long-range missiles. Thus, since the end of World War II, whole societies have committed their sense of security to the maintenance of nuclear-weapons systems that can kill hundreds of millions or billions of people in a matter of minutes. For half a century, most people have lived with the awareness (sometimes acute, sometimes muted) that not only their own lives but the lives of everyone they know could be ended suddenly, possibly even by accident, without any meaningful prospect of continuation in future generations.

In reaction to this nightmarish situation, there was a strong early eagerness to declare that “the atom” (i.e., radioactive phenomena) had a peaceful, salvific (redeeming) side. Promoters of nuclear power and medical technologies sometimes promised a utopian future in which all problems would be solved by The Atom. In the first few decades after the development of nuclear power and weapons, public enthusiasm for peaceful nuclear energy was high. In the 1970s and beyond, public opinion in the United States and elsewhere became divided, with significant minorities or, in some countries, large majorities opposing the development of nuclear power and continued reliance on nuclear weapons.

The ability of low levels of radiation to cause cancer and genetic mutations in offspring has been a key part of the public debate over nuclear power. Radiation that separates electrons from atoms, a process termed ionization, is called ionizing radiation. X rays and the forms of radiation emitted by radioactive elements are ionizing. In large doses, ionizing radiation can ionize so many atoms in a cell that the resulting chemical reactions cause the cell's death. Or, ionizing radiation can cause cancer or heritable mutations by damaging the DNA (deoxyribonucleic acid) molecules present in almost all cells; these molecules contain the instructions for day-to-day cell biochemistry and for producing offspring. Changes to DNA can make a cell cancerous or be inherited as mutations (usually harmful).

In the years immediately following World War II, most analysis of the long-term health effects of radiation was based on studies of the populations of Hiroshima and Nagasaki, who had been exposed to a sudden burst of ionizing radiation from the atomic bombs. Other data came from laboratory experiments on animals and from human exposure to medical radiation. The Chernobyl disaster of April 1986 produced important additional data, including data on the effects of low-level exposure over wide areas. The disaster occurred during a botched test at a power-generating nuclear reactor at Chernobyl, Ukraine, in which operators had disabled key safety systems. A sudden power surge and steam explosions destroyed the reactor and blew open the reactor building, and the burning reactor then spewed radioactive material into the atmosphere for days, out of control; prevailing winds carried the contamination over much of Europe. The area around the plant, including the nearby city of Pripyat, has been abandoned ever since the accident.

Among the approximately 6.6 million people exposed to radiation from Chernobyl, the World Health Organization estimated that as many as 50,000 new cases of thyroid cancer would develop. Such diagnoses have risen at least tenfold among Ukrainian children who were exposed to Chernobyl's radiation. Those who were under four years of age when the disaster occurred have a nearly 40% risk of developing the disease; those who were under two tend to develop particularly virulent forms. Because children affected by Chernobyl also produce more antithyroid antibodies than others, they may be at greater risk for hypothyroidism in later life.

In the early decades of scientific research on radiation and radioactivity, most scientists assumed that a tolerance dose could be established below which exposure to radiation was completely safe; the countervailing theory was that even low doses cause proportionate harm (e.g., halving a dose would cause half the harm, all the way down to zero). These two schools of thought about radiation risk were at odds throughout the twentieth century. In 2006, after an exhaustive review of the evidence, the U.S. National Research Council's Committee on the Biological Effects of Ionizing Radiation (BEIR VII), a scientific advisory body, concluded that the linear model (harm proportional to dose, all the way down to zero) is the one best supported by the data.
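
The two schools of thought can be written down as simple dose-response functions. The sketch below is purely illustrative (the risk coefficient and tolerance dose are made-up numbers, not values from the BEIR report), but it shows the practical difference: the linear model predicts some harm at every dose, while the tolerance-dose model predicts none below its threshold.

```python
def linear_no_threshold(dose, risk_per_unit_dose):
    """Linear model: harm proportional to dose, with no safe level."""
    return risk_per_unit_dose * dose

def threshold_model(dose, risk_per_unit_dose, tolerance):
    """Tolerance-dose model: no harm at or below the threshold, linear above it."""
    return 0.0 if dose <= tolerance else risk_per_unit_dose * (dose - tolerance)

# Made-up coefficient and threshold, purely for illustration.
k = 0.01
for dose in (0.5, 1.0, 5.0):
    print(dose, linear_no_threshold(dose, k), threshold_model(dose, k, tolerance=1.0))
# Under the linear model, halving the dose halves the predicted harm;
# under the threshold model, doses at or below 1.0 predict no harm at all.
```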

Debate over the advisability of continuing or increasing reliance on nuclear power turns partly on the question of whether the large inventories of radioactive substances generated by nuclear power plants can be contained. Advocates of nuclear power point out that accidents such as Chernobyl have been rare, given the number of operating nuclear power plants. In addition they maintain that air pollution generated by burning coal to make electricity has killed many times more people than nuclear power. Opponents of nuclear power argue that reactors remain dangerous; that increased reliance on nuclear power would entail dependence on breeder reactors that are even more dangerous; that terrorists or military attackers could cause Chernobyl-like disasters by bombing even safe reactors; and that the containment of nuclear waste over hundreds or thousands of years cannot be guaranteed, subjecting hundreds of future generations to unknown risk for the sake of a short-term benefit (electricity).

Both opponents and proponents of nuclear power agree that the technology needed to produce nuclear power can be modified or exploited to produce nuclear weapons: for example, in the early 2000s there was intense international debate over whether Iran's nuclear-power program was being secretly exploited by that country to produce nuclear weapons, as the supposedly peaceful reactor programs of Israel, India, North Korea, and Pakistan had previously been exploited. As of early 2008, international inspectors of Iran's nuclear facilities had not produced definite physical evidence of a bomb program, though the controversy over the program's potential continued.

In response to scientific evidence that even low levels of radiation cause harm, medical equipment that uses x rays and other forms of ionizing radiation has been reengineered over the decades to minimize the received dose. Mammography (breast x rays), for example, first developed around 1950, was improved in the 1970s to lower the radiation dose. The dose received by healthy tissue during radiation therapy has also diminished greatly. Stereotactic radiation therapy, in which radiation beams are focused at a localized tumor from hundreds of different angles for a short period of time, can deliver a precise, significant dose of radiation to a small tumor, while the surrounding tissue receives comparatively small doses.

Primary Source Connection

The disposal of radioactive waste is one of the most controversial problems facing most forms of nuclear technology. The following article by Demetria Kalodimos discusses the discovery that radioactive waste was being placed in a landfill meant for household trash near Murfreesboro, Tennessee.

EXPERT: RUNOFF AT LANDFILL TESTS “VERY HIGH” FOR RADIOACTIVITY: SOME NUMBERS TWICE WHAT EPA ALLOWS

MURFREESBORO, Tenn.—The first test results are in for radioactivity at Murfreesboro's Middle Point landfill.

Channel 4 uncovered a little known state program that allows low-level radioactive waste from all over the country to be buried along with household trash at the landfill.

It's been happening for nearly 20 years, but the state has never required testing to monitor the effects of the dumping.

On a rainy Sunday afternoon, Channel 4 rode to the top of the Middle Point landfill to see what a state scientist could measure with a hand-held radiation detector.

Somewhere in the mountain of dirt and garbage, millions of pounds of low-level radioactive waste have been buried along with the household trash.

“If we stood around here all year, or anywhere else, with 11 micro r per hour, we would get about 96 millirem for the year,” said state radiological inspector Billy Freeman.

The numbers sound complicated, but they're no higher or lower than you'd expect to find in this area from decaying rock, soil, the sun, what the scientists call “natural background.”

The state said the added risk, even from tons of processed radioactive waste at the landfill, is minimal.

“(It is) almost inconsequential. It would be 1 percent of a member of the public's limit. One millirem per year is an inconsequential dose. Is it a realistic dose? Yes, it is a very realistic dose and that's why we chose it,” said Freeman.

But even the state's radiation expert admits Channel 4's walking tour was less than realistic in terms of what might be going on deep under our feet.

“We did a rough and dirty walkover with a portable survey instrument. It's a very sensitive survey instrument, and I trust its readings. It's a calibrated device, so these are accurate readings. If your question is what is two feet down, six feet down, 100 feet down, in no way did our survey today give you an estimation of that, no,” said Freeman.

This brings Channel 4 to another set of tests.

In the nearly 20 years Tennessee has allowed treated low-level radioactive waste to be dumped here, there has never been a requirement to test the air, water or soil for radioactivity.

The very first tests on the liquid runoff from the landfill, what's called leachate, paint a potentially troubling picture.

“The readings are very, very, very high,” said Dan Hirsch, a nuclear policy expert at the University of California at Santa Cruz.

Gross Alpha radiation in the leachate measured 82. The EPA standard for drinking water is 15.

Gross Beta in the leachate measured 3,395. This is 68 times higher than the maximum allowed in drinking water.

Tritium, a radioactive element that attaches itself easily to water, measured at more than 38,000. This number is nearly twice what the Environmental Protection Agency allows.

It is true that no one drinks leachate, so is drinking water a fair comparison?

“It is relevant because landfills do leak,” said Mark Quarles, ground water expert.

A professor of nuclear policy compared the numbers to a 2002 survey of 50 landfills in California.

“The gross beta readings are just astronomical. I've not seen radiation readings that high for leachate. The monitoring that we had in California suggested we had a problem. Although the highest reading we had was eight times lower than the reading reported in Tennessee,” said Hirsch.

They also looked at waste-water sludge, comparing Middle Point's to a landfill that's not taking radioactive waste in Clarksville.

“When you look at the sludge of Murfreesboro to Clarksville, the gross Beta radiation is 9.5 times higher than that of Clarksville. The Tritium is 139 times higher than that of Clarksville,” said Quarles.

Tennessee officials said a lot of everyday, careless disposal could be to blame.

Tritium, for example, is found in old illuminated exit signs, the kind that light up without a power source and quite often end up in landfills.

But Tritium, or Hydrogen 3, was also in loads of contaminated soil taken to Middle Point landfill by officials from the University of California at Los Angeles almost six years ago at the rate of 400 tons per month.

What's causing the problem? Is there a problem?

Without baseline testing, it may be impossible to know.

“Now we're so far into it they don't have the baseline, and the way you're supposed to do it is you're supposed to sample for all these constituents before you place the waste,” said Quarles.

The state is still waiting for results of testing at the four other landfills in Tennessee that are accepting low-level treated radioactive waste.

A state advisory board will meet Thursday to discuss more testing and the 60-day moratorium the legislature has put on the dumping program.

Demetria Kalodimos

Kalodimos, Demetria. WSMV-TV. “Expert: Runoff at Landfill Tests ‘Very High’ for Radioactivity,” July 4, 2007. http://www.wsmv.com/news/13620876/detail.html (accessed October 4, 2007).

See Also Physics: Nuclear Physics; Physics: The Inner World: The Search for Subatomic Particles.

Bibliography

Books

Kevles, Bettyann Holtzmann. Naked to the Bone: Medical Imaging in the Twentieth Century. New Brunswick, NJ: Rutgers University Press, 1997.

Kragh, Helge. Quantum Generations: A History of Physics in the Twentieth Century. Princeton, NJ: Princeton University Press, 1999.

Romer, A., ed. The Discovery of Radioactivity and Transmutation. New York: Dover, 1964.

Walker, J. Samuel. Permissible Dose: A History of Radiation Protection in the Twentieth Century. Berkeley: University of California Press, 2000.

Web Sites

American Institute of Physics. “The Discovery of Fission.” http://www.aip.org/history/mod (accessed May 8, 2008).

American Institute of Physics. “Marie Curie and the Science of Radioactivity.” http://www.aip.org/history/curie (accessed May 8, 2008).

U.S. Department of Energy. “Human Radiation Experiments.” http://www.eh.doe.gov/ohre/index.html (accessed May 8, 2008).

U.S. Environmental Protection Agency. “History of Radiation Protection.” http://www.epa.gov/rpdweb00/understand/history.html (accessed May 8, 2008).

U.S. National Research Council. “BEIR VII: Health Risks from Exposure to Low Levels of Ionizing Radiation.” http://dels.nas.edu/dels/rpt_briefs/beir_vii_final.pdf (accessed February 5, 2008).

Hanne Andersen
