The 1980s Medicine and Health: Topics in the News
AIDS
THE FIRST ARTIFICIAL HEART
THE CASE OF "BABY M" AND SURROGATE MOTHERHOOD
RECOGNITION OF ALZHEIMER'S DISEASE
THE RISE OF EATING DISORDERS
In the late 1970s, a rare form of cancer called Kaposi's sarcoma and an unusual type of pneumonia caused by the organism Pneumocystis carinii began to appear in previously healthy homosexual and bisexual men in the United States. What disturbed health officials was that the patients' immune systems were not functioning properly, leaving them susceptible to diseases that would not normally occur in a healthy person. Also, the form of pneumonia these men had contracted was so rare that even a few cases in a single year made it definable as an epidemic.
The Centers for Disease Control and Prevention (CDC), the federal epidemiology agency in Atlanta, made its first official announcement of the new, unidentified illnesses on June 5, 1981. (The CDC researches health problems and works to prevent and to control the spread of disease.) The CDC informed the medical community of the fatal nature of the new illnesses and highly unusual spread of the normally rare Kaposi's sarcoma among young homosexual men (Kaposi's sarcoma was usually limited to the elderly). Since the first cases in the United States seemed to affect only homosexuals, the U.S. Public Health Service named the complex of diseases GRID: gay-related immune deficiency. When heterosexuals began to become victims also, GRID became known as acquired immunodeficiency syndrome, or AIDS.
This destruction of the body's immune system also began to be seen among intravenous drug users, people who had received blood transfusions, and sexual partners of people who had the disease. It was soon determined that AIDS was caused by a virus that could be passed from person to person through contact with blood or bodily fluids. The disease can be transmitted through semen or vaginal fluids during unprotected sex with an infected person and through direct contact with infected blood. Intravenous drug users who share hypodermic needles are at an especially high risk. AIDS can also be passed from an infected mother to her unborn child.
In 1983, the virus believed to cause AIDS was discovered and named HTLV-III (human T-cell lymphotropic virus type III). Its name was soon changed to human immunodeficiency virus or HIV. HIV damages the immune system by attacking certain white blood cells called lymphocytes (specifically those called helper T cells), which normally help to protect the body against invading microorganisms. When these cells are destroyed, the body loses its ability to fight infection and becomes vulnerable to a variety of diseases and rare cancers that are the hallmarks of AIDS.
The AIDS epidemic spread rapidly after 1982. While ten new cases a week were diagnosed that year, one hundred cases a week were diagnosed just two years later. By the end of 1988, the total number of AIDS cases reported to the CDC numbered eighty-six thousand. Many physicians realized that they were in the midst of a major medical problem. Public fears fueled by media reports led to a kind of collective hysteria, but public health agencies and the federal government moved very slowly against the disease.
There were two primary reasons for the federal government's reluctance to confront the issue. First, the diseases associated with AIDS were rare enough that there were relatively few trained physicians and medical researchers who were familiar with them. Second, the first patients with AIDS were primarily homosexual men, and the administration of President Ronald Reagan (1911–) was uncomfortable with the link between the spread of AIDS and homosexual behavior. Critics believe that by not addressing the issue more aggressively, the White House lost valuable time in helping the public to understand how AIDS could be transmitted and further fueled public panic as well as a backlash against the gay community. In the early years of the crisis, the federal government's research teams saw AIDS as a budget problem and did not provide major funding for AIDS research until the epidemic had spread to people throughout all walks of life across the country.
Top Causes of Death in America: 1985
1. Heart disease: 771,169
5. Lung disease: 74,662
6. Influenza and pneumonia: 67,615
9. Liver disease: 26,767
Fear and rejection of the victims complicated the nation's ability to deal with the deadly disease. Since the earliest victims identified as AIDS sufferers were homosexual males and intravenous drug users, some people who held strict moral views believed the disease was a form of punishment for the victims' "sins." Misinformation was common. A 1985 poll revealed that about half of all Americans believed AIDS could be transmitted through casual contact, such as sharing a drinking glass. It could not. The fatal nature of the disease also terrified and panicked people. Schoolchildren with AIDS were rejected, and attempts were made either to keep them out of school or to isolate them from their classmates.
In 1982, the nation was shocked by the news that seven Chicago-area residents had died after taking Extra-Strength Tylenol capsules that an unknown person had laced with cyanide, a poisonous salt. The incident led to a rash of copycat poisonings of other food and drug products, including Extra-Strength Tylenol laced with strychnine (a poisonous plant product) in California, mouthwash tainted with hydrochloric acid in Florida, and cold medicines, allergy remedies, and appetite suppressants spiked with rat poison in other areas of the country.
In response, the U.S. Food and Drug Administration (FDA) ordered tamper-resistant packaging for over-the-counter drugs. Tylenol manufacturer Johnson & Johnson exceeded the FDA packaging requirements with a triple safety seal. But in early 1986, a twenty-three-year-old woman died of cyanide poisoning in New York after she took two capsules from a freshly opened bottle.
Johnson & Johnson pulled Tylenol from store shelves and announced that it was discontinuing the manufacture and sale of the capsule form of its drugs. Instead, Tylenol would be produced only as tablets and caplets, tablets that were capsule-shaped and coated to make swallowing easier. Both the American people and American corporations had to make new adaptations to face the uncertain and often dangerous world of the 1980s.
The announcement in July 1985 that film actor Rock Hudson had AIDS (he died on October 2, 1985) dramatically increased public awareness of the crisis, but many problems remained unsolved. Debates over whether to isolate patients led to controversies. Efforts to protect public health led to discrimination against foreign visitors or potential immigrants: those testing positive for HIV were not permitted to enter the country. Some victims of the disease found it difficult to get adequate health insurance coverage because of the high expense of treating patients. The FDA's slow process for approving new AIDS drugs caused AIDS activists to demand that the government speed up the process by postponing some of the required testing.
By 1989, there was still no vaccine to protect against HIV infection, nor were there many major drugs to prolong and ease the lives of victims. AZT (azidothymidine) was one drug licensed by the FDA for AIDS patients. It interfered with virus replication, prolonging life for many years in some patients and delaying the onset of "full-blown" AIDS in people with no symptoms. But for some, it had harmful and toxic side effects, including nausea, vomiting, loss of red and white blood cells, and muscle pain and weakness.
Since no vaccines or effective lifesaving therapies were available during the decade, federal health officials tried to educate the public on ways to reduce the risk of contracting the disease. In 1988, the U.S. Public Health Service issued a candid brochure about HIV infection and AIDS based on the surgeon general's report. Every household in America received it in the mail. "Safe sex," meaning sex using AIDS-preventive measures, became a common phrase even as controversies arose over providing condoms for high-school-age children instead of encouraging sexual abstinence. The question of providing clean needles to drug addicts to prevent the spread of AIDS also created controversy.
By the end of the 1980s, AIDS was still spreading in the United States and the rest of the world. Despite efforts to create new weapons against the disease, it still baffled scientists and the medical community. Some infected with HIV were without symptoms, and some AIDS victims lived throughout the decade with the disease, raising hopes that information could be found from their ability to fight off the disease. By 1989, scientists felt that while they might not be able to cure the disease, they might find some additional therapies that could keep the AIDS virus in check without the side effects of treatments like AZT.
THE FIRST ARTIFICIAL HEART
The first successful artificial heart procedure took place in 1957, when Dr. Willem Kolff and Tetsuzo Akutsu of the Cleveland Clinic developed a heart that kept a dog alive for one and a half hours. Researchers spent the next several decades developing four-chambered hearts for temporary use in humans. The first artificial heart intended as a permanent replacement for a diseased human heart was the Jarvik-7, developed by Robert K. Jarvik, a physician working in the artificial-organs division of the University of Utah Medical Center. It consisted of compressed-air tubes leading outside the chest to a power source and was first widely tested on animals. Among the more than one hundred calves, sheep, and goats receiving the artificial heart, three had strokes related to infections.
Tampons and Toxic Shock
After 344 cases of a rare and baffling illness that came to be known as toxic shock syndrome were reported in 1980, the Centers for Disease Control and Prevention (CDC) linked women's use of tampons to the outbreak of the sometimes fatal syndrome. One study of a group of sufferers discovered that 71 percent of them had used Procter & Gamble's Rely tampons. Procter & Gamble ordered a recall of its tampons and soon found itself in court.
Toxic shock syndrome (TSS) is a severe illness associated with infection by the bacterium Staphylococcus aureus. The CDC found it occurred most commonly in menstruating women who used tampons (about 75 percent of TSS victims), although it also occurred in children, men, and nonmenstruating women. Up to 5 percent of cases are fatal. The syndrome has a sudden onset with high fever, vomiting, diarrhea, and a sunburnlike red rash that can occur anywhere on the body. Within a day or two, victims can suffer a drop in blood pressure, ranging from mild dizziness to fatal shock. Treatment includes intensive antibiotic therapy.
Studies of Rely and other tampons indicated that certain types of superabsorbing tampons contained cellulose chips that absorbed magnesium, which acted as a nutrient to encourage the growth of Staphylococcus aureus. The bacteria, in turn, generated poisonous waste products, which were circulated by the blood. Hundreds of lawsuits were brought on behalf of victims who suffered brain damage, gangrene, partial paralysis, and death. The superabsorbing products were removed from the market, and materials educating menstruating women about the safe use of tampons were distributed.
After the U.S. Food and Drug Administration (FDA) granted approval for human use, the Jarvik-7 was first implanted on December 2, 1982, into
the chest of Barney Clark, a sixty-one-year-old retired dentist suffering from a fatal disease of unknown cause that was destroying the muscles of his heart. After many medical setbacks including seizures, pneumonia, and nosebleeds, Clark finally died on March 23, 1983, of complications from preexisting kidney and lung disease. The Jarvik-7 heart, however, worked until the end, and its implantation was considered a success.
Over the next few years, four more patients received artificial heart implants, but all five recipients eventually died, three of them after relatively short periods. Three of the five suffered strokes. Barney Clark lived 112 days after receiving his artificial heart. Murray Haydon lived for sixteen months. William Schroeder survived the longest, living for 620 days before a fourth stroke and lung infection led to his death. Unlike the three experimental animals whose strokes were related to infections, the human strokes were caused by blood clots that originally formed in the heart and then traveled to the brain. Anticoagulants were given to artificial-heart recipients to prevent their blood from clotting, but the drugs produced other serious complications.
After these disturbing setbacks, physicians questioned the use of the devices as permanent replacements and began to use them as a temporary "bridge" for people awaiting a human heart transplant. Some patients did not have a suitable donor heart immediately available or needed time to
Baby Fae and Her Baboon Heart
On November 15, 1984, at Loma Linda University Medical Center in southern California, a tiny baby girl died twenty days after she had heart surgery. "Baby Fae," as she had come to be known, died with the heart of a baboon pumping blood through her body. The baboon heart experiment offered hope that animal organs could be used in ailing infants for whom transplant organs were difficult to obtain. Baby Fae was born with a fatal congenital deformity known as hypoplastic left heart, which left the entire left side of her heart useless. A successful transplant from a baboon promised a new life for Baby Fae and a revolution in pediatric heart surgery.
Leonard Bailey, the pediatric heart surgeon who had performed the surgery, had experimented with interspecies transplants for seven years, grafting lamb hearts into baby goats. For transplants between animals and humans, he chose baboons because of their biological similarity to humans. However, he made a grave error with Baby Fae by using a heart from a baboon with a different blood type. Baby Fae was given antirejection drugs so her body would not reject the foreign organ, but the strain on her body was too much, and she eventually died of kidney failure.
Controversy surrounded the entire operation, and the medical community was sharply divided. Many physicians challenged the use of an animal heart when a human heart seemed preferable. Animal rights groups protested the sacrifice of a healthy baboon for what they saw as medical sensationalism. Those concerned with medical morality worried about the ethical questions of consent for an infant in such a risky undertaking. Questions even arose about her psychological well-being once she was old enough to understand that the heart that beat within her chest was that of a baboon. In the end, many questions remained unanswered. The medical community and Americans across the nation would reflect on the case of Baby Fae for a long time to come.
recover from health conditions that made a transplant ill-advised. The first FDA-authorized temporary use of an artificial heart as a bridge to a human heart transplant occurred in August 1985 at the University of Arizona Medical Center in Tucson. There, a twenty-five-year-old Arizona man suffering from a severe viral heart infection awaited a new donor heart. Seven days after the surgery, he suffered a series of mild strokes, necessitating an urgent human heart transplant. His Jarvik-7 was later found to have blood clots on its left side, where the main pumping chamber joined the aorta (main artery of the body).
The artificial heart did prove potentially useful in a way its designers had not foreseen. Originally intended as a permanent replacement for a diseased heart, it came to be used more as a temporary bridge to keep patients alive until a human heart could be found for transplant. However, the procedure was enormously expensive, and many people questioned whether such money might be better spent on prevention rather than on a risky and expensive procedure for one patient. Federal funding for the Jarvik-7 project stopped in 1988, and implantations were restricted to temporary use. On January 11, 1990, after reviewing the ongoing problems with the device, the FDA recalled the Jarvik-7 and forbade its further use in human patients.
THE CASE OF "BABY M" AND SURROGATE MOTHERHOOD
Surrogate motherhood—in which a woman becomes pregnant and bears a child for another woman, often for payment—became news to most Americans in 1986 when the notorious case of "Baby M" made headlines. In most cases of surrogate motherhood, a married couple in which the husband was fertile but the wife was infertile or unable to carry a pregnancy entered into a privately arranged contract with another woman. That woman (the surrogate) agreed to be artificially inseminated with sperm from the husband and to carry the developing fetus to term. The contract usually called for restrictions on the surrogate mother's behavior during pregnancy and gave the husband of the married couple some authority to make medical decisions about her and the fetus. After giving birth, the surrogate assumed no parental rights, turning the baby over to the married couple according to the terms of the signed contract, which usually involved a large sum of money.
In 1985, Mary Beth Whitehead signed a contract agreeing to act as a surrogate mother for William and Elizabeth Stern for a payment of ten thousand dollars. When the baby was born on March 27, 1986 (a girl who came to be known to the public as Baby M), Whitehead felt she had made a terrible mistake. She named the infant Sara Elizabeth Whitehead, took her home, and turned down the ten thousand dollars. On March 30, the Sterns took the infant to their home. The baby was back at the Whitehead home the following day. During the second week of April, Whitehead told the Sterns she would never be able to give up her daughter. The Sterns responded by hiring an attorney to fight for the contract's enforcement. The police then removed Baby M from Whitehead's custody.
Whitehead then sued the Sterns in January 1987. After a well-publicized and prolonged custody battle, the lower New Jersey court upheld the contract, giving the child to the Sterns. Elizabeth Stern formally adopted the child, whom the Sterns named Melissa Elizabeth Stern. In 1988, however, the New Jersey Supreme Court reversed the previous ruling and banned surrogate contracts for pay. It gave custody of the child to William Stern, invalidated Elizabeth Stern's adoption of the child, and gave Mary Beth Whitehead broad visitation rights.
The Baby M case raised troubling questions. Should surrogacy be outlawed? Who was qualified to determine a child's "best interest"? Was a surrogate-mother contract baby-selling? At the end of the decade, answers to these and other questions were still being formulated. Many state legislatures began to regulate surrogate arrangements because of the large sums of money involved and the growing industry of surrogate-mother brokering. Some states legalized commercial surrogacy, passing laws that made couples who contracted for surrogacy services the legal parents of the children produced. Other states banned the procedure entirely.
RECOGNITION OF ALZHEIMER'S DISEASE
Alzheimer's is an irreversible brain disease in which damaged and dying brain cells cause devastating mental deterioration over a period of time. Often confused with senility (mental and physical deterioration associated with old age), its symptoms include increasingly poor memory, personality changes, and loss of concentration and judgment. Victims reach a point where they are unable to speak, think, or care for themselves. Death usually occurs within ten years after diagnosis.
The disease is named after German neurologist Alois Alzheimer (1864–1915), who was the first to describe it. In 1906, he studied a fifty-one-year-old woman whose personality and mental abilities were obviously deteriorating: she forgot things, became paranoid, and acted strangely. After the woman's death, Alzheimer examined her brain and noted an unusual thickening and tangling of the organ's nerve fibers. He also found that the cell body and nucleus of nerve cells had disappeared. Alzheimer noted that these changes indicated some new, unidentified illness.
More than seven decades would pass before researchers again turned their attention to this puzzling, destructive malady. Before the 1980s,
many Americans had never heard of Alzheimer's disease. Although many families watched their loved ones succumb to it, the disease did not become familiar to the public until the news broke in 1981 that legendary film star Rita Hayworth (1918–1987) suffered from Alzheimer's disease.
Researchers in the 1980s found treatments for the disease even more elusive than its cause. No wonder drug existed for the disease, nor even a clear and consistent treatment approach. Researchers found that some drugs helped some of the people some of the time, and each patient responded differently. Sadly, having a diagnosis and receiving treatment for Alzheimer's could not stop the disease; a person inevitably got worse. By 1989, much was known about the symptoms of the disease, but little about its causes. Ongoing research attempted to unlock the secrets of the disease, but neither a full scientific understanding nor a cure existed.
Doctors' Average Salaries: 1985
Alzheimer's disease received considerable public attention during the 1980s. The media emphasized the impact the illness had on both its victims and their families. Most communities did not have services or care facilities developed specifically for the long-term needs of Alzheimer's patients, nor did they provide support services for the caregivers of those patients. But many communities did have agencies that provided for the various needs of the elderly, and by decade's end many of these agencies were providing support and services to both patients and their caregivers.
THE RISE OF EATING DISORDERS
Eating disorders are psychological conditions that involve overeating, voluntary starvation, or both. The best-known eating disorders are probably anorexia nervosa and bulimia. Eating disorders are virtually unknown in parts of the world where food is scarce. They are also rarely seen in less prosperous groups in developed countries. Although these disorders have been documented throughout history, they gained national attention in the United States when Karen Carpenter, a member of the popular singing duo The Carpenters, died on February 4, 1983, from heart failure caused by chronic anorexia nervosa.
The word anorexia comes from the Greek adjective anorektos, which means "without appetite." Anorexia is a form of extreme self-starvation and distortion of body image. The problem for people suffering from anorexia is not that they are not hungry. They starve themselves out of fear of gaining weight, even when they are severely underweight. Their self-image is so distorted that they see themselves as fat even when they look almost like a skeleton. Some anorexics refuse to eat at all; others nibble only small portions of fruit and vegetables or live on diet drinks. In addition to fasting, anorexics may exercise strenuously to keep their weight abnormally low. The self-imposed starvation takes a heavy toll on the body. Muscles begin to waste away. Bones stop growing and become brittle. The heart weakens. Muscle cramps, dizziness, fatigue, and even brain damage and kidney and heart failure are possible. About 10 to 20 percent of anorexics die, either as a result of starvation or suicide.
In the 1980s, anorexia nervosa occurred mostly among adolescent women. As it became more common during the decade, estimates of its incidence were as high as one in one hundred teenage girls and young women. Unlike victims of many other psychological disorders, anorexics had many social traits in common. Anorexia was fifteen times more likely to be found in females than males, typically began in adolescence, and was most common in wealthier families.
Little known even by physicians before the 1980s, bulimia was first thought to be an aspect of anorexia nervosa. It gets its name from the Greek word boulimos, meaning "great hunger" or, literally, "the hunger of an ox." People suffering from bulimia go on eating binges, often gorging on junk food, then force their bodies to get rid of the food to prevent weight gain, either by making themselves vomit or by taking large amounts of laxatives.
Like anorexia, bulimia results in starvation. But there are behavioral, physical, and psychological differences between the two disorders. Bulimia is much more difficult to detect because people who suffer from the disorder tend to be of normal weight or may even be slightly overweight. They tend to hide their habit of binge eating followed by purging by vomiting or by using laxatives. Left untreated, the disease causes vitamin deficiency and serious physical ailments such as liver, kidney, and heart disease. Repeated vomiting can rupture the stomach, and the acid in the vomit erodes tooth enamel. About 40 percent of women with bulimia develop irregular menstruation, and, like anorexics, about 20 percent stop having their periods entirely.
RSI: A New Injury for a New Age
With the rise of video games and computers in the 1980s came a new type of syndrome. The many aches and severe pains resulting from these activities became known as repetitive stress injury (RSI). Video-game players found themselves with an especially troublesome pain and swelling in the wrist caused by the rapid repetition of button pressing, paddle twisting, joystick pushing, sphere rolling, or combinations of these actions. An estimated 65 percent of all video-game fans suffered from RSI at some time.
However, most publicity went to injuries from computer use. According to the Bureau of Labor Statistics, RSI accounted for nearly half of all 1988 workplace illnesses in private industry, compared to only 18 percent in 1981. The big increase came from data processors and journalists who spent long hours at the keyboard. Working at a computer for long stretches of time stressed the wrists, elbows, and shoulders. Tendons in the arm became inflamed, leading to numbness and pain. Unless the injuries were diagnosed and treated, they could develop into serious lifelong disabilities. The science of ergonomics, which studies how the workplace can be fitted to the human body, came to the rescue with design alterations to minimize such problems. Experts said frequent short breaks from work were crucial, but the real key was to make technology adapt to humans instead of the other way around.
During the decade, theories about these disorders included psychological, biological, and social explanations. Psychological explanations for anorexia focused on the fear of maturing and the fear of loss of control. Bulimia was regarded as a fear of food that created a compulsion, which led to stress and fear around episodes of binge eating and purging. Scientists also thought the disorders might be associated with a disorder of the hypothalamus, the part of the brain that produces hormones and regulates hunger, thirst, and temperature. Since some bulimics improved after treatment with antidepressant drugs, other scientists linked the disorder to decreased levels of the chemical serotonin in the brain. Social scientists blamed social pressures. In American society, women have been constantly bombarded with advertisements and unrealistic role models suggesting that a woman's only worth is in her youth and her slim appearance. A survey in 1984 revealed that 45 percent of underweight respondents thought they were too fat and needed to lose weight.
Residential treatment facilities for anorexics were developed during the 1980s that included family therapy and counseling. Between 15 and 25 percent of anorexics relapsed occasionally; another 15 to 20 percent continued to be anorexic. Bulimics could be treated successfully outside the hospital since their disorder was not so life threatening. Treatment usually consisted of therapy and antidepressant drugs, but a high rate of treatment failure was reported. Because anorexia nervosa and bulimia were still poorly understood, treatments for them were not universally successful in the decade.
HEALTH AND THE ENVIRONMENT
In the early 1980s, a new and controversial branch of medicine began to center on the links between health and environmental factors. Pollution-related health hazards had been problems for decades, but the 1980s saw a growing concern about air pollution inside the home or workplace.
A major culprit was formaldehyde, a chemical preservative used in many building materials such as adhesives, furnishings, particleboard, and foam insulation. Many people suffered allergic reactions to the chemical, sometimes severe ones. Unfortunately, it was not the only problem. Many new homes and office buildings were constructed with synthetic, chemically treated building materials and finishes. Air breathed in these buildings was contaminated by many things, including toxic carbon monoxide (from incomplete burning of fuel) and nitrogen dioxide (produced during the burning of natural gas and thought to cause increased respiratory problems in children in winter). The air in tobacco-free homes and buildings could be filled with as many as 150 contaminants from stove gases, furnaces, solvents, paints, furnishings, mold, and pesticides. Many of these chemicals occurred as ingredients or by-products of common items such as household cleansers, construction materials, and cigarettes.
New buildings were models of energy efficiency, with little outside ventilation. As a result, chemical contaminants were being trapped indoors. U.S. Environmental Protection Agency (EPA) studies of American homes found chemical levels that were two to five times higher indoors than outdoors. Officials estimated as many as 30 percent of new office buildings made workers ill. The EPA ranked indoor air pollution among the nation's top five environmental health problems during the decade.
Radon, a product of the radioactive breakdown of radium found in certain rock formations, also made headlines in the 1980s as people in several states found high concentrations of the radioactive gas in their homes. The colorless and odorless gas can enter a building through cracks in the foundation and can build up to potentially dangerous levels in closed areas. Long exposure to radon can lead to lung cancer. Scientists working for the Centers for Disease Control and Prevention estimated that high radon levels could cause as many as thirty thousand lung cancer deaths in the country each year. In the fall of 1985, the EPA announced plans to conduct a national survey on radon and present a five-year plan to lessen its health hazard. Pennsylvania became the first state to help homeowners measure radon levels and increase ventilation to disperse the gas.
American Nobel Prize Winners in Physiology or Medicine
1980: Baruj Benacerraf and George D. Snell
1981: Roger W. Sperry and David H. Hubel
1982: No award given to an American
1984: No award given to an American
1985: Michael S. Brown and Joseph L. Goldstein
1986: Stanley Cohen and Rita Levi-Montalcini
1987: No award given to an American
1988: Gertrude B. Elion and George H. Hitchings
1989: J. Michael Bishop and Harold E. Varmus
In 1980, fewer than four of every three hundred thousand Americans died from the respiratory disease asthma. By the end of the decade, that figure had nearly doubled. No one knew for sure just why the increase occurred, but some researchers suspected outdoor and indoor air pollution. Statistics showed higher rates among minorities and city dwellers who were more likely to live and to work in "sick buildings."
Many in the medical community turned their attention to the role of the environment and to environmental medicine in the nation's public health picture. Prevention, rather than treatment, seemed to be the key. Many sick buildings in the workplace solved their air-quality problems by properly maintaining their heating, ventilation, and air-conditioning systems. Careful selection of building materials, equipment, and cleansing supplies could also limit the level of indoor contaminants.