The 1990s Medicine and Health: Topics in the News

Health-care reform was one of the first and most contentious major policy initiatives tackled by President Bill Clinton (1946–). Health care had first become a public policy issue for Americans after World War II (1939–45), when President Harry S. Truman (1884–1972) advocated national health insurance. The American Medical Association (AMA; the nation's leading medical organization), however, vigorously opposed it. Finally, in 1965, the Social Security Act established Medicare and Medicaid, providing medical insurance for retired persons (Medicare) and for those on welfare (Medicaid).

Other Americans still had to pay for their own health care, either through employer-sponsored insurance plans or out of their own pockets. The working poor assumed the most risk under these conditions because they did not qualify for Medicaid and generally worked for employers who did not offer medical insurance. From the 1960s to the 1980s, health-care costs continued to rise rapidly because of economic trends and technological advances in medicine. By the 1990s, even employers with health-care benefits found it difficult to continue to provide the level of protection to which workers had become accustomed without raising employees' premiums or reducing their benefits.

Health maintenance organizations (HMOs; prepaid group health plans) sought to lower insurance costs by focusing on preventive care rather than corrective medicine. HMOs also sought to reduce medical costs by requiring certain procedures to be authorized in advance by insurance companies. Family doctors were replaced by larger groups of salaried physicians, which lowered overhead costs for the HMO but made health care less personal. Despite these cost controls attempted by the HMOs, health costs and insurance premiums continued to rise.

Calls for health-care reform came from across the nation, but lobbying from special-interest groups such as the AMA often got in the way. In 1991, more than three dozen health-care reform bills were introduced in the U.S. Congress. None of them passed. The following year, President George H. W. Bush presented a health-care reform plan that promised to provide coverage for the more than thirty-five million Americans without health insurance and to stop the spiraling costs for the Medicare system. This legislation, too, died in Congress.

Finally, in 1993, President Clinton responded to the health-care crisis by choosing First Lady Hillary Rodham Clinton to lead efforts to reform the $900-billion American health-care system. In October of that year, President Clinton unveiled the plan developed by Mrs. Clinton's task force. The National Health Security Plan proposed to overhaul U.S. health care. All Americans would receive health insurance coverage under the plan. The plan would have been financed through a combination of savings in existing programs, new revenues, and a series of subsidies or grants. Employers would pay 80 percent of their employees' health insurance premiums, with the government providing subsidies to low-income workers and some small businesses. There would have been a seventy-five-cent per-pack cigarette tax. All insurance would be purchased through regional health alliances under government control. The plan also required half of U.S. medical school graduates to specialize in primary care.

Doctors' Average Salaries in 1995

General surgery: $225,000
Emergency medicine: $170,000
Internal medicine: $138,000
Family practice: $124,000

Even before the plan reached Congress, the insurance industry, the Republican Party, the AMA, and other groups had organized a media campaign against it. They labeled the plan "socialized medicine" (a system of national health care regulated and subsidized by the government, such as those in Canada and Great Britain). They claimed the plan would reduce the quality of medical services in the country and remove a patient's right to choose his or her own doctor. The AARP (American Association of Retired Persons, the nation's largest organization for people over the age of fifty) had initially supported Clinton's plan. The influential lobbyist group switched sides, however, when its members began to fear that they would lose the option of being able to choose their own doctors and other health-care providers. Without a strong support system among the public, lawmakers were unwilling to jeopardize their political careers by supporting the controversial plan.

After being introduced into Congress, the National Health Security Plan was assigned to several committees that held hearings on it over the next year. Parts of the plan were eventually made into law, but in vastly different forms that hardly reformed anything. Most of the plan simply died in committee. Even President Clinton gave up supporting it, realizing it was a lost cause. Various Democrats and Republicans in Congress offered competing health-care reform plans throughout the 1990s. Some minor bills were passed, but most major reform efforts continued to stagnate. By the end of the decade, the U.S. medical system of private insurance and market-driven health care remained largely unchanged.


Throughout the 1990s, the acquired immunodeficiency syndrome (AIDS) epidemic continued to take a devastating toll in human lives all over the globe. In 1999, 2.8 million people worldwide died from the disease, bringing the total number of deaths attributed to AIDS by the end of the twentieth century to 18.8 million. In 1999 alone, almost 5.5 million people worldwide were infected with the human immunodeficiency virus (HIV; the virus that causes AIDS)—roughly fifteen thousand new cases each day. In the United States, the reported number of Americans living with HIV/AIDS at the end of the decade was 412,471. The Centers for Disease Control and Prevention (CDC; the federal government agency responsible for developing and applying disease prevention and control) estimated the actual number was probably double that, since many cases went unreported.

In spite of these staggering numbers, death rates for those suffering from AIDS in the United States declined dramatically during the decade. AIDS fell from being the eighth-leading cause of death in 1996 to fourteenth a year later. Medical researchers and others attributed this decline to the development of multidrug "cocktails," a potent combination of antiviral drugs often including protease (pronounced PRO-tee-aze) inhibitors, which were first developed by drug researchers in 1994. Protease inhibitors, working in combination with other drugs, suppressed the spread of HIV in the cells of the person infected.

Gulf War Syndrome

About 697,000 men and women of the U.S. military served in the Persian Gulf War. The conflict began in August 1990 when Iraq invaded and occupied Kuwait and culminated in February 1991 with an armed battle between Iraq and a coalition of nations led by the United States. After the war ended, some U.S. service personnel returned home with various illnesses such as asthma, short-term memory loss, fatigue, rash, muscle aches and pains, and weakness. This collection of diverse symptoms affecting returning veterans became known as the Gulf War syndrome.

Despite receiving medical discharges, some veterans were denied full disability pay. The military and the Veterans Affairs Department initially dismissed the complaints as unrelated to service in the Persian Gulf. Spouses of some veterans came down with some of the symptoms as well, and some of their pregnancies resulted in premature births and an elevated incidence of birth defects and illnesses.

The potential causes of the syndrome are as varied as its symptoms. Soldiers breathed smoke from burning waste dumps and oil wells and encountered a variety of paints, solvents, and pesticides. Some veterans believe they were exposed to chemical or germ warfare agents that were dispersed into the air after the coalition bombed Iraqi storage facilities. The U.S. military acknowledged that it had detected minute traces of sarin (a nerve agent) and mustard gas in the desert.

Another potential source of the problem may have been medications administered to the U.S. soldiers to protect them against chemical weapons. Researchers at Duke University and the University of Texas linked the veterans' problems to chemicals used to protect them from insects and nerve gas. Studies showed that animals treated with only one of the drugs in question did not develop illnesses, but those receiving both the antinerve-gas pill and the insect repellents did exhibit symptoms resembling Gulf War syndrome.

Between 1994 and the end of the decade, the U.S. Defense Department spent $100 million on Gulf War health research. In 1999, a report prepared for the Defense Department pointed out that the drug administered to protect the soldiers against chemical weapons could not be ruled out as a possible cause of the syndrome. While no definitive answer was determined as to the cause of illness, the Department of Veterans Affairs and the Department of Defense now recognize that the Gulf War syndrome is a real medical condition.

If a newly infected individual started this drug treatment soon after diagnosis, the patient could delay the onset of AIDS symptoms for many years. This delay allowed many HIV-positive men and women to enjoy relatively normal lives. Although AIDS remained incurable, it now was viewed as a chronic (long-term) condition rather than one that was immediately fatal. Unfortunately, this treatment regimen had two major drawbacks: researchers found that once HIV-infected people stopped their drug therapy, the virus rebounded in their bodies, and the treatment was very expensive. This had significant implications for health insurance coverage and hospital costs. Many AIDS patients exhausted the coverage limits permitted by their insurers and were forced to deplete their life savings to pay for the vital drug therapy.

Battling AIDS has been a political as well as a scientific fight ever since the disease was first reported in 1981. Robert C. Gallo of the Tumor Cell Biology Laboratory of the National Cancer Institute in the United States and his French rival, Luc Montagnier of the Pasteur Institute in France, both claimed to have been the first to isolate and identify HIV. More was at stake than pride, however, since the claim determined control over patent rights for increasingly important diagnostic tests. Other issues in the debate were hotly contested. Since AIDS was often perceived as being transmitted by homosexual activity and intravenous drug use, some members of the religious community argued that the disease represented divine punishment for immoral behavior. Uncomfortable with the link between AIDS and homosexual behavior, President Ronald Reagan (1911–) and his administration hardly addressed the issue in the 1980s.

The homosexual community and others pressured pharmaceutical companies and the government throughout the 1990s to increase spending on AIDS research, treatment, and prevention. Although effective in prolonging life, anti-AIDS drugs were expensive; their manufacturers faced criticism for profiting from the suffering of desperately sick people. Pharmaceutical companies also were attacked for not supplying the drugs at reduced cost to impoverished areas of the world where the disease was rampant, such as Africa. Activists also accused the U.S. Food and Drug Administration (FDA) of being too slow to approve some new anti-AIDS drugs and for approving other drugs too hastily without assuring their safety and effectiveness.

Because AIDS was most often transmitted through unprotected sexual relations and had long been associated with homosexual activity, many people sought to hide their HIV status. In fact, a few infected individuals were prosecuted for not telling their partners of their HIV status before engaging in unprotected sex. On several occasions, individuals were charged with having used HIV as a lethal weapon by intentionally infecting others with the disease.

Top Causes of Death in America in 1995

1) Heart disease: 737,563
4) Lung disease: 102,899
6) Influenza and pneumonia: 82,923
10) Liver disease: 25,222

Perhaps because the disease struck hard within the arts community, playwrights and filmmakers took up the cause against AIDS in the 1990s.

Early in the decade, Tony Kushner's play Angels in America, winner of four Tony Awards and the 1993 Pulitzer Prize for drama, focused on AIDS. In film, Tom Hanks gave an Academy Award-winning performance as an AIDS-infected lawyer fighting to keep his job in Philadelphia (1993). The acclaimed made-for-television movie And the Band Played On (1993) told the story of the discovery of AIDS. Almost everyone who attended the Academy Awards ceremonies during the 1990s wore a small red ribbon, demonstrating their commitment to curing AIDS and their compassion for its victims.

Although the search for an AIDS vaccine and cure continued throughout the 1990s, progress was painfully slow. Like the virus that causes the common cold, HIV tends to mutate readily. A vaccine or cure for one strain proved to be worthless against other mutated forms. The only known ways to prevent the spread of the disease—abstinence (complete avoidance) from both unprotected sex and from sharing intravenous drug needles—demanded changes in patterns of social behavior. Unfortunately, unsafe sexual practices—including sex among teenagers—were again on the upswing in America at the close of the decade.


Abortion continued to divide the nation as it had since the 1973 Roe v. Wade decision by the U.S. Supreme Court, which established a woman's legal right to choose an abortion during the first trimester of pregnancy. Throughout the 1980s, debates between antiabortionists (often called pro-life advocates) and those who supported a woman's right to choose (often called pro-choice advocates) grew increasingly heated. In the late 1980s, Operation Rescue, a zealous antiabortion group, inspired and encouraged a nationwide militant antiabortion movement that aggressively protested at clinics where abortions were performed.

While street protests in front of clinics continued in the 1990s, a few antiabortion advocates adopted terrorist tactics, murdering doctors who provided abortion services in Florida and New York. On March 10, 1993, antiabortionist Michael Frederick Griffin gunned down physician David Gunn at the entrance to a Pensacola, Florida, abortion clinic. Griffin was convicted of murder and sentenced to life in prison. The following year, on July 29, 1994, John Bayard Britton, a physician who performed abortions, and his escort, James H. Barnett, were murdered in Pensacola by Paul Jennings Hill, a former Presbyterian minister. In 1994 alone there were twelve attempted or actual murders of abortion clinic personnel, as well as twelve additional attacks on clinics by bombs or fire. On October 23, 1998, while he sat in the kitchen of his home outside Buffalo, New York, Barnett Slepian, a doctor who performed abortions, was shot and killed by a sniper. At the end of the decade, Slepian's killer remained at large.

Slepian's name, along with those of hundreds of other physicians, judges, and prominent feminists across the United States, had appeared on an Internet Web site known as the "Nuremberg Files." (The name is in reference to the war crimes trial held at Nuremberg, Germany, after World War II [1939–45]). Besides names, the Web site listed photographs, home and work addresses, phone numbers, and other personal information. When an individual on the list was injured in an attack, his or her name was grayed out; when an individual was killed, his or her name was marked with a strike through it. Spokespersons for both Planned Parenthood and the National Abortion Federation considered the sponsors of the site responsible for antiabortion violence. On February 2, 1999, a federal court agreed and ordered the Web site shut down.

The introduction of prescription drugs to help induce abortion soon changed the landscape surrounding the contentious and violent issue. Drugs such as methotrexate, an anticancer medication, were successfully used to terminate pregnancies. Oral contraceptives taken in high doses also were effective in terminating pregnancies more than 75 percent of the time. The first major alternative to surgical abortion, however, was the prescription drug mifepristone, or RU-486. First developed in 1980 in France and approved for use there in 1988, RU-486 had helped induce abortions in more than six hundred thousand women in countries across Europe by 1999. Yet RU-486 long remained unavailable to American women, primarily because manufacturers, afraid of boycotts, chose not to distribute the drug in America. Clinical trials of the drug were finally conducted in the United States beginning in 1994. Finding it safe and effective, the U.S. Food and Drug Administration (FDA) approved the marketing of RU-486 in 2000.


Genetic information in humans is stored in units known as genes, which carry instructions for the formation, functioning, and transmission of specific traits from one generation to the next. Genes determine individual human characteristics—from eye and hair color to height to musical and literary talent. In 1953, English chemist Francis Crick (1916–) and American biologist James Watson (1928–) first determined the chemical explanation for a gene. They discovered the chemical structure of deoxyribonucleic acid (DNA; large, complex molecules that occur in the nuclei of all living cells and are unique to each person). Genes are segments of DNA.

Genetic disorders are conditions that originate in an individual's genetic make-up. Many of these disorders are inherited and are governed by the same rules that determine whether a person has dimples or red hair. Medical scientists know of about three thousand disorders that arise from errors in an individual's DNA. Conditions such as sickle-cell anemia, muscular dystrophy, and cystic fibrosis result from the loss, mistaken insertion, or change of a gene in a DNA molecule. For years, scientists have sought to find a way to correct these deficient genes—a procedure known as human gene therapy (HGT).

Alone in Antarctica: Physician Heal Thyself

The National Science Foundation's Amundsen-Scott South Pole Station is located at the geographic South Pole in Antarctica. Composed of a number of structures, the station houses instruments used to monitor the upper and lower atmosphere and to conduct astronomy and astrophysics research. The station's winter personnel, some twenty-eight scientists and support crew, are isolated between mid-February and late October. Temperatures in mid-winter drop to one hundred degrees Fahrenheit below zero.

In March 1999, the only doctor at the station, forty-seven-year-old Jerri Nielsen, discovered a lump in her right breast. In June, after telling the rest of the crew at the station about her condition, she began corresponding with a breast cancer specialist at Indiana University via e-mail. Because of the severe winter weather, Nielsen could not be evacuated from the station. She and her colleagues at the station would have to perform the necessary diagnostic operation to determine if the lump were cancerous.

On July 11, in a dangerous and challenging mission, an Air Force C-141 Starlifter made a fifteen-hour round-trip flight from New Zealand and airdropped medical supplies for diagnosis and treatment, including chemotherapy drugs. Nielsen then directed a few crew members through the procedure of removing lump tissue from her affected breast. Digital microscopic images of the lump cells were then transmitted by video back to the cancer specialist in the United States. The specialist confirmed Nielsen's diagnosis of breast cancer, and the crew began to give her the chemotherapy drugs.

Although the tumor shrank at first, it soon began to grow. Nielsen's doctor urged that she be evacuated from the station as soon as possible. Along with another ailing member of the crew, she agreed to the daring rescue mission. On October 16, 1999, after several missions were delayed because of weather, a U.S. National Guard crew landed, picked up the pair, and dropped off a replacement physician. Back in the United States, Nielsen eventually underwent a mastectomy (surgical removal of her breast).

In May 1990, a research team at the National Institutes of Health (NIH) attempted HGT on a four-year-old girl suffering from a rare immune deficiency. The patient received about one billion cells containing a genetically engineered copy of the gene that her body lacked. In 1993, the NIH approved a procedure to introduce normal genes into the airways of cystic fibrosis patients. According to the NIH, by 1999 more than 390 gene therapy studies had been initiated, involving over four thousand people and more than a dozen medical conditions.

Breast Implants

Silicone gel implants—used to cosmetically enhance women's breasts—were developed in 1964. By the late 1990s, between 1.5 and 1.8 million American women had undergone breast implant surgery. Because these implants were developed prior to a 1976 law requiring Food and Drug Administration (FDA) approval, they did not have to undergo federal scientific testing. Women and their doctors assumed breast implants were safe. In May 1992, however, the FDA finally conducted hearings to determine implants' safety.

The FDA panel found that the implants had not been proven to be safe, but neither was there conclusive evidence that they were harmful. In April 1993, the FDA temporarily banned using silicone gel implants for cosmetic purposes while studies continued. Despite the inconclusive evidence and the ongoing research, the implant crisis soon made national headlines. Patients with complaints or problems they felt were caused by their breast implants appeared on magazine covers and on television talk shows. They also began to file lawsuits against implant makers.

Mounting scientific evidence that the implants did not cause cancer or other diseases did little to influence judges or juries. As a result, multimillion-dollar verdicts were returned against the implant manufacturers. The largest manufacturer, Dow Corning, decided to cut its losses and stopped making implants. Ultimately, Dow Corning agreed to settle the pending lawsuits in order to end the legal proceedings.

Despite such successes, most HGT experiments have produced largely disappointing results. In 1999, HGT research was dealt a severe blow when Jesse Gelsinger, an eighteen-year-old from Tucson, Arizona, died in an experiment at the University of Pennsylvania. The young man, who suffered from a rare genetic liver disorder, had volunteered for an experiment to test gene therapy for babies with a fatal form of that disease. His death led to demands for increased oversight of HGT research and forced researchers to defend their program publicly.

A scientific breakthrough of the 1990s even more startling than gene therapy was cloning. Cloning is the creation of a cell, group of cells, or an entire organism that contains the same genetic information as that of the parent cell or organism. Humans have utilized simple methods of plant cloning such as grafting and stem cutting for more than two thousand years. The first cloning of animal cells took place in 1964, and the first successful cloning of mammals was achieved nearly twenty years later. These experiments had one characteristic in common: they involved the use of embryonic cells, those at a very early stage of development. Biologists have always believed that such cells have the ability to adapt to new environments and are able to grow and develop in a cell other than the one from which they are taken. Adult cells, they thought, did not retain the same adaptability.

American Nobel Prize Winners in Physiology or Medicine

1990: Joseph E. Murray and E. Donnall Thomas
1991: No award given to an American
1992: Edmond H. Fischer and Edwin G. Krebs
1993: Phillip A. Sharp
1994: Alfred G. Gilman and Martin Rodbell
1995: Edward B. Lewis and Eric F. Wieschaus
1996: No award given to an American
1997: Stanley B. Prusiner
1998: Robert F. Furchgott, Louis J. Ignarro, and Ferid Murad
1999: Günter Blobel

A startling announcement in February 1997 showed an error in this line of reasoning. A team of Scottish researchers, led by embryologist Ian Wilmut (1945–), reported that they had cloned an adult mammal for the first time. The product of the experiment was a sheep named Dolly, seven months old at the time of the announcement. She differed from the subjects of previous cloning experiments in that she originated from an adult cell. A study of Dolly's genetic make-up showed that she was identical to the adult female sheep that supplied genetic material for the experiment.

Advances in the cloning process developed rapidly after Dolly's debut. Only a year and a half later, in July 1998, biologists from the University of Hawaii announced that they had cloned dozens of mice—even cloning some of the clones. What made the cloning of adult mice astounding was that mouse embryos develop soon after being fertilized. Due to this speedy embryonic development, scientists had thought a mouse would prove difficult or impossible to clone.

When scientists and others suggested the possibility of cloning humans, an active debate arose in the country about the morality of such an undertaking. In March 1997, President Bill Clinton (1946–) banned research into human cloning in all federally sponsored laboratories and asked that private researchers also comply with the ban. In January 1998, however, physicist Richard Seed announced his intention to attempt human cloning. Research into animal cloning continued. Despite the controversy and potential limitations, scientists argued that cloning could benefit society by preserving endangered species and advancing medical understanding of aging and diseases.
