The 1940s Medicine and Health: Topics in the News


AMERICA GOES TO WAR AGAINST DISEASE
MEDICAL SCHOOLS REJECT MINORITY STUDENTS
PRESIDENT TRUMAN TAKES ON THE AMA
"MAGIC BULLETS" USED TO FIGHT INFECTIOUS DISEASES
WAR ADVANCES MEDICINE
STEPS TAKEN TOWARD A POLIO VACCINE
MENTAL PATIENTS SHOCKED INTO SANITY

AMERICA GOES TO WAR AGAINST DISEASE

An epidemic is defined as the rapid spread of a disease through a population. Epidemics are very dangerous because they affect large numbers of people and give medical and governmental authorities very little time to react. Once an epidemic has begun, it is often very difficult to bring under control. In the 1940s, several diseases threatened to become epidemics, including influenza, polio, malaria, typhus, dengue fever, and yellow fever. When American military personnel returned from abroad, many of them brought back contagious illnesses such as typhus and malaria, putting their colleagues and fellow citizens at risk.

The Office of Malaria Control in War Areas (MCWA) was an emergency organization based in Atlanta, Georgia, during World War II (1939–45). Although originally set up to tackle malaria, a serious threat in the Atlanta area, in 1943 it expanded to deal with dengue fever in Hawaii and yellow fever in the southeastern states. In 1945, the MCWA used specially trained teams of scientists to identify the main causes of infectious diseases. In 1946, it was reorganized for peacetime by U.S. Surgeon General Thomas Parran (1892–1968) and renamed the Communicable Disease Center. Later known as the Centers for Disease Control and Prevention (CDC), the agency works to manage and control diseases spread from person to person, from animal to person, and from the environment. One of the most powerful weapons it had against disease during the 1940s was the insecticide DDT.

DDT (or dichlorodiphenyltrichloroethane) was first made in 1874, and its ability to kill insects was discovered in the 1930s. One advantage of DDT as an insecticide was that it stayed active for many months after application. DDT sprayed on a mattress would remain toxic to bed bugs for almost a year. Clothing dusted with the chemical stayed free of lice for up to a month, even when it was washed regularly. This was especially important for military personnel, since many soldiers caught typhus from lice. DDT also controlled the spread of malaria. Just a few ounces of DDT dropped in a swamp killed all the mosquito larvae, thereby stopping the spread of malaria by mosquito bite.

Around 350,000 pounds of DDT were manufactured every month in the late 1940s. The insecticide was used in the home, on the battlefield, and on farm crops to protect them from worms, moths, and aphids. Scientists insisted DDT was safe for humans as long as they did not eat it, and it would be twenty years before its negative effects were widely recognized. The most important of these was the way it killed insects indiscriminately, leaving no food for birds, small mammals, and other important creatures. Widespread spraying devastated wildlife, threatening the "silent spring" later described by environmental activist and author Rachel Carson.

Federal support for disease control expanded rapidly in the 1940s. Apart from advising on where chemicals such as DDT should be used, the CDC was responsible for much of the research into the spread of disease. Not only did its scientists work on the biology of diseases, but they also plotted epidemics using statistics. In this way, they learned a great deal about the way epidemics worked and how to control them.

MEDICAL SCHOOLS REJECT MINORITY STUDENTS

As the 1940s began, there was already a severe shortage of doctors in the United States. When America joined the war in December 1941, many younger doctors enlisted and were sent abroad. In a period of just three years, the patient-to-doctor ratio more than doubled, to 1,700 to 1. Yet despite severe physician shortages, well-qualified members of ethnic minority groups such as Jews, Italians, and African Americans found it difficult to secure a place at medical schools.

One reason given for this difficulty was overcrowding. There were eight applicants for every freshman vacancy. But minority students found they were more likely to be passed over for admission than white students. It is estimated that between 35 and 50 percent of applicants to medical schools in the 1940s were Jewish. Yet only one in thirteen Jewish applicants was accepted. For African Americans the situation was even more desperate. One-third of the seventy-eight approved medical schools were in the southern and border states, and all twenty-six of those colleges were closed to black students. There were just two black medical schools in the 1940s: Howard and Meharry. The remaining schools ran hidden quota systems for minority applicants, admitting only a small fixed number of black, Italian, Catholic, and women students.

Medical Fads of the 1940s

Advances in medicine in the 1940s gave many doctors the hope that, in the long run, most human ailments would be curable. But they often found problems where none existed. In New York in 1945, for example, 61 percent of eleven-year-olds had already had their tonsils removed, and doctors believed that around half of the rest would benefit from a preventive tonsillectomy (an operation to remove the tonsils). By the end of the century, conventional medical practice was not to remove tonsils except in extreme cases of infection. Other minor conditions diagnosed as serious ailments during the 1940s included flat feet, crooked teeth, poor posture, and heart murmurs. None of these are considered serious problems today.

One way around discrimination in U.S. medical schools was to study abroad. Before the war, many affluent Jewish and African American students studied medicine in England, Scotland, and elsewhere. After 1945, however, the National Board of Medical Examiners would not allow graduates of foreign medical schools to take its examinations to practice medicine in the United States. Only English schools were exempt. This restriction not only stopped many Americans from studying abroad, but it also prevented immigrant doctors from practicing medicine. Despite the shortage of homegrown doctors and the large numbers of European immigrants, there were almost no foreign-born doctors on hospital staffs in the 1940s. Yet medical schools denied that they were racist. They claimed their quota systems were based on geographical and income distribution rather than on race. After fighting the Nazis, many Americans wanted to see racism removed from daily life. Even so, official and unofficial quota systems would remain part of American medicine for many years to come.

PRESIDENT TRUMAN TAKES ON THE AMA

The issue of whether or not the United States should have a compulsory health insurance system was brought to a head in the late 1940s. Many European nations, including Germany, had set up national health insurance systems as much as fifty years earlier. American attempts to create some form of national health insurance plan first began in 1915, when the American Association for Labor Legislation proposed medical protection for workers and their families. From the start of his presidency in 1933, Franklin D. Roosevelt was keen to develop a national health-care system. In 1945, a few months after the war ended, his successor, Harry S. Truman, asked Congress to pass a national medical care program.

Truman's health-care bill went before Congress on November 19, 1945. The plan included federal funding to build new hospitals, expand public health and disease-prevention measures, and create more medical schools, along with compulsory health insurance and disability insurance. The American Medical Association (AMA) could not argue with the need for more medical schools or additional public health measures. But it was deeply opposed to compulsory insurance. The AMA was especially unhappy because the insurance Truman had in mind would cover all classes of society, not just the needy. This would remove the freedom of doctors to set their own fees.

Despite a lukewarm response from Congress, the American public favored the proposed system at first. It would remove much of the "economic fear" associated with being sick. But the AMA and Republicans in Congress fought hard against Truman's plans. Senator Robert Taft (1889–1953) of Ohio attacked the bill as creeping socialism, saying: "It is to my mind the most socialistic measure this Congress has ever had before it." The Republicans took control of Congress in 1946, and Truman's bill foundered. But after his surprise presidential election victory in 1948, Truman kept up the pressure. The AMA spent $1.5 million fighting national health insurance in the 1940s, at the time the most expensive political lobbying campaign in history.

In the end, the AMA succeeded in killing off Truman's plans for national health insurance. It did so by appealing to the nation in the language of the cold war. National health insurance, the AMA argued, was the first step toward becoming a socialist state like the Soviet Union. In the tense political atmosphere of the late 1940s, many Americans were convinced by the AMA's warnings. Public support disappeared, and the idea of a national health-insurance program was shelved. It was defeated by the wealth and power of the AMA, which was supported by businesses trying to avoid the extra costs mandatory health insurance would impose on them. The episode was an example of the way big business and powerful lobbying groups could directly influence American government policy.

"MAGIC BULLETS" USED TO FIGHT INFECTIOUS DISEASES

As in other areas of science and technology, World War II sped up discoveries in medicine. Many breakthroughs in drug treatments for infectious diseases had been made in the 1930s, but it was in the 1940s that those medicines came into common use. Antibiotics, drugs that combat bacterial infections, were widely used in the 1940s. They saved many lives on the battlefields of Asia and Europe, and they revolutionized the treatment of illness in the second half of the twentieth century. Such drug treatments were known as "magic bullets" because they could wipe out many different infections.

The first real breakthrough in the war against infectious disease came in 1932. In that year, German chemist Gerhard Domagk (1895–1964) discovered that a red dye known as Prontosil protected mice against streptococcal infections. In 1936, the active ingredient in the dye, sulfanilamide, was identified. A whole series of "sulfa" drugs were developed as war broke out in Europe. By the early 1940s, these sulfa drugs were in use around the world. U.S. and Australian troops used the sulfa drug sulfaguanidine to treat dysentery contracted by soldiers in New Guinea. Their common foe, the Japanese army, suffered from the same illness but, lacking an effective treatment, was severely weakened by it. Sulfa drugs also could treat streptococcal infections, pneumonia, gonorrhea, meningitis, and many other diseases.

Sulfa drugs quickly became part of every physician's prescription arsenal. But other drugs were also becoming available. Although Scottish scientist Alexander Fleming (1881–1955) first discovered penicillin in 1928, drug companies did not take an interest until 1941. That year, American scientists discovered a faster-growing strain of the penicillin mold on a rotten cantaloupe melon, and penicillin soon went into industrial production. By 1942, there was enough penicillin in the world to treat around one hundred patients. By 1943, with the U.S. government in charge of penicillin production, the United States was able to supply the entire needs of the Allied armed forces. Penicillin was especially useful to the military. Before antibiotics like penicillin came along, many soldiers with minor wounds died of infections. The same was true after surgery, in both military and civilian medicine: many patients who survived operations died a few days later from infected wounds. Penicillin went into civilian use in 1945.

Other antibiotics besides penicillin were also discovered or developed in the 1940s. Selman A. Waksman (1888–1973), who studied tiny organisms living in soil, discovered actinomycin, an antibiotic taken from a soil microbe. Actinomycin turned out to be toxic to humans, but Waksman's research also led to the discovery of streptomycin, an antibiotic effective against bacteria untouched by penicillin. Streptomycin removed the threat of killer diseases such as tuberculosis. Soil scientists also discovered chloromycetin, an antibiotic that cures a whole range of diseases, including typhus. It was developed from a substance found in soil in Venezuela and went into general use in 1949.

Allergy Relief

In 1946, it was estimated that up to 20 million Americans suffered allergic reactions of one kind or another. There was no relief for symptoms such as itchiness, swelling, sneezing, and skin rashes. Sufferers simply had to avoid whatever it was that affected them. Then scientists pinpointed histamine, the chemical in the body that triggers allergic reactions, as the culprit, and they soon produced a drug to block its effects. The new antihistamine drug was effective in 85 percent of hay fever sufferers and in 50 percent of allergic asthma patients. The chemical name of the antihistamine was beta-dimethylaminoethyl benzhydryl ether hydrochloride. Fortunately, someone came up with a shorter name: Benadryl.

These so-called magic bullets changed medicine dramatically. Within a few years, diseases that were thought to be untreatable could be cured. Though some antibiotics had toxic effects, they increased people's expectation of a long and healthy life. But the magic bullets were not perfect; even in the 1940s, there were signs that some disease-causing bacteria were becoming resistant to sulfa drugs. And doctors slowly learned that patients can be left even more vulnerable to disease if they do not complete a prescribed course of antibiotics. Since the 1940s, many antibiotics, including penicillin, have lost their power against certain strains of bacteria.

WAR ADVANCES MEDICINE

During the 1930s, President Franklin D. Roosevelt offered federal support for health care through his New Deal policies. But World War II had a much greater impact on the development of medicine in the United States. Increased government funding for research brought advances in drug therapies, surgery, and disease prevention. The federal government even helped with the manufacture and distribution of new medicines. The problems faced by soldiers on the front lines led to improvements in psychiatry, as well as in the treatment of physical injuries, fatigue, and exposure. Military personnel were also much more likely to suffer from diseases such as influenza, pneumonia, dysentery, gangrene, and venereal (sexually transmitted) diseases. The pressures of war greatly increased the speed and success of medical research in those areas.

Common Killers

During World War II, 325,000 Americans were killed by enemy action. But in 1945 the U.S. Census Bureau reported that between 1942 and 1944, around 500,000 people were killed by cancer. In the first half of the decade, 2 million Americans died from diseases of the heart and blood vessels. Heart disease was the number-one killer illness in the 1940s. Yet Americans did not rate it as a major threat. In April 1940, a Gallup poll showed that the four diseases Americans believed posed the biggest risk to public health were syphilis (46 percent), cancer (29 percent), tuberculosis (16 percent), and infantile paralysis, or polio (9 percent).

But although drug treatments and medical knowledge advanced during the war years, World War II caused problems for civilian medicine. The peacetime Army Medical Corps numbered around twelve hundred medical personnel. With a wartime army of eight million soldiers, the Medical Corps had to be expanded. At its peak, it included forty-six thousand medical staff members working in fifty-two general hospital units and twenty evacuation hospital units. Most of the doctors and trained medical staff were taken from civilian hospitals, triggering a crisis in medical care at home. In an effort to solve the shortage of doctors, medical schools cut the length of their programs from four years to three. Although the three-year courses ran year-round rather than as four nine-month academic years, so students received nearly the same amount of instruction, many doctors complained of falling quality. The Committee on Medical Research of the Office of Scientific Research and Development declared that the intensive course led to "surface learning" and poor discipline among young doctors.

The war also changed the way physicians thought about their profession and brought new branches of medicine into the mainstream. In civilian life, most doctors had worked on their own or in small private practices. Wartime experience taught them the value of group practice, which allowed individual physicians to develop special interests and skills that could benefit patients. But having experienced army control firsthand, many physicians resisted giving up their independence in civilian life. During the late 1940s, physicians and the federal government struggled for control of American medicine.

STEPS TAKEN TOWARD A POLIO VACCINE

One of the most feared diseases of the 1940s was poliomyelitis (also known as infantile paralysis or polio). Parents were terrified when their children complained of headaches, fever, or sore throats, for these were all symptoms of polio. Although in most cases the infection cleared up completely in a few days, a significant number of children, and some adults, were permanently affected. Of those whose nervous system was invaded by the disease, 25 percent suffered mild disability. Another 25 percent became severely disabled, with paralyzed arms and legs. In some cases, the disease paralyzed muscles in the throat and chest, stopping the patient from breathing. More children died from polio during the 1940s than from any other infectious disease. Epidemics broke out in the summer months, when children spread polio among themselves as they played together in swimming pools and around playgrounds. When a local polio epidemic broke out, parents kept their children indoors, away from the risk of infection.

Many scientists believed that a lack of exposure to the disease was causing the large outbreaks. Improved sanitation since the end of World War I meant that children could avoid contact with polio until they were old enough to go out on their own. By then, their natural resistance to the disease had become weak. This theory was supported by the fact that the age of polio victims was rising. In 1949, the U.S. Public Health Service published figures showing that, in 1916, 95 percent of polio victims were under the age of ten. In 1947, only 52 percent were below that age. Like President Franklin D. Roosevelt, whose legs were paralyzed by the disease at the age of thirty-nine, an increasing number of those afflicted were adults. This shift in the age distribution caused an unexpected problem for the U.S. Army. Many American soldiers had been brought up in well-kept, modern homes. When they were exposed to the polio virus in the filthy conditions of the battlefield, they caught the disease easily.

After 1945, the number of polio victims continued to rise. By 1949, there were over thirty thousand cases reported each year. Although a 1943 study had shown that gamma globulin (a substance taken from blood) protected monkeys from polio infection, it had not yet been possible to test the technique on humans. With no known cure, polio patients had their paralyzed limbs supported with mechanical braces. If necessary, they were kept breathing with an "iron lung" machine. Elizabeth Kenny (1886–1952) treated patients by massaging their paralyzed limbs and had some success. But the medical profession was unsure of her methods, and they were not widely used. The only way to be sure of not contracting polio was to stay away from any known victims and to avoid swimming pools and other public places. In other words, there was no sure way to avoid it.

Although polio was still incurable in the 1940s, scientists were gradually winning the race to find a vaccine. John F. Enders (1897–1985), a virologist (virus expert) at Harvard University, discovered a way to produce large quantities of the polio virus quickly in the laboratory. Vaccines are based on a dead or weakened strain of a virus, so large amounts are needed for mass vaccination programs to succeed. Enders's discovery made possible the polio vaccines developed by Jonas Salk (1915–1995) and Albert Sabin (1906–1993) in the 1950s. In 1954, Enders shared the Nobel Prize in Physiology or Medicine for this work.

MENTAL PATIENTS SHOCKED INTO SANITY

Methods of treating mental illness during the 1940s were limited. There were few drug therapies that allowed patients to live normal lives. Severe cases of depression, mania, and schizophrenia were almost untreatable. But in the late 1930s, the idea of shock therapy using drugs or electricity became popular. Early shock therapies were carried out using drugs such as insulin, camphor, or Metrazol. Injections of insulin sent patients into a deep coma, while Metrazol caused convulsions and spasms. Patients suffering from schizophrenia seemed to respond best to these treatments. Electric shock therapy, however, was cheaper, more easily controlled, and less dangerous than drug therapies. First developed in Italy in 1938, electroconvulsive therapy (ECT), as it was known, soon became the most common treatment for severe mental illness. In 1944, the New York Academy of Medicine produced a study claiming that an eight-week course of electric shock therapy could replace many years of mental hospital care.

An even more drastic treatment for mental illness was psychosurgery, first practiced in the United States in 1936 by Walter Freeman (1895–1972). By the 1940s it was an accepted treatment technique. Psychosurgery involved cutting away parts of the brain that were thought to be malfunctioning. In 1941, two hundred Americans had brain surgery for conditions such as depression, suicidal tendencies, and violent behavior. Surgeons used a long, hollow needle to disconnect the front parts of the brain (known as the prefrontal lobes) from the rest. By disconnecting them, doctors hoped to make patients calmer and easier to manage.

Doctors reported that patients regained their emotional responses after a long recovery period. But they were also left confused, withdrawn, and unable to cope with social situations. In the second half of the decade, it became clear that psychosurgery was being used primarily as a way to control difficult and dangerous patients. The operation could not be reversed, and many people thought the "cure" was worse than the disease. Even so, in the twenty-first century psychosurgery is still used in certain extreme cases.

Beyond the headline-grabbing stories of electric shocks and psychosurgery, the specialty of psychiatry took major steps forward during the 1940s. For the first time, the mental health of Americans came under scrutiny during World War II. Medical examinations of men drafted into the military showed a surprising number with mental or neurological (brain-related) illnesses. More than one million recruits were rejected on those grounds. The horror of war caused mental disorders in many thousands of serving soldiers, and around 850,000 men were treated for mental problems caused by the war. The 25 psychiatric specialists assigned to the Army in 1940 had grown to 2,400 by 1944.

Treating Depression

In the 1940s, clinical depression, or "melancholia," was the ailment most commonly treated using electric shocks, known as electroconvulsive therapy (ECT). The patient would be strapped to a hospital table with electrodes attached to the sides of his or her head. Patients were usually sedated because spasms caused by the shock could injure neck muscles. Between seventy and one hundred volts were applied to the head for one-tenth of a second. The patient was knocked unconscious by the shock but usually revived within a few minutes. The treatment was repeated three times a day for up to eight weeks. ECT was not thought to be dangerous, but it caused memory loss and confusion, and it sometimes left patients with a sore neck. In the twenty-first century, ECT is still used for the treatment of severe depression, but it remains very controversial.

When the war ended, such statistics were used to persuade the government and medical organizations that psychiatry was a medical specialty worthy of support. In 1946, Congress passed the National Mental Health Act. The act put in place education programs for thousands of mental health professionals. It began a trend in psychiatry toward the prevention of mental illness. In the 1940s, for the first time, mental patients were treated as people who could be helped.
