The 1920s Medicine and Health: Topics in the News

COMMUNICABLE DISEASES: RIDDING THE WORLD OF DEADLY INFECTION
DIAGNOSIS TECHNOLOGY: GETTING TO THE ROOT OF DISEASE
HEALTH OF WOMEN AND CHILDREN: SIGNIFICANT REFORMS AND BACKLASH
INSULIN: COMMUTING A DEATH SENTENCE
NEW MEDICAL MACHINERY: THE IRON LUNG
PENICILLIN: THE ORIGINS OF A LIFESAVING DRUG
RORSCHACH TEST: HIDDEN PSYCHOLOGICAL MEANINGS
RURAL DISEASES: HOOKWORM, PELLAGRA, AND TULAREMIA
VITAMINS: ISOLATING SUBSTANCES ESSENTIAL TO HEALTH

COMMUNICABLE DISEASES: RIDDING THE WORLD OF DEADLY INFECTION

During the 1920s, individuals who contracted any of a host of diseases found their health seriously imperiled and their lives endangered. At the time, measles was a common childhood illness. Its symptoms include fever, sore throat, and skin rash. While the disease usually was not fatal if the child who contracted it received adequate care, large percentages of youngsters in foundling hospitals died of measles during the decade. There was also a very real danger that the disease would lead to blindness. One of the initial steps in finding a cure for measles was the identification and isolation of the microorganism (germ) that caused the disease. While this was not accomplished during the decade, a measles-fighting serum was developed from the blood of convalescing measles patients; it provided limited resistance to the disease.

Prior to 1923, scarlet fever was a menace to the health and well-being of people of all ages. Those who contracted this deadly contagious disease were in danger of suffering from blindness, deafness, heart and kidney ailments, and permanent paralysis. Usually a yellow flag and a printed notice were posted outside the home of an afflicted individual to warn potential visitors of the danger. A person with scarlet fever usually remained isolated for one month, after which the room in which he or she resided had to be fumigated. Clothing and dishes used by the infected individual had to be carefully cleaned with disinfectant.

Among the symptoms of scarlet fever are a red body rash and inflammation of the mouth, throat, and nose. Early in the decade, husband-and-wife researchers George (1881–1967) and Gladys (1881–1963) Dick isolated the germ that caused scarlet fever. Their effort led to the development in 1924 of a serum that effectively battled the disease. While this serum eliminated the potential of mass epidemics of scarlet fever, there still was no cure.

Tuberculosis was another lethal communicable disease common in the 1920s. It was caused by the presence of bacteria in the body; while it could ravage just about any organ or tissue, well over 90 percent of all tuberculosis cases were centered in the lungs. Before the decade began, researchers were able to identify carriers of the disease. Then in 1921, French microbiologist Albert Calmette (1863–1933) and veterinarian Camille Guerin (1872–1961) produced the first tuberculosis vaccine. This breakthrough resulted from their discovery that the body's immune system built up a resistance to the disease after being exposed to a mild tuberculosis infection. Calmette and Guerin's vaccine, known as the BCG (Bacillus Calmette-Guerin), first was used in Paris in 1922. By the end of the decade, it was employed throughout Europe and Asia. However, the medical community in England and the United States demanded more extensive testing. It was not until the 1950s that the vaccine was accepted worldwide.

Syphilis, a disease transmitted primarily through sexual contact, can be fatal if left untreated. Its complications include paralysis and mental derangement. In 1906, August von Wassermann (1866–1925), a German bacteriologist, devised the first syphilis test. It involved examining the blood of the infected individual to see if antibodies had formed within the body to fight the disease. Not only was the Wassermann test complicated and time-consuming, but its overall effectiveness was also questioned. The February 1923 issue of the American Journal of Public Health noted that the "many variable elements of this test give it numerous sources of error.…" Reuben Leon Kahn (1887–1979), a major in the U.S. Army Medical Service Corps (MSC), developed a blood test that was easier, quicker, and far more accurate. Kahn introduced his test in 1923; soon afterwards, his method was used regularly around the world.

DIAGNOSIS TECHNOLOGY: GETTING TO THE ROOT OF DISEASE

During the 1920s, a host of newly developed instruments enabled doctors to chart the functions of the body and diagnose illnesses more easily. Hans Berger (1873–1941), a German psychiatrist and scientist, was fascinated by the function of the brain and the relationship between the brain and the mind. Understanding this connection, he believed, would provide insight into mental functioning and disturbances. His attempts to discover a method of detecting and recording human brain waves resulted in his invention of the electroencephalograph, which employs a pair of electrodes (electric conductors) placed on the scalp to transmit the wave patterns to one of the instrument's several recording channels. Berger made his first unsuccessful attempt to record brain waves in 1920 by stimulating the cortex (the outer layer of the cerebrum and cerebellum, which are two parts of the brain) of individuals suffering from skull defects. He did so by applying electrical current to the skin covering the defect. Despite his failure, Berger carried on his work. Before the end of the decade, he had devised his electroencephalograph and successfully recorded the first electroencephalogram (more commonly known as an EEG). EEGs became a vital source of information for doctors examining patients who had suffered significant head injuries, cerebral infections, brain tumors, and illnesses related to the nervous system. For his work, Berger earned recognition as the "father of electroencephalography."

George N. Papanicolaou (1883–1962), a Greek physician who settled in the United States in 1913, specialized in cytology (the area of biology that centers on the structure, function, history, and production of cells within the body). Upon studying the vaginal discharges of female guinea pigs, Papanicolaou observed changes in the sizes and shapes of their cells. He associated these changes with alterations in the uterus and ovaries that occurred during the animals' reproductive cycle. He identified similar changes in the vaginal cells of women, and he noted abnormalities in the cells of those patients afflicted with cervical cancer. All this research resulted in his perfecting what came to be known as the Papanicolaou (or "Pap") smear: a procedure in which cells are subjected to a staining (coloring) technique that marks diseased tissue. The "Pap" smear proved an invaluable tool in early cancer detection. It enabled doctors to note the presence of cancer cells five to ten years prior to the appearance of symptoms. Besides cervical cancer, the test was employed in the diagnosis of colon, prostate, bladder, lung, breast, sinus, and kidney cancer. Papanicolaou began his research in 1920. He first announced his findings eight years later, in a paper titled "New Cancer Diagnosis." At the time, he noted that "a better understanding and more accurate analysis of the cancer problem is bound to result from use of this method." However, "Pap" smears were not widely accepted until the late 1940s.

Some medical devices had been developed in previous decades; during the 1920s, their inventors were honored for their work. In 1887, Augustus Waller (1856–1922), a lecturer at St. Mary's Medical School in London, demonstrated the possibility of measuring heart activity. In 1903, Willem Einthoven (1860–1927), a Dutch physiologist, expanded on Waller's work. He developed the first practical electrocardiograph, a machine that measured the electrical currents of the heart, allowing doctors to diagnose irregularities in heart action. Einthoven's first machine was a string galvanometer (a device in which the electrical current is detected and measured by means of a thin conductive thread). He continued perfecting the process; eventually, electrodes attached to wires were placed on the patient, whose heartbeat was recorded. The electrocardiograph became an important diagnostic resource for patients with heart ailments. For his work, Einthoven was awarded the Nobel Prize for physiology or medicine in 1924.

HEALTH OF WOMEN AND CHILDREN: SIGNIFICANT REFORMS AND BACKLASH

In 1921, approximately eighteen thousand American women died during childbirth. The previous year, 248,432 American children under the age of five had died. The death rate for infants in orphanages approached 100 percent.

These figures were startling and disturbing. The harsh reality that childbirth was a potentially deadly proposition and that youngsters were susceptible to a range of possibly fatal maladies resulted in swift government action. Child hygiene information was made available to parents. Public health and infant welfare services were established. Six national health groups united to form the American Child Health Organization. Its president was Herbert Hoover (1874–1964), who in 1928 was elected U.S. president. During the 1920s, Hoover helped raise funds to support health education in America's schools and supported immunization against diphtheria and smallpox. He even headed a fund drive that netted the American Red Cross $15 million in donations.

Repressive Child Care

In 1946, Dr. Benjamin Spock (1903–1998) published his Common Sense Book of Baby and Child Care, a landmark guide that became the preeminent source for sound advice on child rearing. The earlier decades of the century also had their own primary authority on babies: Dr. L. Emmett Holt (1855–1924), a professor of pediatrics at Columbia University. Holt first published his influential book, The Care and Feeding of Children, in 1894; through 1935, it was updated and republished fourteen times. His advice seems archaic when compared to Spock's. Parents today who followed it might even be accused of child neglect!

Holt theorized that mothers should not pick up crying babies because they would become spoiled if overly handled. He wrote, "Babies under six months old should never be played with; and the less of it at any time the better for the infant." He even recommended that babies wear mittens or have their hands fastened to their sides during sleep, to prevent sucking and masturbation. "In more obstinate cases," he suggested, "it may be necessary to confine the elbow by small pasteboard splints.…"

The Sheppard-Towner Maternity and Infancy Protection Act, passed by the U.S. Congress in 1921, was a milestone in health reform for women and children. Sheppard-Towner grew out of women's concerns for their children's health, and their very real fears that their offspring would die. The cause of high infant mortality rates during the 1920s was not so much a lack of scientific knowledge as a lack of education and available health services for women. The act made federal funding available for prenatal (prebirth) and infant health clinics and for educational material for pregnant women and mothers. The program was administered by the Children's Bureau, a division of the U.S. Department of Labor.

Initially, Sheppard-Towner was to remain in effect for five years. During this period, the infant mortality rate declined significantly, and the maternal death rate also moved downward. However, when the law came up for renewal in 1926, political conservatives denounced the program's female administrators as socialists. The American Medical Association (AMA) also lobbied for the repeal of Sheppard-Towner, because its members reportedly dreaded the competition presented by free health centers. Even though the act was supported by President Calvin Coolidge (1872–1933), it remained in place for only another two years before being repealed. Between 1928 and 1932, fourteen attempts were made to reverse the repeal. All of them failed.

INSULIN: COMMUTING A DEATH SENTENCE

Diabetes, a disease that often occurs in children, is an ailment in which the pancreas is unable to produce the proper level of insulin. Insulin is essential for muscle cells to utilize glucose (the form of sugar absorbed into the body). Each year, diabetes killed untold thousands. Before the 1920s, a diabetes diagnosis was considered the equivalent of a death sentence.

During the early 1900s, countless diabetes-related experiments were conducted. Finally, Frederick Grant Banting (1891–1941), a young Canadian physician, began experimenting with the assistance of Charles Best (1899–1978), a twenty-one-year-old medical student. The two worked at the University of Toronto, where John J.R. MacLeod (1876–1935), a professor of physiology and head of the school's physiology department, was an authority on diabetes research. MacLeod initially was skeptical about Banting and Best's desire to conduct research. In May 1921, after MacLeod went on holiday to his native Scotland, Banting and Best took over one of the university's laboratories. Within months, they were able to induce a diabetic coma in a dog by removing its pancreas. Then they restored the animal to health with a substance they had isolated from the pancreas of another dog. They labeled this substance the "X Factor." However, the first animal soon died, which led Banting and Best to determine that the "X Factor" needed to be injected into the diabetes sufferer each day. The two conducted further experiments, and eventually they named the "X Factor" insulin.

The following year, Banting and Best tested insulin on themselves to prove that it was not harmful. Then they injected it into a twelve-year-old boy who was dying of diabetes. He quickly recovered. Other sufferers came forward, and soon insulin was being administered to countless diabetics. While Banting and Best labored to determine how to produce insulin in mass quantities, MacLeod traveled around offering lectures on "his" discovery.

Banting's life had been touched by diabetes: when he was fifteen, both his best friend and his sweetheart succumbed to the disease. Best's favorite aunt had recently died of diabetes as well. As a result, both men were determined to find a cure. While insulin intake does not cure diabetes, it provides a way of effectively controlling the disease, and its discovery has prolonged the lives of millions. In 1923, Banting and MacLeod were awarded the Nobel Prize for physiology or medicine as "co-discoverers" of insulin. Banting believed that Best also should have been acknowledged, and he gave half of his monetary prize to his colleague. MacLeod then agreed to share half of his prize with James B. Collip (1892–1965), the chemist who had worked with them to purify the "X Factor."

NEW MEDICAL MACHINERY: THE IRON LUNG

During the 1920s, polio (poliomyelitis, or infantile paralysis) was still an infectious and deadly disease. Sufferers were afflicted with nerve cell destruction, muscle deterioration, and crippled limbs. The invention of the iron lung, a mechanical respirator, greatly eased the plight of polio victims and allowed many to stay alive indefinitely. The lung was devised by Philip Drinker (1894–1972) of Harvard University after he observed research into the development of artificial respiration techniques for patients who had just undergone surgery. Drinker and his colleagues experimented using paralyzed cats. The iron lung they developed consisted of a large, airtight metal tank into which all but the head of the patient was placed. The machine breathed for the patient by operating a set of bellows (an instrument which draws in and discharges air). The bellows alternately lowered and raised the air pressure inside the tank; acting in place of the human diaphragm, these pressure changes caused the patient's lungs to expand and contract. The iron lung was first used in 1928 on an eight-year-old polio sufferer who was experiencing respiratory paralysis. The machine kept her breathing for five days, until she died of other polio-related complications. Next it was used on a Harvard student afflicted with polio. The lung assisted him in his breathing for several weeks, and eventually he recovered.

For several decades, the iron lung remained an indispensable tool for saving the lives of polio patients. It was not until the development of polio vaccines in the 1950s that the disease was finally brought under control.

PENICILLIN: THE ORIGINS OF A LIFESAVING DRUG

Some health-related breakthroughs made during the 1920s laid the groundwork for future scientific innovations. One was the discovery of penicillin by Alexander Fleming (1881–1955), a Scottish bacteriologist and research scientist.

Medical Nonmiracles

In 1927, Morris Fishbein (1889–1976), editor of the Journal of the American Medical Association, published a book with a lengthy, unusual, and self-explanatory title: The New Medical Follies: An Encyclopedia of Cultism and Quackery in These United States, with Essays on The Cult of Beauty, The Craze of Reduction, Rejuvenation, Eclecticism, Bread and Dietary Fads, Physical Therapy, and a Forecast as to the Physician of the Future.

Fishbein wrote,

Among the hundred or more types of healing offered to the sophisticated is aerotherapy. Obviously aerotherapy means treatment by air, but in this instance hot air is particularly concerned. The patient is baked in a hot oven. Heat relieves pain and produces an increased flow of blood to the part heated.… Aerotherapy as one department of physical therapy becomes a cult when it is used to the exclusion of all other forms of healing. In New York a progressive quack established an institute equipped with special devices for pouring hot air over various portions of the body. He issued a beautiful brochure, illustrated with the likenesses of beautiful damsels in various states of negligee, smiling the smile of the satisfied.… In this document appeared incidentally the claim that hot air will cure anything from ague to zoster.…

Fishbein also observed,

One Dr. Fitzgerald of Hartford, Connecticut, has divided the body into zones, lengthwise and crosswise, and heals disease in one zone by pressing of others. To keep the pressure going he developed little wire springs. For instance, a toothache on the right side may be 'cured' by fastening a little spring around the second toe on the left foot. Naturally, Fitzgerald has never convinced any one with ordinary reasoning powers that there is anything to his system—except what he gets out of it.

Early in the decade, Fleming discovered lysozyme, an antibacterial substance found in such body fluids as mucus, saliva, and tears. Then in 1928, while organizing some petri dishes in which he had been growing bacteria, Fleming noticed that mold was growing on one of the dishes. He further observed that the mold had killed the bacteria. After taking a sample of the mold, he found that it was from the penicillium family. He named its active ingredient penicillin and first reported his discovery the following year. Fleming continued experimenting with the mold, while chemists began to grow and refine it. In 1950, penicillin was produced artificially for the first time. This breakthrough, directly linked to Fleming's 1928 discovery, resulted in the widespread employment of penicillin as an infection-fighting agent and the world's most effective lifesaving drug.

RORSCHACH TEST: HIDDEN PSYCHOLOGICAL MEANINGS

In 1921, Hermann Rorschach (1884–1922), a Swiss psychoanalyst, devised a test whereby patients undergoing mental evaluation would be shown a series of ten symmetrical ink blots. Upon observing them, patients described and interpreted what they "saw" within the patterns, details, and shadings. Their responses were believed to reveal aspects of their personality.

While psychiatrists previously had employed such tests as free-association exercises, Rorschach believed that a carefully devised examination could serve as a major component of a thorough psychological evaluation of a patient. According to Rorschach, descriptions and responses could be analyzed to determine the psychological processes at work within the patient's mind. This information would then allow the psychiatrist to diagnose specific clinical disorders.

During subsequent decades, scientists have questioned the value of the Rorschach test. However, it has been extensively used in many countries around the world.

RURAL DISEASES: HOOKWORM, PELLAGRA, AND TULAREMIA

A number of serious diseases primarily plagued residents of America's hinterlands. Before and during the early years of the twentieth century, many of those living in the rural South were impoverished and wore no shoes. As a result, they were susceptible to hookworm, a disease that flourished in warm climates. The worm that caused the malady hatched from eggs found in soil tainted by human excrement. After entering the body through the feet, the worm made its way into the victim's intestinal tract, where it lived off the victim's blood.

The hookworm-infected individual suffered from anemia and often died. Afflicted youngsters, if they survived, often emerged with physical and mental deficiencies. To combat hookworm, the Rockefeller Foundation organized a health commission to inform people about proper sanitation. Between 1910 and 1915, the foundation surveyed sixteen southern counties and determined that the rate of hookworm infection was 59.2 percent. By 1923, the rate had decreased to 23.9 percent. Nonetheless, hookworm remained a serious problem in America throughout the decade.

Pellagra was an often fatal malady with symptoms including skin rashes, stomach irregularities, and mental disorders. At the beginning of the 1910s, its cause and cure remained unknown. Joseph Goldberger (1874–1929), a U.S. Public Health Service (USPHS) researcher, was assigned to study the disease. Goldberger determined that pellagra was caused by improper nutrition and could be abated by adding protein and niacin to the diet. He also found that brewer's yeast was an effective remedy. In 1927, the USPHS determined that 170,000 Americans were afflicted with pellagra; from 1924 through 1928, the mortality rate was 58 percent. For six years, beginning in 1927, the American Red Cross handed out over two hundred thousand pounds of brewer's yeast in an effort to combat the disease.

Tularemia, a lesser-known affliction, was equally jarring to its sufferers. This communicable disease was caused by an organism that grew in the blood of infected rodents, and its symptoms included fever, body aches, headache, chills, vomiting, sore and swollen lymph glands, and a circular sore at the site of the infection. Those most at risk were people who spent large amounts of time outdoors; victims included hunters and butchers who had handled infected meat, as well as anyone bitten by a tick or deer fly that previously had bitten an infected animal. During the 1920s, treatment for tularemia consisted primarily of lengthy bed rest, lasting between two months and a year. In later decades, such antibiotics (chemical substances, produced by microorganisms, that halt the growth of or completely destroy bacteria) as streptomycin, gentamicin, and tobramycin were used to combat the disease.

VITAMINS: ISOLATING SUBSTANCES ESSENTIAL TO HEALTH

Before the decade began, Dutch physician and pathologist Christiaan Eijkman (1858–1930) and British biochemist Frederick Hopkins (1861–1947) discovered that important nutrients are required to maintain good health. These nutrients became known as vitamins. For their discovery, Eijkman and Hopkins shared the Nobel Prize for physiology or medicine in 1929. Meanwhile, many researchers studied vitamins during the 1920s, and several new vitamins were discovered during the decade.

In 1922, Elmer McCollum (1879–1967) reported the discovery of vitamin D, a fat-soluble vitamin found in milk, egg yolks, and fish-liver oils. McCollum's research proved that vitamin D and sunlight were useful in combating rickets, a disease caused by a calcium deficiency that results in insufficient skeletal growth and deformed bones in children. Harry Steenbock (1886–1967) furthered McCollum's work by determining that exposure to sunlight converted chemicals found in food into vitamin D. Also in 1922, Herbert McLean Evans (1882–1971) and Katharine Scott Bishop (1889–1975) announced the discovery of vitamin E, a fat-soluble vitamin whose deficiency results in muscle and vascular abnormalities and infertility. In 1926, B.C.P. Jansen (1884–?) and W. F. Donath reported the isolation of vitamin B1 (thiamine), utilized by the body to convert amino acids, fats, and carbohydrates into energy. In 1928, Albert Szent-Gyorgyi (1893–1986) reported the isolation of vitamin C (ascorbic acid), a water-soluble vitamin found in leafy vegetables and fruits. This discovery resulted in the eradication of scurvy, a disease whose symptoms included skin discoloration, anemia, and tooth loss. Because no fresh fruits and vegetables were available on long sea voyages, scurvy was common among sailors. Finally, in 1929, Carl Dam (1895–1976) reported the discovery of vitamin K, a fat-soluble vitamin necessary for the clotting of blood.

Szent-Gyorgyi, a Hungarian-born chemist and biologist, earned the Nobel Prize in physiology or medicine in 1937 for isolating vitamin C. He accomplished this feat while experimenting on cell respiration, in the course of which he separated a chemical from lemon- and orange-producing plants. He named this substance ascorbic acid, or vitamin C. Referring to the process by which researchers employ trial and error to achieve medical breakthroughs, Szent-Gyorgyi is credited with two significant observations: "A discovery is said to be an accident meeting a prepared mind" and "Discovery is seeing what everybody else has seen and thinking what nobody else has thought."
