Medical Practice in the Military

Military medicine in the United States has both led and followed overall American medical practice. It has been responsible for some of the most dramatic worldwide advances in health care science; at the same time, it has greatly benefited from the civilian sector's progress. Not surprisingly, military medicine has been in the forefront of mass casualty treatment and trauma care. On occasion, U.S. military surgeons and physicians have also extended the capabilities of international science in diagnosing and defeating some of humanity's most powerful killers, such as malaria. Military medicine has also made important contributions in the field of prevention.

Colonial and Revolutionary War Practice.

For the first three centuries of American history, New World military medical practices differed little from, and often trailed, European practices. In the late eighteenth century, however, at the time of the Revolutionary War, the social standing of American medical doctors began to rise in comparison to their Old World counterparts.

Before the Revolution, following British practice, American colonial militia organizations usually provided for a surgeon to accompany a regimental‐size force on campaign. In this setting, with an officer class that was heavily oriented to an aristocratic hierarchy, the decidedly middle‐class medical profession was in the lower reaches of influence. Additionally, although there was considerable knowledge of anatomy, a doctor's ability to heal was extremely limited. Surgeons were not commissioned. They were regarded as contract personnel, necessary for military operations, but little more than the tradesmen and teamsters who also accompanied a military column. Military doctors were expected to be both physicians (healing primarily by medicine) and surgeons (healing by manual or instrumental operations). In the English hierarchy, the doctor was only one step above a barber.

Several prewar militia surgeons participated in the opening years of the Revolutionary War. John Warren of Boston, who had studied medicine as an apprentice, served with Massachusetts units besieging the British in Boston in 1776 and performed smallpox inoculations. He accompanied Gen. Nathanael Greene's column when the war moved to Long Island, New York, and was appointed surgeon to the Continental army's hospital in Boston.

During the Revolutionary War, American military medical practice did not differ from European practice in the treatment of wounds and diseases. Such treatment had remained essentially unchanged for two centuries and would see little improvement in the first fifty years of U.S. history. University‐trained physicians were rarely found in British or British colonial military organizations. A military surgeon was likely to be only modestly qualified, having simply taken the title after a period understudying another doctor. Bleeding, based on the theory that purging rid the system of impurities, and the use of mineral drugs—especially heavy metal compounds of mercury, antimony, and arsenic—were common practices. Opium was occasionally used, but alcohol was the more common analgesic. Literature on the subject of military medicine was largely limited to Richard Brocklesby's Economical and Medical Observations on Military Hospitals (1764) and to Philadelphian Benjamin Rush's Directions for Preserving the Health of Soldiers, published during the Revolution. Those claiming to be doctors were not necessarily familiar with even the most fundamental medical texts; the lack of licensing standards allowed almost anyone to claim expertise in healing.

British physicians went mostly unnoticed during the war, but Rush and four other American medical doctors were signers of the Declaration of Independence. These men and others used their prominence successfully to impress on the Congress and Gen. George Washington the need to create hospitals, stockpile medical supplies, and institute smallpox inoculation for U.S. military forces. Smallpox inoculation—considered a novel and advanced practice for the period—is credited with preserving the Continental army at a critical juncture. Revolutionary War military medicine left its mark on the profession when the Boston Army Hospital course on surgical anatomy that John Warren (1753–1815) taught to young physicians provided the basis for Harvard College's new medical education department. The absence of a rigid social hierarchy in America helped elevate surgeons and physicians in the New World, giving their advice more weight in the young nation's governing circles than it would have carried in an English setting.

American naval medical practice differed substantially from the army's. During the War for Independence, naval health care was limited to contract surgeons who signed on to a ship for a single cruise. This practice continued after the war, but in 1801 Congress authorized half‐pay for naval surgeons between cruises, thereby ensuring stability in the navy's medical ranks. The most prominent U.S. naval physician of this era was Edward Cutbush, a militia surgeon during the 1794 Whiskey Rebellion, who served aboard the frigate United States in 1799. Cutbush produced a widely used text on naval medicine in 1808 that stressed the importance of hygiene and proper diet.

Pre–Civil War Practice.

The pre–Civil War era was marked by badly needed changes in military medical organization and pay, along with welcome advances in healing. In 1818, Secretary of War John C. Calhoun established a permanent medical department to administer health care for the army. His choice to head the department was Dr. Joseph Lovell, an energetic physician who began the systematic collection of medical data and the standardization of entrance examinations for aspiring military surgeons. In 1834, Lovell finally persuaded Congress to tie the pay of army surgeons to a major's salary. During the Seminole Wars in Florida, army physicians conducted experiments on malaria victims and, contrary to prevailing practice, discovered that large doses of quinine were effective in saving lives. The army's treatment quickly entered civilian practice, and this powerful age‐old killer began to be tamed. In 1847, Congress gave army medical officers commissioned rank; for the first time, they began to use military titles. Later, the navy followed suit.

While American military medicine was improving in organization, system, and prestige, it was making little headway against the appalling loss of life in military campaigns, except in the case of malaria. During the Mexican War, the approximately 100,000‐strong U.S. expeditionary forces lost about 1,500 men in battle but more than 10,000 to disease. Much of this loss was due to unsanitary conditions in the camps, shallow latrines, ill‐sited drinking water and wash areas, and the lack of sufficient ambulance wagons. Uncovered body wastes in open latrines promoted the spread of disease by flies, while cooks handling food with dirty hands and wash areas sited upstream of drinking water were common lapses that contributed to long sick lists. These errors and unnecessary losses, chiefly due to the chaos caused by the rapid mobilization of an untrained, inexperienced officer corps, were repeated at the beginning of the Civil War.

The Civil War and Post–Civil War Eras.

Shocked by the large numbers of camp deaths during 1861 and the early months of 1862, regular Union army surgeons moved quickly to preserve lives. The Army of the Potomac's medical directors, Charles Tripler and Jonathan Letterman, created a large ambulance service to evacuate the sick and wounded, ordered vaccination against smallpox, and supervised quinine prophylaxis for malaria. These two officers also established a network of supporting hospitals and impressed on Northern officers their duty to stress sound field sanitation practices constantly. By the Battle of Fredericksburg in December 1862, Federal forces enjoyed a decided medical advantage over their Confederate opponents, who were less well equipped and less well staffed medically. This advantage contributed to the all‐important battlefield numerical superiority of Union forces.

Neither Confederate nor Union surgeons were capable of reducing the chances of death from gunshot wounds. Penetrating wounds by minié balls were usually fatal: chest penetrations carried a mortality rate of more than 62 percent, and only 11 percent of soldiers who received a stomach wound survived.

Although the post–Civil War era, 1866–98, saw little serious American military action, it was a period of progressive change and innovation that produced the golden age of American military medicine. Army and navy surgeons embraced the best of European medical science: Louis Pasteur's germ theory of disease during the late 1860s and Joseph Lister's techniques of antiseptic surgery. Physicians in both services increasingly found themselves giving care to service members' wives and children. This change corresponded with growth in the scope and frequency of medical care for the general population of the United States. In 1884, Congress formally authorized what had been common practice for some time, health care for military dependents. Additionally, the American military, along with the rest of American society, learned about the purification of water supplies and the sanitary control of sewage.

The foundation for U.S. military medicine's claim to nineteenth‐century world renown was laid by a former Civil War surgeon, George Miller Sternberg. Sternberg, who survived yellow fever in 1875, was detailed after his recovery to the National Board of Health's Havana Yellow Fever Commission in 1879, and a few years later began working at Johns Hopkins University in conjunction with an army assignment. He traveled to Europe, learned the best bacteriological science of the time, and published a book on malaria, followed in 1892 by the first American textbook on bacteriology. Established as the premier bacteriologist in the United States, Sternberg was appointed surgeon general of the U.S. Army by President Grover Cleveland in 1893. Using his authority and prestige, Sternberg convinced Congress to found the Army Medical School and recruited such promising young medical officers as Capt. Walter Reed.

The Spanish‐American War of 1898 provided the opportunity for Sternberg, Reed, and others to use their knowledge in the fight against disease. As in the Mexican and Civil Wars, “camp fevers” were rife, especially in the southern U.S. mobilization centers. Of the 6,400 men who died between 1 May 1898 and 30 April 1899, fully 5,400, or 84 percent, died of disease. The “fevers” were variously diagnosed as typhoid, malaria, yellow fever, and typhomalaria. Reed, using microscopic examination of blood smears, discovered that the chief culprit was typhoid, a disease that could be halted by well‐known camp sanitation practices. Unfortunately, the rapidly mobilized, mostly volunteer force had a predominantly politically appointed officer corps that was almost wholly ignorant of military affairs and proper field sanitation.

Twentieth‐Century Medical Practice.

After the war, the United States possessed a tropical colonial empire and wanted to build a canal linking the Pacific and Atlantic Oceans in Central America. It therefore had great need of experienced military physicians—for both research and teaching. Reed headed the Yellow Fever Commission in 1900–01, which proved beyond doubt that the previously proposed mosquito transmission theory for the disease was correct. Another army surgeon, William C. Gorgas, quickly used his authority as sanitation officer in Havana, Cuba, to eliminate mosquito breeding places, producing a dramatic decline in that city's usual yellow fever and malaria sickness and mortality rates. Sent to Panama for the canal project, Gorgas brought the malaria morbidity rate down by 90 percent in 1913. The Panama Canal was made possible by the work of Reed, Gorgas, and their fellow army medical workers.

The U.S. Navy began improving its medical practices during the Spanish‐American War and enhanced the prestige of its medical personnel shortly thereafter. A medical corps had been created in 1898, and while fighting was still in progress several merchantmen (commercial cargo ships) were converted into hospital ships. The next year, medical officers were given commissioned rank. Later, medical officers were given command authority over the hospital ships and crews, a controversial decision that was ultimately resolved in favor of the nautical surgeons by the commander in chief, Theodore Roosevelt.

In 1908, the army created the Medical Reserve Corps, an augmentation organization that was separate from the National Guard. A veterinary corps was added to the medical department. Not only were veterinarians highly useful in promoting the health and utility of horses and mules, they were essential in the inspection of meat for troop consumption. Provisions were made with the American Red Cross to supply nurses in time of emergency. The prestige of army doctors rose when Gen. Leonard Wood, a Harvard Medical School graduate and former army physician, became chief of staff of the army in 1910.

These innovations and changes were needed when the United States entered World War I in April 1917, joining the Allies against the Central Powers. Some of the first medical problems faced by the rapidly expanding military medical organization centered on combat aviation. Early in U.S. operations, it was discovered that 300 percent more pilots were dying from accidents than from enemy action. Army medical officers learned that aviators were flying to the point of exhaustion. The position of flight surgeon was created, and these specialists impressed on commanders the need for sufficient rest between missions. Chemical warfare required the creation of mobile degassing units—organizations that operated under medical supervision and provided showers and new clothing for units that had been exposed to chemical weapons. With the aid of British researchers, the U.S. Army and Navy had adopted a typhoid fever vaccine in 1911, a practice that saved the lives of large numbers of American youth; the typhoid death rate in 1917–18 was 185 times lower than it had been in 1898. Additionally, an antitetanus serum introduced at the turn of the century greatly reduced the incidence of wounded men succumbing to lockjaw. In France, the Allies instituted a disciplined triage system in which casualties were sorted for life‐saving priority treatment according to their chances of survival. The practice of attaching laboratories to hospitals contributed to rapid diagnosis, and X‐ray machines found their way into military hospitals in France. However, the greatest single improvement was undoubtedly the introduction of blood transfusion to reduce the deadly effects of shock among wounded soldiers.

Between the world wars, military medicine in the United States was influenced by socioeconomic changes and by the burgeoning technological innovations associated with increasingly complex methods of waging war. The growth in specializations within civilian medicine also affected military practice. Increasingly—especially in the navy—officers were selected for postgraduate specialty training in such fields as neurology. As greater percentages of American women chose hospital deliveries, the military services sought out training in obstetrics and gynecology from civilian medical colleges and universities so as to provide military dependents with the modern procedures that all U.S. citizens had grown to expect.

In the 1920s, the income of physicians grew faster than incomes in American society at large, and service recruiting of medical practitioners became more difficult. The Veterans Administration was divorced from the military departments, but it indirectly assisted service recruitment by providing an added postcareer benefit, medical care for those who incurred health problems during military service. Both the army and the navy established aviation medical research facilities that developed equipment to allow aircrews to cope with high altitudes and extreme cold. At Wright Field, Ohio, Capt. Harry G. Armstrong of the Army Air Corps studied embolism in pilots and determined that the governing factor was the formation of nitrogen bubbles in the body at high altitudes. The navy also established a submarine medicine program at New London, Connecticut, which produced specially trained corpsmen for submarines and assisted in the development of underwater breathing equipment for use in escape techniques.

During World War II, American military medicine benefited greatly from technological advances. Atabrine, a German‐developed drug of the 1930s, provided superior prophylaxis against malaria, and the development of penicillin by the pharmaceutical industry vastly improved the chances that a wound victim would overcome infection. Experience with “shell shock” in World War I had stimulated the field of military neuropsychiatry to improve the treatment and handling of World War II soldiers who experienced “battle fatigue.” During that war, psychiatrists discovered that recovery rates were often greater if a shaken soldier was returned to his unit and resumed friendships and customary relationships than if he was retained in an unfamiliar mental treatment setting.

Further advances in life‐saving techniques came about through the navy's modification of LSTs (Landing Ship Tanks) into floating evacuation hospitals and the army's creation of Portable Surgical Hospitals. The wide‐scale use of DDT controlled a serious outbreak of typhus in Italy. Defeating typhus, a debilitating and sometimes deadly disease caused by several types of Rickettsia microorganisms, was critical to the Allied cause. Carried by fleas and lice, these microorganisms spread quickly from the civilian to the military population, and the menace was quelled only by massive “dusting” with DDT insecticide. Finally, the sheer scale of American military medicine in World War II explained much of its success: in 1942–45, 40 percent of the country's physicians and other health care providers served a military population that comprised only 8 percent of U.S. society.

In the four‐decade Cold War era, 1947–89, military medicine continued to adapt to civilian standards, adjusting to the continued erosion of financial incentives for doctors choosing a military career while continuing to add significant advances to the medical profession's record. Both the army and the navy established residency programs in military hospitals that were designed to meet civilian specialty requirements. Dependent care burdens were partially eased by the 1956 Dependents Medical Care Program, which permitted the use of, and compensation for, civilian medical care when military facilities were not available. Recruiting difficulties were somewhat ameliorated by the 1972 Health Professions Scholarship Program, which provided medical college tuition and stipends in return for a period of uniformed service. And members of the Army and Navy Nurse Corps, given temporary commissions during World War II, were established as regular branches and awarded permanent commissioned status.

Cold War medical professionals improved on the World War II record in saving the lives of American battle casualties. Due in part to the transport of the wounded by helicopter, the rate of those who died from wounds during the Korean War (1950–53) was half that of the 1940s. Long‐range air transportation of patients, a World War II innovation, was extended so that almost 30 percent of evacuations were accomplished by this means. The combination of helicopter and long‐range evacuation continued in use during the Vietnam War in the 1960s.

Medical advances by military officers in this era included Navy Capt. Robert Phillips's breakthrough work on carefully balanced fluids and electrolytes in the treatment of another ancient and worldwide killer, cholera. Army Capt. Edwin J. Pulaski's pioneering work on burn victims in 1947, the establishment of the burn research unit at Brooke Army Medical Center in San Antonio, Texas, and the skin graft innovations of Col. Curtis Artz that followed all contributed to wholly new methods of treating burns throughout the world.

Conclusion.

American military medical experience in the initial years of the post–Cold War era provided every reason to expect a continuing story of successful adaptation to changing environments. By the Persian Gulf War of 1991, the military establishment had adjusted to its scarcity of doctors by producing an elaborate organization of helpers, technicians, and specialists who worked under physician supervision. Of the more than 24,000 U.S. Army medical personnel sent to Saudi Arabia, just over 3,000 were medical doctors or dentists; the rest—nurses, assistants, technicians, and specialists—carried out the bulk of health care tasks. Combat casualties in this war were thankfully few, but there was every reason to expect that this new structure would have performed well in more traumatic circumstances. It had adapted successfully to changing conditions—an established and centuries‐old hallmark of American military medicine.
[See also Combat Trauma; Demography and War; Disease, Tropical; Diseases, Sexually Transmitted; Toxic Agents.]

Bibliography

Surgeon General of the Navy, ed., The History of the Medical Department of the United States Navy in World War II, 3 vols., 1950–53.
M. M. Link and H. A. Coleman, Medical Support of the Army Air Forces in World War II, 1955.
Bureau of Medicine and Surgery, The History of the Medical Department of the United States Navy, 1945–1955, 1957.
S. Bayne‐Jones, The Evolution of Preventive Medicine in the United States Army, 1607–1939, 1968.
S. Neel, Medical Support of the U.S. Army in Vietnam, 1965–1970, 1973.
D. H. Robinson, The Dangerous Sky: A History of Aerospace Medicine, 1973.
R. C. Engelman and Robert J. T. Joy, Two Hundred Years of Military Medicine, 1975.
Albert E. Cowdrey, The Medic's War: Korea, 1987.
Graham A. Cosmas and Albert E. Cowdrey, Medical Service in the European Theater of Operations, 1992.

Rod Paschall