Science, Technology, War, and the Military
Engineers initiated that conjunction early in the nineteenth century. President Thomas Jefferson modeled West Point, the first school of engineering in the United States, on the French state technical schools. The goal of the military academy was not only to train officers for technical service in the army but also to cultivate a pool of engineering talent for the young republic. West Point graduates, both in and out of the service, worked on roads, canals, bridges, and other elements of infrastructure. While the country quadrupled its territory in the nineteenth century, these engineers designed, built, and operated the railroad network that tied the new nation together. In the process, the army divided its personnel into combat and civil engineers, the latter taking up a dominant role in the development of America's water resources.
The military, science, and technology also found themselves jointly engaged in other enterprises in the nineteenth and early twentieth centuries. Both army and navy led exploring expeditions, such as the Lewis and Clark expedition through the Louisiana Purchase (1804–06) and the Charles Wilkes expedition of 1838–42 to the Antarctic and the Pacific. The army developed a meteorology branch that would become the nucleus of the U.S. Weather Bureau. The American System of Manufacture, which caused a stir in Europe after a strong showing at the Great Exhibition of 1851 in London, was based on techniques of large‐scale factory production first developed in government arsenals and private factories producing small arms. Beginning in the nineteenth century, army and navy agencies buying goods and services in the private sector established manufacturing standards and contracting protocols. The National Academy of Sciences was created during the Civil War to help the federal government deal with the avalanche of inventions and proposals that poured into Washington, many having to do with military matters.
As the nineteenth century gave way to the twentieth, the relationship deepened between war and science and between the military and technology. War and the military contributed significantly to the development of such technologies as steel, radio, and aviation. Though none of these fields had their roots in the military, all were shaped by military developments and in turn became indispensable components of military capability.
In spite of this historically close relationship, the military services kept civilian science and technology at arm's length in World War I. Some scientific and technical activity was institutionalized during the war. The National Research Council (NRC), for example, was formed as the working arm of the National Academy of Sciences to assist the services in the war effort, and the National Advisory Committee for Aeronautics (NACA) was created in 1915 to keep American aviation abreast of the hothouse activity precipitated in Europe by the war. But generally, when the services wanted scientific and technical talent, they simply inducted the individuals deemed necessary into uniform. Robert Millikan, later a Nobel laureate, left his academic post at the University of Chicago first to serve the navy as a civilian member of the NRC and then to accept a major's commission in the U.S. Army Reserves to head the Signal Corps Science and Research Division. He and others like him helped the services make significant advances in submarines, radio, aviation, sound ranging for locating artillery, and other areas. Nonetheless, the services emerged from World War I feeling that science and technology had served them poorly. The famed Naval Consulting Board, for example, chaired by Thomas Alva Edison, fell into hopeless wrangling over the creation of a naval research laboratory and contributed little to the war effort. Better institutional arrangements were clearly necessary if the military was to realize the full potential of science and technology in future conflicts.
Just such arrangements appeared in World War II. This conflict was the first in history in which the weapons deployed at the end of the war were significantly different from those that opened it. Many of the new developments—radar, jet propulsion, ballistic missiles, the atomic bomb—were developed largely or entirely in the course of the war. For all the major combatants, this required the mobilization of the full resources of the state, including, of course, its scientific and technical talent.
In the United States, an entirely new institution sprang up to meet the need. To be sure, much technical work continued to be done in the traditional way, through contracts with industry and through research and testing in government laboratories and arsenals. But a significant portion of the most innovative and important research and development in World War II was done through the Office of Scientific Research and Development (OSRD). This small, independent agency was headed by Vannevar Bush, an inventor, teacher, and former dean of engineering at MIT. Originally constituted as the National Defense Research Committee (NDRC) and modeled on the NACA, which Bush had chaired, OSRD soon took shape as a clearinghouse of scientific and technical talent that could be applied to military problems.
The principles behind OSRD, which were to continue into the postwar world, were three. First, instead of trying to do all its own research, the government contracted with scientists and engineers to perform some of it. Second, instead of inducting them into service, as had been done during World War I, the government left its contractors in place, usually in university research laboratories. Third, developments sprang from two different sources: scientists and engineers might respond to military requests for new techniques or devices, or they might propose new developments themselves.
The entire range of research, from basic research through development of working prototypes, was open to exploration. The OSRD examined proposals from scientists and funded those with merit, and the office also took on problems from the military services and sought out researchers and laboratories to work on them. Radar, for example, the largest area of wartime research outside the atomic bomb project, was divided into more than 100 separate research undertakings and distributed to laboratories and test centers around the United States. OSRD scientists actually flew combat missions with prototype equipment to test it out and bring field results back to the laboratory for further refinement.
Just before the war's end, Bush and his colleagues submitted to the president a manifesto entitled Science, the Endless Frontier, requested by Franklin D. Roosevelt and delivered after his death to Harry S. Truman, calling on the government to perpetuate the wartime experience of OSRD in a national research establishment. The purpose was to guarantee the economic and military security of the country by keeping its scientific and technical talent funded and focused on projects of national interest. The proposal ran afoul of political concerns over the autonomy that Bush wanted the scientists to have in setting their own agenda, and only in 1950 did it finally become law, creating the National Science Foundation (NSF). The NSF was not the peacetime OSRD that Bush had recommended: it left military and medical research and development to other agencies and concentrated on basic, civilian research.
Meanwhile, the military services—the army, navy, and after 1947 the air force as well—took independent steps to institutionalize the scientific and technical assistance that had proved so critical in World War II. While uniformed officers in the United States and other countries had historically been skeptical of technological innovation, they now embraced research and development as the key to national security. The world wars may have been wars of industrial production, but the dramatic weapons innovations of the last conflict, culminating in the atomic bombs dropped on Hiroshima and Nagasaki, led many officers to believe that quality would displace quantity as the determinant of victory in the future.
The services empaneled their own technical consultants, such as the Scientific Advisory Board of the air force; created or continued their own research laboratories, such as the Naval Research Laboratory; and supported research arms at universities around the country, such as the navy's Applied Physics Laboratory at the Johns Hopkins University. The developments flowing from these sources resulted in new weapons that succeeded each other in the nation's arsenal at a rate never before seen in peacetime. As the United States slid into a cold war with its former ally, the Soviet Union, a standing military establishment emerged for the first time in the nation's history. Within that establishment, the services competed with each other for the right to develop and deploy ever newer and more sophisticated weapons and thus secure a place on the nation's front line of defense.
Soon this formula produced what President Dwight D. Eisenhower called the “military‐industrial complex.” In his farewell address in 1961, Eisenhower warned of “the unwarranted influence, whether sought or unsought,” of contractors grown dependent on military funding. In private, he spoke of a “delta of power” linking the Department of Defense, Congress, and industry in a mutually reinforcing conflict of interest that shaped U.S. foreign and defense policy and threatened the future of the country. Along with this danger, Eisenhower cautioned that “public policy could itself become the captive of a scientific‐technological elite.”
Eisenhower's warning did not significantly divert the military‐industrial complex. In 1960, more than half the research and development done in the United States—government, corporate, and university‐based—was military. The military services, or defense‐related agencies such as the Atomic Energy Commission, became the principal supporters of research in nuclear physics, computers, microelectronics, space, and other scientific and technical fields. Furthermore, military considerations had second‐order consequences in seemingly nonmilitary areas of scientific and technical development. The National Defense Education Act, for example, funded graduate study in science and technology for thousands of American students. More broadly, military funding supported a significant percentage of university research in the Cold War and helped to shape these institutions. The national interstate highway system was instituted in the Eisenhower administration in part to facilitate the mobilization and movement of military forces in the event of national emergency. The national space program was launched on military missiles and took its rationale from the Cold War competition with the Soviet Union for the hearts and minds of the world's people. Even social sciences such as psychology and international relations were mobilized in the name of national security.
Preparation for strategic war with the Soviet Union absorbed most of the scientific and technical research devoted to the Cold War. Nuclear weapons competition dominated the entire conflict, moving from the U.S. monopoly after World War II through the Soviet explosion of its first nuclear device in 1949, the race to thermonuclear weapons in the early 1950s, and other innovations such as tactical nuclear weapons and the neutron bomb in the years that followed. At first, airplanes were the delivery vehicles for these weapons, spawning enormous research and development efforts in aircraft design, antiaircraft defense, early warning systems, and electronic countermeasures. By the end of the 1950s, however, ballistic missiles had become the delivery system of choice, and research and development turned to more powerful launch vehicles, improved guidance systems, multiple independently targetable reentry vehicles (MIRVs), ballistic missile defense, and other esoteric technologies of strategic warfare. Throughout the Cold War, U.S. science and technology proved generally superior to that of the Soviet Union, and yet the Soviets displayed a remarkable capability to mimic U.S. achievements and keep the arms race close.
The contest finally climaxed in the midst of the most far‐reaching and expensive gambit of all, a program by the United States to develop a nationwide ballistic missile defense system. President Ronald Reagan's Strategic Defense Initiative (1983–93) invested some $40 billion in ballistic missile defense that critics said could not work; its supporters claimed that the Soviet Union finally had to admit defeat when faced with the prospect of trying to match the effort. In any case, the collapse of the Soviet Union in 1991 ended the Cold War and initiated a scaling back of the military‐industrial complex, the end of which is not yet in sight.
Beginning with the Vietnam War, much research attention turned to conventional and unconventional war. Precision‐guided munitions and “smart” bombs employed advanced microelectronics to achieve unprecedented levels of accuracy and discrimination, seeking not only to hit the desired target but also to avoid collateral damage. Sensing devices such as night‐vision scopes and heat‐seeking technology helped combatants find and target the enemy. New types of antipersonnel weapons such as cluster bombs and claymore mines entered the deadly arena of unconventional jungle warfare and other nontraditional fighting environments. Even psychology and the social sciences were enlisted in the struggle against enemies who chose not to fight in the traditional Western style.
Weapons and equipment developed for strategic war with the Soviet Union came to be enlisted in nonnuclear war against enemies around the world. Thus the F‐117 stealth fighter, an attack airplane virtually invisible to traditional radar, played a role in both the U.S. invasion of Panama in 1989 and the Persian Gulf War of 1991. The latter conflict witnessed a rout of the Iraqi Army by United Nations Coalition forces, largely because the sophisticated arsenal of the United States virtually eliminated Iraqi command, control, and communications before the ground engagement began. In the minds of some advocates, the Gulf War witnessed the apotheosis of the airpower that had been predicted in some quarters since the 1920s.
The tremendous impact of science and technology on war during the second half of the twentieth century mirrored the equally momentous impact that war had on science and technology. In addition to the phenomena already mentioned, military demand created the discipline of operations research, pioneered the techniques employed in the commercialization of nuclear power, introduced many important medical practices and products, and developed technologies such as high‐resolution photography and the Global Positioning System that subsequently entered the civilian economy.
[See also Atomic Scientists; Consultants; Hiroshima and Nagasaki, Bombings of; Industry and War; Panama, U.S. Military Involvement in; World War I: Domestic Course; World War I: Postwar Impact; World War II: Domestic Course.]
Harvey M. Sapolsky, The Polaris System Development: Bureaucratic and Programmatic Success in Government, 1972.
Merritt Roe Smith, ed., Military Enterprise and Technological Change: Perspectives on the American Experience, 1985.
Michael S. Sherry, The Rise of American Air Power: The Creation of Armageddon, 1987.
Richard Rhodes, The Making of the Atomic Bomb, 1986.
Donald MacKenzie, Inventing Accuracy: A Historical Sociology of Nuclear Missile Guidance, 1990.
Stephen Peter Rosen, Winning the Next War: Innovation and the Modern Military, 1991.
Stuart W. Leslie, The Cold War and American Science: The Military‐Industrial Complex at MIT and Stanford, 1993.
Paul N. Edwards, The Closed World: Computers and the Politics of Discourse in Cold War America, 1996.
Paul A. C. Koistinen, Beating Plowshares into Swords: The Political Economy of American Warfare, 1606–1865, 1997.