The 1940s Science and Technology: Topics in the News




Before the 1940s, archaeologists spent most of their time putting the artifacts they found in chronological order. They were mostly interested in how old a found object might be relative to other objects. In the 1940s, however, they began to look at their finds in a different way. While it was still important to date an object, it became even more interesting to try to work out how it had been used. Objects from the past could help scientists learn about how people lived and how ancient societies worked.

The way archaeologists in the 1940s used objects to explain ancient societies was first to look at where an object was found (known as the microcontext), and then compare it with similar objects found at more distant sites (known as the macrocontext). The pioneer of this form of scientific approach was archaeologist John W. Bennett (1918–). In 1943, he found many copper and shell artifacts in the southeastern United States. Based on where they were found, he decided they must have some religious or ritual significance. Similar finds in Georgia and Oklahoma convinced him that the religious cult that used these objects was widespread.

Context became more important to other scientists as well. Anthropologists study human beings, the way they respond to their physical environment, and the societies in which they live. During the 1940s, anthropologists began to study the way society and culture influence the personalities of individuals. Margaret Mead (1901–1978) wrote Coming of Age in Samoa (1928), the classic text using this approach. Her book showed that Samoan girls experienced none of the psychological problems prevalent among American teenagers. This research demonstrated that human psychology was influenced by cultural context, not "preprogrammed" into the brain. During World War II (1939–45), Mead worked for the National Research Council and the Office of War Information (OWI). Coming of Age in Samoa was republished for the armed forces by the OWI in 1945. It was used by army planners as they began to reconstruct shattered communities.

Anthropologist Ruth Benedict (1887–1948) was also sponsored by the OWI. After researching Japanese culture and society, she concluded that the wartime behavior of the Japanese could be explained by their culture. The Japanese, she argued, were expected to suppress their emotions and to obey authorities. Benedict's work helped explain behavior that many Americans had considered barbaric. It also influenced U.S. policy toward Japan after 1945. Both anthropologists and archaeologists in the 1940s took a more flexible approach to the connection between individuals and their environment. By doing so, they were able to enhance understanding of the past and influence policies for the future.


During World War II, aircraft were used for many military purposes, from dropping bombs to airlifting supplies and spying on the enemy. Military needs brought great technological improvements, but conventional airplanes with propellers and piston engines had reached their limits by 1945. By then, research had turned toward the jet engine. Other developments in flight technology during the 1940s included the helicopter, the pilotless winged missile, the long-range rocket, and the first true space vehicle.

Balloon Bombs

The prevailing wind direction across the Pacific is from the west. During World War II (1939–45), the Japanese took advantage of this fact to launch balloon bombs. These were high-altitude balloons carrying bombs intended to fall on the United States. Thousands of balloons were launched, and around three hundred of them came ashore, some as far inland as Iowa and Kansas. Some of the bombs caused wildfires with no casualties, but in Oregon six people were killed. Somehow, the U.S. government managed to keep the balloon-bomb threat secret from the American public, and the Japanese eventually abandoned the idea. Documents from 1944 have since revealed that the Japanese military also was intending to use balloons to spread deadly diseases such as anthrax and plague throughout the United States.

The demands of war meant that military aviation development had to be carefully organized. Physicist Joseph S. Ames (1864–1943) headed the National Advisory Committee for Aeronautics (NACA). NACA funded research into aerodynamics (the study of the motion of air), engine design, and construction methods and materials. Partly as a result of NACA support, the United States was able to produce 296,000 military aircraft in the 1940s. But NACA's major influence was on airplane design. By 1945, aircraft were faster, more controllable, and covered greater distances with the same fuel load. By 1949, the Boeing B-36 strategic bomber could carry a four-and-one-half-ton bomb for 9,950 miles. The useful range for heavy bombers had doubled in just three years.

By the early 1940s, designers realized that airplanes driven by propellers could not be improved much further. Though the jet engine had been patented by British inventor Frank Whittle (1907–1996) in 1930, the first jet aircraft, the German Heinkel He 178, did not fly until 1939. The first American jet fighter, the P-59A Airacomet, was not tested until 1942. Early jets lacked power and used a huge amount of fuel. Even so, the German Messerschmitt Me-262 became the first jet aircraft to fly in combat, on July 28, 1944. The first American jet to see military service was the Lockheed F-80 Shooting Star, which was first tested in January 1944. But the F-80 was not used in combat until the Korean War (1950–53).

After 1945, jet propulsion would drive aircraft farther and faster than had ever seemed possible. But until 1947, a barrier lay in the way. As an aircraft flies faster, air pressure builds up on forward-facing surfaces. At around the speed of sound (741 miles per hour at sea level, and slower in the colder air at higher altitudes), shock waves form and drag rises sharply, creating the so-called sound barrier that prevents conventional planes from going any faster.

Aircraft must have special streamlined shapes and extremely powerful engines to penetrate this barrier safely. When they do so, a shock wave creates a loud sound known as a sonic boom. On October 14, 1947, Captain Charles E. "Chuck" Yeager (1923–) flew the rocket-powered X-1 aircraft faster than the speed of sound. He became the first human to break the sound barrier in controlled, level flight. Yeager later flew the plane at its maximum speed of 957 miles per hour.
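The speed of sound mentioned above depends only on air temperature, through the standard formula a = sqrt(gamma * R * T). The short sketch below uses standard physical constants (not figures from this article) to show why the barrier sits at a lower speed at altitude, where the air is colder:

```python
import math

GAMMA = 1.4      # ratio of specific heats for air (standard value)
R_AIR = 287.05   # specific gas constant for dry air, J/(kg*K)

def speed_of_sound_mph(temp_celsius):
    """Speed of sound in air at a given temperature, in miles per hour."""
    temp_kelvin = temp_celsius + 273.15
    metres_per_second = math.sqrt(GAMMA * R_AIR * temp_kelvin)
    return metres_per_second * 2.23694  # m/s -> mph

# Air grows colder with altitude, so the sound barrier sits "lower" up high.
print(round(speed_of_sound_mph(15)))     # sea level on a standard 15 C day: 761
print(round(speed_of_sound_mph(-56.5)))  # near 36,000 feet: 660
```

On a standard 15°C day the formula gives about 761 miles per hour at sea level; the 741-mile-per-hour figure quoted in older accounts corresponds to air at the freezing point.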

The X-1's rocket motor produced six thousand pounds of thrust. The rocket was the product of research begun in Germany in the 1920s and led by scientist Wernher von Braun (1912–1977). Von Braun's V-2 rocket was designed to be used as a long-range missile. It climbed to a height of fifty miles before falling back to earth. More than three thousand of these rockets fell on European cities during the war, most of them on London. After 1945, many V-2s, along with several German rocket scientists, were brought to the United States. American scientists combined the V-2 with their own rocket, the WAC Corporal, to create Project Bumper. On February 24, 1949, General Electric and the Jet Propulsion Laboratory (JPL) launched a rocket called Bumper Five. It reached an altitude of about 250 miles (400 kilometers) above Earth, becoming the first true space vehicle. Less than fifty years after the first-ever engine-powered airplane flight, the space race had begun.


Genetics is the branch of biology that explains the way characteristics such as hair and eye color are passed from one generation of a species to the next. In the 1940s, geneticists discovered that the units that specify these characteristics, or genes, are located within chromosomes. Every living thing is made up of billions of small building blocks called cells. Chromosomes are tiny, threadlike structures that exist within each cell. They are made from two types of substances: proteins and nucleic acids. Each chromosome contains a large number of genes, the individual codes that determine an individual's inherited characteristics. Genetics made great strides in the 1940s. But geneticists did not manage to explain how chromosomes were able to carry and transmit information to succeeding generations.

In 1944, at the Rockefeller Institute, a team of scientists made the first step toward solving the puzzle of chromosomes. They discovered that it was nucleic acids, not proteins, in a cell that determined its organism's genetic traits (inherited characteristics). Their research was carried out on a simple organism, the pneumococcal bacterium. But they believed correctly that the same conclusion would be true for humans. A few years later, researcher Barbara McClintock (1902–1992) came up with the idea of "jumping genes." McClintock's theory was that genes could move around on each chromosome between generations. Their influence on one another would change, depending on where they sat relative to each other. Many scientists rejected her idea at first, but it was confirmed in the 1960s. McClintock won the Nobel Prize in 1983 for her work in genetics.

Before the advances in knowledge made during the 1940s, biologists had been split between two different ideas about the way characteristics are inherited. Geneticists looked for genetic information within the organism itself. Evolutionary biologists, followers of Charles Darwin (1809–1882), saw the environment around the organism as more important than heredity. At a meeting in Princeton, New Jersey, in January 1947, the two sides came together for the first time in twenty years. New discoveries in genetics had shown how Darwin's theory of evolution by natural selection could work. Suddenly, geneticists and evolutionary biologists had very little about which to argue. Both groups had turned out to be right.


The creation of the first digital computer systems emerged from collaboration among the military, universities, and private businesses. Early computers could weigh as much as thirty tons and were controlled by a series of plugs and wires. But these bulky machines gave the Allied nations a huge advantage in military intelligence during World War II, making it possible for them to decode messages sent by the Germans and the Japanese.

In 1937, International Business Machines (IBM), a manufacturer of typewriters and adding machines, began a joint research project with Harvard University to create a machine that could do calculations automatically. IBM did not expect to make money. Rather, it hoped the success of the project would improve its reputation with the scientific community. The machine that grew out of this collaboration was known as the IBM Automatic Sequence Controlled Calculator, or the Harvard Mark 1 for short. It was demonstrated to Harvard faculty by Howard Aiken (1900–1973) in 1943. The Mark 1 was no more powerful than a modern palm-sized scientific calculator. But for the first time, a computer was shown to follow a sequence of commands (a program) and produce accurate final results from raw data. The Mark 1 was used by the U.S. Navy from 1944 onward.
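The novelty here, a machine stepping unattended through a stored sequence of commands, can be pictured with a toy interpreter. This is only an illustration of the idea of a program; the command names below are invented and bear no relation to the Mark 1's actual instruction set:

```python
def run_program(program, value):
    """Apply a fixed sequence of commands (a 'program') to raw input data."""
    for command, operand in program:
        if command == "ADD":
            value += operand
        elif command == "MULTIPLY":
            value *= operand
        else:
            raise ValueError(f"unknown command: {command}")
    return value

# A three-step program, carried out start to finish with no human intervention.
program = [("ADD", 5), ("MULTIPLY", 3), ("ADD", 1)]
print(run_program(program, 10))  # (10 + 5) * 3 + 1 = 46
```

The point is not the arithmetic, which any adding machine could do, but that the sequence itself is data the machine follows automatically.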

The Mark 1 was an electromechanical machine, relying on electrical currents to move parts of the machine into different configurations. It was not unlike a manual telephone exchange, and consequently it was very slow. But computers soon sped up. In 1944, British engineers completed an electronic digital computer known as Colossus, building on the codebreaking work of mathematician Alan Turing (1912–1954). Colossus was used to crack high-level German ciphers; together with the electromechanical "bombes" Turing had designed to break the Enigma code, it gave Allied military commanders a key advantage when planning the invasion and occupation of mainland Europe. Using the same numbers-based logic system, Americans J. Presper Eckert (1919–1995) and John Mauchly (1907–1980) built the ENIAC (Electronic Numerical Integrator and Computer) at the University of Pennsylvania in 1946. The first general-purpose electronic digital computer, ENIAC was about the size of a three-axle truck. In the days before microchips and transistors, the ENIAC contained eighteen thousand fragile glass vacuum tubes and weighed thirty tons.

Tiny Transistors

Early electronic devices used vacuum tubes to amplify electronic signals (make them stronger). The tubes were made of glass and were fragile, heavy, and large. Several were often needed, and they generated a lot of heat and used a lot of electricity. Then in 1947, scientists at the Bell Telephone Laboratories found they could amplify electronic signals using tiny devices known as transistors. Made from small pieces of semiconductor material, transistors were tiny, operated at low temperatures, and used around one-twentieth of the power of vacuum tubes. They were also very tough, making them ideal for portable equipment. By the mid-1950s, almost every electronic device contained transistors.

Computers themselves were in their infancy in the 1940s. But the new field of computer science was already raising many questions about logic, language, and the workings of the human mind. Norbert Wiener (1894–1964) invented the term "cybernetics" to describe his work on the similarities between automatic machines and the human brain. The word itself comes from the Greek word for "steersman." Wiener saw the human nervous system as a system of control and feedback mechanisms, rather like the rudder of a ship. This was exactly the principle used in designing the new computers. Researchers such as Gregory Bateson (1904–1980) and his wife Margaret Mead also asserted that the human brain behaves rather like a machine, in that it has memory, it can associate pieces of data, and it can make choices. Late-twentieth-century research into artificial intelligence has revealed as many differences as similarities between brains and machines. But Wiener's predictions that cybernetics would provide control mechanisms for artificial limbs and mechanized industry have turned out to be accurate.
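Wiener's steersman image, in which a system measures its error, feeds a correction back, and repeats, is the feedback loop at the heart of cybernetics. A minimal proportional-control sketch (the heading values and gain are invented for illustration):

```python
def steer_to_heading(current, target, gain=0.5, steps=20):
    """Proportional feedback: each step corrects a fraction of the remaining error."""
    for _ in range(steps):
        error = target - current   # measure how far off course we are
        current += gain * error    # feed the error back as a correction
    return current

# The heading converges on the target instead of being set blindly in advance.
print(round(steer_to_heading(0.0, 90.0), 2))  # converges on 90.0
```

The same measure-and-correct loop governs a ship's rudder, a thermostat, and the servo mechanisms Wiener expected to see in artificial limbs.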


The principle of radar, that radio waves bounce back from objects and can be detected by a receiver, was established in 1930 by Lawrence Hyland (1897–1989) at the Naval Research Laboratory. The term "radar" stands for radio detection and ranging. A workable radar device was patented in 1935 by Scottish scientist Robert Watson-Watt (1892–1973). But it was not until 1940 that American radar research became a priority.
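The "ranging" half of radar is a timing calculation: a radio pulse travels to the target and back at the speed of light, so the distance is half the round-trip time multiplied by that speed. A sketch using the standard value for the speed of light (the echo delay below is an invented example):

```python
C = 299_792_458  # speed of light in m/s; radio waves travel at this speed

def echo_range_miles(echo_delay_seconds):
    """Distance to a target from the round-trip time of a radar echo."""
    one_way_metres = C * echo_delay_seconds / 2  # pulse goes out and comes back
    return one_way_metres / 1609.344             # metres -> miles

# An echo arriving about 107 microseconds after the pulse left:
print(round(echo_range_miles(107.3e-6), 1))  # roughly 10 miles
```

Because light covers about a mile in five microseconds, even 1940s electronics could time these echoes accurately enough to be militarily useful.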

In the summer of 1940, Henry Tizard (1885–1959) and a team of scientists arrived in the United States from Britain with the aim of sharing military secrets. The Tizard mission brought with it a device called the resonant cavity magnetron. It was able to produce radiation of much greater intensity than anything American technology could manage at the time. Alfred L. Loomis (1887–1975), head of radar research at the National Defense Research Committee (NDRC), said that the magnetron had advanced American radar research by two years. The NDRC quickly developed the magnetron into an airborne intercept system. By April 1941, the "Rad Lab" at the Massachusetts Institute of Technology (MIT) had built the AI-10, a radar machine capable of detecting airplanes and submarines.

The British were desperate for a radar system to detect attacks from German night bombers. Watson-Watt inspected the AI-10 in 1941, but found that its radar "shield" was full of holes. He relocated stations and made other adjustments to complete the shield. Soon the Rad Lab had designed the ASV (Air-to-Surface Vessel) radar. The ASV allowed aircraft to detect ships up to five miles away. It was soon installed in B-18 planes to patrol the Atlantic coast. The American system was an even greater success in Britain, where radar stations were positioned along the south and east coasts. German Luftwaffe (air force) commanders could not understand how the Royal Air Force (RAF) fighter squadrons knew the German bombers were on the way, or how they managed to find them in the dark.

Radar was one of the most important technological developments of World War II. By 1942, the Rad Lab had a budget of $1.15 million per month, and by 1945 it employed around five hundred physicists. The rapid development of radar during the war also was useful after 1945. It made possible the rapid expansion of civilian air transport in the late 1940s and 1950s.

Controlling the Weather

American scientists were so confident in the 1940s that they tried to modify the weather. In 1943, Irving Langmuir (1881–1957) and Vincent Schaefer (1906–1993) began to look at ways to make rain. Schaefer eventually came to the conclusion that precipitation (rain, snow, and hail) is created inside supercooled clouds. Such clouds are colder than the freezing point of water, yet they contain both ice crystals and droplets of liquid water. As the ice crystals grow bigger, the water droplets shrink. At a certain point the ice crystals become so big and heavy that they fall to the ground. If they melt on the way down, it rains. If they stay frozen, it snows. Schaefer and Langmuir tried to make precipitation happen artificially. In 1946, Schaefer flew over Mount Greylock in Massachusetts and threw dry ice (a solid form of carbon dioxide) into a cloud. Flying under the cloud, Schaefer noticed a snow flurry. On the ground, his colleague Langmuir was caught in a shower of rain.


Harnessing the power of the atom is the most significant scientific achievement of the 1940s, and possibly of the entire twentieth century. Nuclear weapons (also known as atomic bombs) brought World War II to an abrupt, and some would say early, conclusion. The new technology helped make the United States a dominant force in global affairs and created an entirely new political world order after 1945. Nuclear weaponry created an atmosphere of fear and distrust between nations. But many argue that its deterrent effect helped to prevent a third world war in the twentieth century.

The first controlled atomic chain reaction was achieved in 1942, on an old squash court under the stands of the abandoned Stagg Field football stadium at the University of Chicago. Nobel Prize-winning physicist Enrico Fermi (1901–1954) constructed a nuclear pile (reactor) from six tons of uranium metal and fifty tons of uranium oxide encased in four hundred tons of graphite. On December 2, 1942, the neutron-absorbing control rods were withdrawn. The pile went critical: enough neutrons from each splitting uranium atom went on to split further atoms that the chain reaction of nuclear fission (the splitting of atoms) became self-sustaining. This was the first-ever controlled release of nuclear energy.
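The chain reaction can be pictured with a one-line growth model. If each fission triggers on average k further fissions (k is the standard "multiplication factor" of reactor physics, not a figure from this article), the reaction dies out when k is below 1, holds steady at exactly 1, and grows runaway-fast above 1; Fermi's control rods absorbed neutrons to regulate k.

```python
def neutron_population(k, generations, start=1.0):
    """Relative neutron count after n generations if each fission triggers k more."""
    population = start
    for _ in range(generations):
        population *= k  # every generation multiplies the population by k
    return population

# Subcritical dies out; critical holds steady; supercritical grows explosively.
for k in (0.9, 1.0, 1.1):
    print(k, round(neutron_population(k, 50), 3))
```

Fifty generations take only a tiny fraction of a second in a real pile, which is why the distinction between k slightly below and slightly above 1 matters so much.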

The Atomic Age

Americans became very excited by the possibilities of the atomic age shortly after atomic bombs were dropped on Japan. Within hours, the bartender at the Washington Press Club had invented the Atomic Cocktail, made from Pernod and gin. People speculated about potential nuclear-powered cars and aircraft. Some predicted that artificial atomic suns could be created to control the weather. In July 1946, a bomb test on Bikini Atoll in the Pacific Ocean gave its name to the two-piece bathing suit; while in 1947, the Manhattan telephone directory listed forty-five businesses with "atomic" in their names. Popular songs of 1946 and 1947 included "Atom Buster," "Atom Polka," and "Atom Bomb Baby." It was to be a few years before the American public began to appreciate the new dangers of the atomic age.

Fermi's work was done in a spirit of scientific inquiry. He and his team simply wanted to see if it could be done. But World War II inspired President Franklin D. Roosevelt to fund research into building the atomic bomb. He was persuaded to do so by German-born Jewish physicist Albert Einstein. Although Einstein was opposed to violence, he feared what would happen if Nazi Germany developed an atomic bomb before the United States did. Less than a month after Fermi's success, the Roosevelt administration authorized $400 million for the top-secret Manhattan Project. Headed by General Leslie R. Groves (1896–1970), the Manhattan Project set up huge production facilities at Hanford, Washington, and Oak Ridge, Tennessee. In early 1945, the Hanford site began producing pure plutonium, which was perfect for making an atomic bomb.

The Manhattan Project bomb-making research facility was located in New Mexico, at Los Alamos. Under the direction of J. Robert Oppenheimer, scientists at Los Alamos had to solve two problems. First, they had to understand what would happen in the fraction of a second after the chain reaction began, but before the explosion occurred. Understanding this event was essential in order to control the bomb. Second, they had to stop the plutonium from reaching critical mass and exploding too soon. The solution to the second of these problems was to use conventional explosives to encase the plutonium. When these explosives were detonated, the shock wave crushed the plutonium, forcing it to critical mass and triggering the chain reaction that caused the bigger, atomic explosion. Because it was designed to squeeze inward at first, this type of weapon was known as an implosion bomb.

The first atomic bomb was exploded at Alamogordo, in the New Mexico desert, two hundred miles from Los Alamos, on July 16, 1945. The bomb had the power of twenty thousand tons of TNT. Oppenheimer described the flash as having the "radiance of a thousand suns." Within a month, atomic bombs were made available to the military. On August 6, 1945, the Superfortress bomber Enola Gay dropped a uranium bomb called "Little Boy" on the Japanese city of Hiroshima. It destroyed four square miles of the city in a few seconds. Three days later, a plutonium bomb nicknamed "Fat Man" destroyed one-third of the Japanese city of Nagasaki, killing forty thousand people in the blink of an eye. Not long after the bombs were dropped, Einstein was asked how he thought World War III would be fought. He said he did not know, but that World War IV would be fought with sticks and stones.
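The bomb's power is quoted in tons of high explosive, and the standard convention defines one ton of TNT as 4.184 billion joules, so the twenty-thousand-ton figure converts directly into absolute units:

```python
JOULES_PER_TON_TNT = 4.184e9  # standard energy convention for "one ton of TNT"

yield_tons = 20_000                          # the quoted yield of the first test
yield_joules = yield_tons * JOULES_PER_TON_TNT
print(f"{yield_joules:.2e} joules")          # on the order of 8e13 J
```

That is roughly the energy a large power station delivers over a full day, released in a few millionths of a second.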

A September 1945 Gallup poll showed that 65 percent of Americans thought the atomic bomb was "a good thing." Two years later, only 55 percent held that opinion. The dangers of atomic weapons were obvious, and efforts were made to prevent their spread. But the defensive policies of the late 1940s meant that the United States tried to keep nuclear technology to itself. This protectiveness increased the mood of hostility between the Soviet Union and the United States. Efforts to keep the atomic bomb in American hands soon failed. The Soviet Union exploded its own atomic device in 1949.

Because they did not want to be responsible for the destruction of humanity, many scientists opposed further research into atomic weapons. But the government wanted further development. Physicist Edward Teller (1908–) was keen to develop an even more formidable weapon, the hydrogen bomb. The first atomic bombs worked by splitting atoms to release huge amounts of energy in a process known as nuclear fission. Hydrogen bombs do the opposite; they force the nuclei of hydrogen atoms to merge in a process called nuclear fusion. This is the same nuclear reaction that takes place inside the sun. Hydrogen bombs (also known as thermonuclear bombs) can be a thousand times more powerful than the bombs that were dropped on Japan. Fearing the Soviet Union's atomic weapons program, President Harry S Truman authorized funding for hydrogen bomb research in 1950.

Nuclear weapons raised many ethical and moral questions. Public opinion was split over the need for atomic bombs, while physicians and biologists warned that radiation from the explosions, also called fallout, was dangerous. Military researchers even went so far as to test the effects of radiation on humans. Soldiers were intentionally exposed to radiation from bomb tests or were given doses of radioactive material by army doctors, who then measured the health effects. Although fear of nuclear accident and war would dominate politics for the next forty years, atomic research in the 1940s did have several positive effects. These included new cancer treatments and the development of nuclear power used to drive ships and generate electricity.

