Computers: The Dawn of a Revolution



By the end of the twentieth century, computers could be found in devices from wristwatches to automobiles, from medical equipment to children's toys. But while scientists and philosophers had dreamed of the possibility of automating calculation nearly one hundred years earlier, very little progress was made toward modern computers before 1940. As scientists and engineers worked to face the challenges of World War II—including cracking codes and calculating the physics equations to produce atomic weapons—they finally made computers a reality. In a few short years, the theoretical vision of computing was brought together with existing technologies such as office machines and vacuum tubes to make the first generation of electronic computers.


Mechanical machines capable of automatic calculation had their origins in the nineteenth century with the work of Charles Babbage (1791-1871), whose "Analytical Engine" was intended to use gears and punched cards to perform arithmetical operations. Babbage's design anticipated many of the concepts central to modern computers, such as programming and data storage. Interestingly, a fragment of Babbage's machine survived to the 1940s, when it was discovered in an attic at Harvard and helped inspire the computer innovator Howard Aiken (1900-1973). Punched cards were also used to collect and store data beginning with the 1890 United States census. The man responsible for this innovation, Herman Hollerith (1860-1929), founded the company that became International Business Machines (IBM), the largest and perhaps the most important computer company of the twentieth century. Another nineteenth-century development that proved vital to the evolution of computers was the mathematical work of George Boole (1815-1864). Boole showed that the binary number system, which has only two symbols, 0 and 1, could be used to perform logical operations by letting 0 stand for false and 1 for true. Later, this "Boolean algebra" was used to represent the on and off states of electronic switches and formed the logical framework for electronic computers.
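Boole's mapping of truth values onto the digits 0 and 1 can be illustrated with a short modern sketch (the notation and function names here are illustrative, not historical):

```python
# Boolean algebra on the binary digits: 0 stands for false (switch off),
# 1 stands for true (switch on). Each logical operation is ordinary
# arithmetic that happens to stay within {0, 1}.

def AND(a, b):
    return a * b            # 1 only if both inputs are 1

def OR(a, b):
    return a + b - a * b    # 1 if either input is 1

def NOT(a):
    return 1 - a            # flips the switch

# Print the full truth table for two inputs.
for a in (0, 1):
    for b in (0, 1):
        print(f"a={a} b={b}  AND={AND(a, b)}  OR={OR(a, b)}  NOT a={NOT(a)}")
```

It is exactly this correspondence between arithmetic on {0, 1} and logical truth that let later engineers realize logic with on/off electronic switches.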

As with many technological achievements, ideas and visions often ran ahead of practical plans for computers. One of the most important visionaries who helped to bring about the invention of digital computers was Alan M. Turing (1912-1954), a British mathematician and logician who in the 1930s wrote a number of important papers proposing theoretical machines designed to test mathematical or logical statements. Turing's imaginary machines helped to establish the actual potential and limitations of computers. Turing also contributed to Britain's wartime codebreaking effort, which built some of the very first electronic computing machines for deciphering encrypted transmissions, including in 1943 a device called the "Colossus" made from 1,500 vacuum tubes.

There is much dispute about who really built the first modern computer. World War II was a great impetus for computer research in Germany as well as among the Allied nations. While Konrad Zuse (1910-1995), a German, did design and construct several computing machines in the 1930s and 1940s, it was in Britain and the United States that computing research advanced furthest and fastest. Following his wartime codebreaking work, Turing joined the staff at Britain's National Physical Laboratory (NPL) to assist in the development of electronic digital computers. Turing produced an important report that included one of the first designs for an electronic stored-program device. The project based on Turing's ideas, the Automatic Computing Engine (ACE), took some years to advance, but a prototype machine, the Pilot ACE, finally ran in 1950. Computer research proceeded elsewhere in Britain as well; advanced machines were developed in the 1940s at the universities of Cambridge and Manchester.

In the United States, early computer projects flourished at industrial research laboratories and university campuses, driven ahead by commercial, scientific, and wartime interests. During the 1930s, Vannevar Bush (1890-1974) at MIT built an analog "Differential Analyzer" that used an array of gears, shafts, wheels, wires, and pulleys driven by electric motors to solve differential equations. At the Bell Telephone Laboratories of AT&T, a binary calculator that used telephone relays was built in 1939, the same year that engineers at IBM, working from a design proposed by Harvard professor Howard Aiken, started to build the electromechanical Mark I computer, which when finished consisted of some 750,000 parts. Between 1939 and 1942, John Atanasoff (1903-1995) and Clifford Berry (1918-1963) of Iowa State College built the first computer to use vacuum tubes, a technology that would dominate the next decade.

War efforts were the impetus behind a major computer project at the Moore School of the University of Pennsylvania. There, beginning in 1942, engineers built the vacuum-tube-based Electronic Numerical Integrator and Computer (ENIAC) to calculate ballistics tables for the Army Ordnance Corps. The 30-ton (27-metric-ton) ENIAC filled rows of 8-foot-tall (2.4-meter) cabinets and was made of 18,000 vacuum tubes and miles of wire. It performed calculations one thousand times faster than its electromechanical predecessors. John von Neumann (1903-1957), a Hungarian immigrant mathematician working on the atomic bomb project at Los Alamos, collaborated with Herman Goldstine of the Moore School to produce an important report that described the storage of programs in a digital computer. Two engineers who worked at the Moore School, J. Presper Eckert (1919-1995) and John Mauchly (1907-1980), subsequently conceived of a computer they called the Universal Automatic Computer (UNIVAC). They built the first UNIVAC for the U.S. Census Bureau in 1951; eventually, nearly 50 UNIVAC computers were constructed. The corporation founded by Eckert and Mauchly became one of the important early computer companies in the U.S., helping to move computers into the business world.

All of these very early computers were unreliable behemoths, filling entire rooms with cabinets lined with fragile vacuum tubes that generated vast amounts of heat. While the conceptual limits of computing as described by Turing and others seemed boundless, the physical and practical limits of these machines were obvious to all who used them. But in 1948, three scientists at Bell Laboratories—John Bardeen (1908-1991), William Shockley (1910-1989), and Walter Brattain (1902-1987)—announced the solid-state electronic "transistor." The transistor exploited the properties of "semiconductors," substances such as germanium and silicon whose ability to conduct electricity lies between that of conductors and insulators and can be precisely controlled. Small, cool transistors soon replaced the large, hot vacuum tubes as "switches" in the electronic circuits that make up computers, beginning the evolution toward ever-smaller computing devices that continued to the end of the century and beyond. Also in 1948, another Bell Labs scientist, Claude Shannon (1916-2001), published his mathematical theory of communication, providing a vital theoretical framework for the electronic transmission of information. By 1949 all the elements for the rapid development of computer technology and a computer industry were finally in place.


During the 1950s, the computer became an important force in business as well as science. The large computer systems, known as mainframes, cost anywhere from several hundred thousand to several million dollars. Most were made, sold, and maintained by IBM, along with a handful of much smaller competitors that came to be known as the "seven dwarfs." All computers required special air-conditioned rooms and highly trained staff to operate them. Information entered the computer on paper tape or punched cards; output would be delivered (sometimes days later) in the form of more cards or large paper printouts. As the storage capacity of computers improved, the first high-level programming languages were developed: John Backus of IBM began work on FORTRAN in 1954, and the first compiler was delivered in 1957. Programming languages enabled scientists and engineers to use computers much more effectively. As computers became more accessible, new applications were found for them. The business functions performed by typewriters, adding machines, and punched-card tabulators were gradually taken over by computers. Eventually, sophisticated software programs allowed people without technical training, even children, to use computers for work, study, and entertainment.

In the two decades that followed the introduction of the transistor, computers became smaller, cheaper, faster, simpler, and much more common. No branch of science was left untouched by the rapid development of the computer, and many activities of everyday life were transformed as well. The integration of thousands of transistors onto single silicon chips, culminating in the microprocessor of the 1970s, made far smaller machines practical, and new uses for computers helped maintain a growing market for small machines even as mainframes remained essential to business, education, and government. By the 1980s, micro- or "personal" computers (PCs) began to challenge the supremacy of mainframes in the workplace, and networks of inexpensive PCs replaced mainframes for many applications.

Large computers remained essential, however, especially to the evolution of computer communication including the Internet. It is through the Internet, a large system of computers linked together to transmit messages and images quickly throughout the world, that most individuals have come to experience and understand the computer and its contributions. Computers have provided easy access to large volumes of information and have made difficult calculations and investigations feasible. They have made it possible to travel into outer space, and to communicate instantaneously with people across the globe. While some have raised concerns about the effect of computers on personal privacy and independence, the computer has brought much of great value to commerce, science, and humanity—far more even than Babbage, Turing, or von Neumann could have imagined.


Further Reading

Aspray, William, ed. Computing Before Computers. Ames: Iowa State University Press, 1990.

Burks, Alice, and Arthur W. Burks. The First Electronic Computer: The Atanasoff Story. Ann Arbor: University of Michigan Press, 1988.

Campbell-Kelly, Martin, and William Aspray. Computer: A History of the Information Machine. New York: Basic Books, 1996.

Ceruzzi, Paul. Reckoners: The Prehistory of the Digital Computer from Relays to the Stored Program Concept, 1935-1945. Westport, CT: Greenwood Press, 1983.

Cohen, I. B., and Gregory W. Welch, eds. Makin' Numbers: Howard Aiken and the Computer. Cambridge, MA: MIT Press, 1999.

McCartney, Scott. ENIAC: The Triumphs and Tragedies of the World's First Computer. New York: Walker, 1999.

Williams, Michael. A History of Computing Technology. Englewood Cliffs, NJ: Prentice Hall, 1985.
