Early Computers

To understand the development of the computer industry during the latter half of the twentieth century, one must look to the demand for computing power before and during World War II, when computing was a means to a very specific and urgent end. Between 1935 and 1945 there was a great need for ballistics computations and other statistical work in support of military efforts; this was time-consuming work carried out by people using rudimentary calculators. During this time, there was a rush to invent single-purpose digital computing machines to speed up the calculation of ballistics problems, firing tables, and code-breaking computations. Several such machines were created by governmental organizations, industrial companies, and office machine companies in the United States, Great Britain, and Germany.

Differential Analyzer

Although there were a number of calculators available for business use in the 1920s, they were not powerful enough to solve scientific computational problems. The first serious attempt at building a computer for scientists was made by Vannevar Bush (1890–1974), an engineer at the Massachusetts Institute of Technology (MIT). In the 1930s, Bush and one of his students, Harold Locke Hazen, built an analog computer called the "differential analyzer." It was a collection of gears, shafts, and wires. It was better than the calculators of the time, but it was still slow and cumbersome, often needing two or three days of set-up time before it could solve a problem.

A faster and more accurate differential analyzer was built in 1935, but it, too, required adjustments with screwdrivers and hammers to prepare it for a run. It weighed about 91 metric tons (200,000 pounds), had 2,000 vacuum tubes and several thousand relays, took up several hundred square feet of floor space, and had about 150 motors and 322 kilometers (200 miles) of wire. Duplicates of the differential analyzer were set up at the U.S. Army's Ballistics Research Laboratory and at the Moore School of Electrical Engineering at the University of Pennsylvania.

Bell Telephone Laboratories Model 1

Other computing devices were being built to serve purposes beyond what the differential analyzer was designed to do. The telephone company needed computing power to help set up telephone connections. At that time, when a rotary telephone was dialed, a number was transmitted to a machine that converted each digit to a four-pulse code. This was not fast enough for telephone switching demands, so Bell Telephone Laboratories engineers and scientists, including American engineer George R. Stibitz (1904–1995), began studying the binary number system. Stibitz felt that the binary system was well suited to the computation because a relay device called a flip-flop had been developed that could register the presence or absence of a current (a binary distinction).

Stibitz built his "Complex Number Computer" in 1937. It converted decimal digits to binary, then converted them back to decimal for the answers. Push buttons were used to make it easier to operate, and in October 1939 it was sent to Bell Labs' New York office under the name of "Model 1."
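
The round trip the machine performed, decimal in, binary arithmetic inside, decimal out, can be sketched in a few lines of modern code. This is only an illustration of the conversion idea, not a model of Stibitz's relay circuitry.

```python
# A minimal sketch (not Stibitz's relay circuitry): converting a decimal
# number to binary for internal calculation and back to decimal for output.

def to_binary(n: int) -> str:
    """Repeatedly divide by 2, collecting remainders (least significant bit first)."""
    bits = []
    while n > 0:
        bits.append(str(n % 2))
        n //= 2
    return "".join(reversed(bits)) or "0"

def to_decimal(bits: str) -> int:
    """Accumulate the value bit by bit, doubling as each digit arrives."""
    value = 0
    for bit in bits:
        value = value * 2 + int(bit)
    return value

# Example: 13 decimal -> 1101 binary -> 13 decimal
assert to_binary(13) == "1101"
assert to_decimal("1101") == 13
```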

The Model 1 may have been the world's first time-sharing computer, because several departments from Bell Labs accessed it remotely, with teletype machines. It may also have been the first remote job-entry computer. Mathematicians at Dartmouth College, in New Hampshire, accessed it through a teleprinter to submit problems to the computer in New York City. The answers came back on a telephone line hook-up in about one minute.

Harvard Mark I

In 1937 American mathematician Howard H. Aiken (1900–1973), then a Harvard graduate student, proposed building a machine based on the work of early computer researchers Charles Babbage and Herman Hollerith. He intended it to be an electromechanical Analytical Engine. The project was called the IBM Automatic Sequence Controlled Calculator (ASCC), or the Harvard Mark I. It was begun in 1939, and like the other computers of its day, it was huge. The Mark I was 15.2 meters (50 feet) long and 2.4 meters (8 feet) tall. Many of its 800,000 components were taken from IBM punched-card machines. It had 805 kilometers (500 miles) of wire, required an enormous amount of energy to run, and weighed about 4.5 metric tons (10,000 pounds). Every day, tons of ice were required to keep the machine cool.

The Mark I's memory had 72 adding registers made of 24 wheels each, and 60 special purpose registers using manual switches. The registers held 23-digit numbers plus the computational sign. Electric contacts were used to sense the number from the wheels. Clutches were used to transfer the number to another wheel for the calculation. Addition could be done in three-tenths of a second, multiplication in five seconds, and division in 11 seconds.

The machine's input came from paper tape. Data was punched on three tapes and instructions on a fourth. It used two electric typewriters for output. There was no keyboard, and it was set up for a run by adjusting 1,400 switches. The Mark I computer was presented to Harvard in 1944. IBM had financed the research and construction costs of approximately half a million dollars. The Harvard Mark I has been called the first fully automatic computer to come into operation. It was already obsolete, however, as scientists and engineers would soon replace electromechanical components with fully electronic ones.

The Z1 in Germany

While Aiken was working on his computers at Harvard, a young engineering student in Germany was also thinking of computing machines that could perform long series of calculations. That student was German inventor Konrad Zuse (1910–1995), who decided on a design similar to Babbage's Analytical Engine, consisting of a storage unit, an arithmetic unit, and a control unit. The control unit would be directed by punched tape to deliver instructions to a selection mechanism, which connected the storage and arithmetic units.

Zuse decided to make it a binary device with a mechanical memory unit, using movable pins in slots to indicate zeroes and ones. This resulted in a compact memory that used only about 0.8 cubic meters (27 cubic feet), which was connected to a crude mechanical calculating unit. The machine was called the Z1 and was produced in 1938.

In 1941 Zuse completed the Z3. It had 2,000 relays and could multiply and divide, as well as extract square roots, in only three seconds. It was a compact machine that acted under program control. Push buttons on the control panel converted decimal numbers into binary and back again. The Z3 was destroyed when an Allied bomb fell on Zuse's apartment building in 1944. In their basic logical design, Zuse's Z-machines were not very different from the microcomputers in use today.

Atanasoff-Berry Computer (ABC)

In 1939 John Vincent Atanasoff (1903–1995), a mathematician and physicist at Iowa State College (now Iowa State University), and a graduate student he recruited, Clifford Berry (1918–1963), began working on the machine that would be called the ABC (Atanasoff-Berry Computer). It used the binary number system and electronic technology. Numbers were stored as charges on electric capacitors mounted on two Bakelite drums and were read off as the drums rotated. Each drum could store 30 binary numbers of up to 50 binary digits.

Data were input in decimal on punched cards, each holding five 15-digit numbers and a sign, and were converted to binary before the calculations were done. The computer was manually operated, with an operator pushing a button to show where the numbers should go, then putting a card in a holder and activating it by closing a contact. The card was read by rows of brushes, similar to the card readers developed later for computer use. To store intermediate results, Atanasoff designed a system that burned the cards with electric sparks. The burnt areas had less resistance than the rest of the card, so the numbers could be read back by applying electric current to the card and measuring the voltage.

The memory was to be bigger than the 300 bits available on commercial machines, so they needed units that would be cheaper and smaller than vacuum tubes. Their choice was paper electrical condensers, which looked like miniature cigarettes. A memory built of condensers would save money but would require recharging from time to time. Atanasoff developed a procedure for recharging that he called "jogging": the memory was regularly "jogged" by rewriting the data as they were read out, much as a person jogs his or her own memory by repeating a phrase or word. The concept of "jogging" influenced the design of computers built after World War II.
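
The same idea survives in modern dynamic memory, which must also be refreshed before its charge leaks away. The sketch below is a simplified software model of refresh-on-read, not Atanasoff's actual circuit; the decay rate and readout threshold are made-up numbers chosen only for illustration.

```python
# A simplified model of ABC-style "jogging": each condenser's charge decays
# over time, and reading a cell rewrites it at full strength before the
# charge falls below the detection threshold. Illustration only, not
# Atanasoff's actual circuit; DECAY and THRESHOLD are invented values.

class CondenserMemory:
    DECAY = 0.8       # fraction of charge remaining after each time step
    THRESHOLD = 0.3   # minimum charge still readable as a 1

    def __init__(self, size: int):
        self.charge = [0.0] * size

    def write(self, address: int, bit: int) -> None:
        self.charge[address] = 1.0 if bit else 0.0

    def tick(self) -> None:
        """One time step of charge leakage."""
        self.charge = [c * self.DECAY for c in self.charge]

    def read(self, address: int) -> int:
        """Read a bit and immediately rewrite it -- the 'jog'."""
        bit = 1 if self.charge[address] > self.THRESHOLD else 0
        self.write(address, bit)   # restore full charge so the value persists
        return bit

memory = CondenserMemory(size=30)
memory.write(5, 1)
for _ in range(3):
    memory.tick()       # charge decays between accesses: 1.0 -> 0.512
print(memory.read(5))   # still above threshold, read as 1 and refreshed
```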

The prototype of the ABC contained what Atanasoff called an "abacus," a plastic disc mounted on a shaft turned by an electric motor. Each side of the disc had 25 condensers arranged as spokes on a wheel, which gave it the capacity of 25 binary digits. It proved to Atanasoff and Berry that an electronic machine using binary numbers with a condenser memory was possible. It also showed that the "jogging" technique was feasible. However, Atanasoff and Berry were unable to complete the ABC as their work was interrupted by World War II.

Developments in Great Britain

As the Germans advanced across Europe, a team of scientists and engineers came together at Bletchley Park, a mansion northwest of London that housed the Government Code and Cypher School. Their goal was to design machines that would break the codes generated by the Enigma, the cryptographic machine that the Germans used to encode and send messages.

The German Enigma resembled a typewriter. The operator selected a key setting, typed the original message on the keyboard, and read off each enciphered letter from a lampboard so the coded message could be written down and transmitted. Every message could be sent with a different key, giving about a trillion possible code combinations, and the key was changed as often as three times a day to make the code more difficult to break.

British mathematician Alan Turing (1912–1954), one of the scientists on the project, developed an algorithm to sift through the possible combinations and find the key. It was implemented on a special-purpose machine called the Bombe, an electromechanical relay machine with wheels similar to those of the Enigma. Its goal was to work out the key of a cipher as quickly and accurately as possible; it did not decode the messages itself, but found the key, and people then decoded the messages. The Bombe, of which many individual machines were built, is credited with saving many lives during World War II.
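
The Bombe's actual logic was far more elaborate, but the core idea of a key search guided by a known fragment of plaintext (a "crib") can be illustrated with a toy cipher. The example below uses a simple Caesar shift, chosen purely for brevity; nothing about it reflects the Enigma's rotor mechanism.

```python
# A toy illustration of crib-based key search. The cipher here is a simple
# Caesar shift, not the Enigma, and the Bombe's internals were far more
# sophisticated -- but the core idea is the same: try every possible key and
# keep the one whose decryption contains a known plaintext fragment (a crib).

import string

ALPHABET = string.ascii_uppercase

def decrypt(ciphertext, key):
    """Shift every letter back by `key` positions in the alphabet."""
    return "".join(ALPHABET[(ALPHABET.index(c) - key) % 26] for c in ciphertext)

def find_key(ciphertext, crib):
    """Return the first key whose decryption contains the crib, else None."""
    for key in range(26):
        if crib in decrypt(ciphertext, key):
            return key
    return None

ciphertext = "ZHDWKHUUHSRUWEHUOLQ"          # "WEATHERREPORTBERLIN" shifted by 3
key = find_key(ciphertext, crib="WEATHER")   # routine weather reports made handy cribs
print(key, decrypt(ciphertext, key))         # -> 3 WEATHERREPORTBERLIN
```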

When the Germans began using a more sophisticated cipher machine for their high-level teleprinter traffic, known to the British as Fish, the British set out to build a better machine of their own. The result was the Heath Robinson, an electronic machine using vacuum tubes, named after a British cartoonist whose drawings of far-fetched machines were well known. For input, the Heath Robinson used two synchronized photoelectric paper tape readers that could read 2,000 characters per second, the equivalent of a 300-page book being read in just over five minutes. Its output was a primitive printer that could print 15 characters per second. An adder carried out the binary calculations used to break the codes of the German machine. The Heath Robinson proved how fast and powerful electronic computing could be, but it still could not keep up with the demands being placed on it.

This led to the development of Colossus. The Colossus is considered to be an electronic computer, but it was a special-purpose machine, unsuited for anything other than deciphering codes. It was completed in December 1943. It had 1,800 vacuum tubes that counted and compared numbers and performed simple arithmetic calculations. It was fed information on paper tape at a rate of 5,000 characters per second, more than twice the rate of the Heath Robinson. It had no internal memory, but the users adjusted its operation as it came close to deciphering a message. The program was fed into it by an array of switches and phone jacks; data were entered separately on tape.

Yet 5,000 characters per second was still not fast enough, so several processing units were run in parallel, an approach now called parallel processing. A parallel processing machine, of which there are different types, speeds up computing by performing tasks together, in parallel, rather than one after another.

Colossus had five different processors working in parallel. Each processor read a tape at 5,000 characters per second, so the total was 25,000 characters per second. This was made possible by the addition of shift registers, which allowed Colossus to read the tapes in parallel, and an internal clock that kept the parts of the machine working in synch. It was the best code breaker of its day.
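
As a rough modern analogy, not a model of Colossus's circuitry, the arithmetic of that speed-up can be sketched in code: five units each scanning at 5,000 characters per second deliver an aggregate 25,000 characters per second, provided the work can be divided among them.

```python
# A rough modern analogy for the speed-up Colossus got from parallelism:
# scanning five tapes (or five sections of one tape) at once instead of one
# after another. The numbers below are illustrative, not measurements.

from concurrent.futures import ThreadPoolExecutor

RATE = 5_000  # characters per second per processing unit

def scan(tape: str) -> int:
    """Count how many characters on the tape match a target pattern."""
    return sum(1 for ch in tape if ch == "X")

tapes = ["XOXX" * 2_500 for _ in range(5)]   # five tapes of 10,000 characters

# Sequential: one unit handles every tape in turn.
sequential_time = sum(len(t) for t in tapes) / RATE

# Parallel: five units each handle one tape at the same time.
with ThreadPoolExecutor(max_workers=5) as pool:
    counts = list(pool.map(scan, tapes))
parallel_time = max(len(t) for t in tapes) / RATE

print(counts)                          # [7500, 7500, 7500, 7500, 7500]
print(sequential_time, parallel_time)  # 10.0 seconds vs. 2.0 seconds
```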

ENIAC (Electronic Numerical Integrator and Computer)

John Mauchly (1907–1980), an American meteorologist interested in doing weather calculations, set out to build an inexpensive digital computer to replace the calculators that were not fast enough for his needs. In 1941 he discussed his ideas with J. Presper Eckert (1919–1995), an American engineer, and in April 1943 the Moore School of Electrical Engineering received a contract from the Ballistics Research Laboratories to build a computer to calculate shell (munitions) trajectories. Mauchly and Eckert started work on the Electronic Numerical Integrator and Computer (ENIAC).

ENIAC had 18,000 vacuum tubes, 70,000 resistors, 10,000 capacitors, 6,000 switches, and 1,500 relays. It took up about 167 square meters (1,800 square feet) and weighed about 27 metric tons (60,000 pounds). It required 160 kilowatts of power and was 30.5 meters (100 feet) long, 3.1 meters (10 feet) high, and 0.9 meters (3 feet) deep. Two great 20-horsepower blowers were needed to cool it, since it generated 150 kilowatts of heat. It cost more than $486,000.

ENIAC was 500 times faster than the Harvard Mark I and could perform 5,000 operations per second. It did an addition in two-tenths of a millisecond, a multiplication in 2.8 milliseconds, and a division in 24 milliseconds, which was extraordinary for that time.

ENIAC was a decimal machine working with numbers up to 20 digits long. The numbers were sent to the central processing unit by a transmitter made of relays connected to an IBM card reader. They were fed through the card reader at 125 cards per minute. Earlier machines, like the Harvard Mark I, were programmed with punched cards or paper tape so the program could be changed easily. This worked because their computational speed, being electromechanical, matched that of the paper tape readers and card readers. However, ENIAC's speed of 5,000 operations per second outstripped that of the card and tape readers. For that reason, Mauchly and Eckert decided to wire the machine specifically for each problem. This was similar to the plug boards used in electronic business machines or punched card equipment.

Each of ENIAC's problems was set up on a plug board similar to that used by punched card machines. If the program was complicated, it could take several days to set up. The later idea of creating a computer that could use a stored program came from this time-consuming effort. ENIAC became operational in November 1945, too late to help with the war effort, but it was a model for computers to come. It was highly regarded for its simplicity and careful planning. The ENIAC was dismantled in October 1955.

EDVAC (Electronic Discrete Variable Automatic Computer)

When Hungarian mathematician John von Neumann (1903–1957) was told of the work on the ENIAC, he arranged to become a consultant to the project. He played a key role in the design of the subsequent EDVAC, starting in 1944. The new machine was to hold its instructions in an internal memory, removing the need for the plugging and replugging that programming the ENIAC required; the instructions could be changed internally.

EDVAC was given delay-line storage instead of vacuum-tube storage. The delay lines used the binary number system; each had a capacity of 1,024 bits, enough to store 32 words of 32 bits. It was estimated that the EDVAC would require between 2,000 and 8,000 words of storage, necessitating between 64 and 256 delay lines. This was a large amount of equipment, but it was still smaller than the ENIAC. When completed in 1951, the EDVAC had some 3,500 tubes, but its importance lies in the fact that it embodied the stored-program concept and the "von Neumann machine," two ideas that would greatly influence the design of computers. The stored-program computer is still in use today: it allows the instructions to be fed in with the data, whereas earlier computers, such as the Bell Labs Model 5 and the Heath Robinson, had separate tapes for the data and the instructions.

A von Neumann machine is the model on which most machines are built today. It executes the instructions sequentially, as opposed to a parallel processor, which supports multiple operations at the same time. EDVAC became operational at the end of 1951 and was active until 1962. Its separate components (memory, central control, arithmetic unit, and input and output) were introduced by von Neumann.
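
A minimal sketch of the stored-program, von Neumann pattern is given below. The instruction set is invented for illustration and is not EDVAC's: program and data share one memory, and a control loop fetches, decodes, and executes one instruction at a time.

```python
# A toy stored-program machine (an invented instruction set, not EDVAC's).
# Instructions and data share one memory, and the control unit fetches,
# decodes, and executes them sequentially -- the von Neumann pattern.

def run(memory: list) -> list:
    accumulator = 0
    pc = 0                      # program counter: address of next instruction
    while True:
        op, arg = memory[pc]    # fetch
        pc += 1
        if op == "LOAD":        # decode and execute
            accumulator = memory[arg]
        elif op == "ADD":
            accumulator += memory[arg]
        elif op == "STORE":
            memory[arg] = accumulator
        elif op == "HALT":
            return memory

# Program (addresses 0-3) and data (addresses 4-6) live in the same memory.
memory = [
    ("LOAD", 4),    # 0: accumulator <- memory[4]
    ("ADD", 5),     # 1: accumulator <- accumulator + memory[5]
    ("STORE", 6),   # 2: memory[6] <- accumulator
    ("HALT", None), # 3: stop
    2, 3, 0,        # 4, 5, 6: data
]
print(run(memory)[6])   # -> 5
```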

Manchester Mark I and EDSAC (Electronic Delay Storage Automatic Calculator)

After security was lifted from the ENIAC project, the dean of the Moore School organized a summer school to make sure that those outside of the project would know of its results. The lectures took place in July and August of 1946. They attracted many of the leading scientists of the day. Among them were British computer scientists, including Maurice V. Wilkes.

ENIAC was explained in detail during the lectures, but the EDVAC was not discussed since it was still a classified project. In retrospect, one can see the connection between these lectures and various projects carried out by governments, universities, and industrial laboratories in Great Britain and the United States. Great Britain was the only European country not too devastated by the war to carry out a computer project.

British mathematician Max Newman (1897–1984), one of the researchers behind the Colossus computer, launched the computer project at Manchester University in England. One of his university colleagues, British engineer Sir Frederic Williams (1911–1977), developed a memory system based on the cathode ray tube (CRT). A primitive machine using this memory was developed in 1947, but it had no input or output devices; the program was entered using push-buttons and the results were read from the tubes. Williams's project prevailed over Newman's original plans. The university's Manchester Mark I was completed in 1948, incorporating the stored-program concept and proving that the idea was achievable. It stored 128 40-bit words on the tubes and had additional memory in the form of magnetic drums.

Inspired by the Moore School lectures, Cambridge University professor Maurice V. Wilkes (1913–2010) started work on a stored-program computer. By February 1947 he had built a working delay-line store that could hold bit patterns for long periods of time. Encouraged by this, he went on to construct the full machine. Despite the short supply of electronic components in postwar Britain, the EDSAC (Electronic Delay Storage Automatic Calculator) began to take shape. Its control and arithmetic units were housed in three long racks, each 1.8 meters (six feet) tall, with the vacuum tubes left exposed to keep them from overheating. Input was a tape reader, output was on a teleprinter, and the programs were punched on telegraph tape.

Although EDSAC was very large (it had 3,000 tubes and consumed 30 kilowatts of electric power), it was smaller than the ENIAC and had one-sixth the tube count. It could perform an addition in 1.4 milliseconds, and its users developed one of the first assembly languages, as well as a library of programming procedures called subroutines.

Whirlwind

In the early 1950s, the United States and the Soviet Union were engaged in the Cold War, and MIT engineers were working on a computer to help the U.S. military with its computational needs. After considering an analog machine, the engineers decided that the computer would be digital and that it would need a large memory to store the information required to control an aircraft trainer, a "real-time" exercise. These memory requirements were beyond the capabilities of the CRTs and delay lines of the day.

Jay Forrester (1918–2016), an engineer working on the project, thought of alternative designs and eventually settled on a three-dimensional arrangement, the "core memory." He experimented with different media from which to construct it, first trying rolled-up bits of magnetic tape, then iron-bearing ceramics molded into tiny rings and mounted on grids. These became the magnetic "cores" of the powerful computer that would be named Whirlwind. The Whirlwind began working in 1951 and helped coordinate New England military radar units in scanning the skies for Soviet planes during the Cold War.
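
The addressing idea behind core memory can be sketched in a greatly simplified software model. This is an illustration of the grid arrangement only, not Whirlwind's circuitry: each core sits at the intersection of a row wire and a column wire, and stacking one grid per bit gives every row-and-column address a full word.

```python
# A greatly simplified software model of grid-addressed core memory: each bit
# sits at a (row, column) intersection of one plane, and stacking planes gives
# every (row, column) address a multi-bit word. An illustration of the
# addressing idea, not Whirlwind's actual hardware.

class CoreMemory:
    def __init__(self, rows: int, cols: int, bits_per_word: int):
        # One 2-D grid ("plane") of cores per bit of the word.
        self.planes = [[[0] * cols for _ in range(rows)]
                       for _ in range(bits_per_word)]

    def write(self, row: int, col: int, word: int) -> None:
        for bit_index, plane in enumerate(self.planes):
            plane[row][col] = (word >> bit_index) & 1

    def read(self, row: int, col: int) -> int:
        word = 0
        for bit_index, plane in enumerate(self.planes):
            word |= plane[row][col] << bit_index
        return word

# A 32x32 array of 16-bit words (16 bits was Whirlwind's word length).
memory = CoreMemory(rows=32, cols=32, bits_per_word=16)
memory.write(3, 17, 0b1010101010101010)
print(bin(memory.read(3, 17)))   # -> 0b1010101010101010
```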

Two decades after Whirlwind debuted, times had changed. At the time of Aiken and his colleagues, there were only a handful of computer designers and engineers, and most knew each other. By the 1960s and 1970s, the field was replete with practitioners, enough to populate a small city, and the pioneer computers seemed almost forgotten.

see also Babbage, Charles; Digital Computing; Early Pioneers; Memory; Vacuum Tubes.

Ida M. Flynn
