Computer, Digital


A digital computer is a programmable device that processes information by manipulating symbols according to logical rules. Digital computers come in a wide variety of types, ranging from tiny, special-purpose devices embedded in cars and appliances to the familiar desktop computer, the minicomputer, the mainframe, and the supercomputer. The fastest supercomputer, as of late 2006, could execute up to 135.5 trillion instructions (elementary computational operations) per second, almost four times the 2003 record; this record, too, is certain to be broken. The impact of the digital computer on society has been tremendous. It is used to run everything from spacecraft to factories, healthcare systems to telecommunications, banks to household budgets. Since its invention during World War II, the electronic digital computer has become essential to the economies of the developed world.

The story of how the digital computer evolved goes back beyond the calculating machines of the 1600s to the pebbles (in Latin, calculi) that the merchants of imperial Rome used for counting, and to the abacus of the fifth century BC. Although the earliest devices could not perform calculations automatically, they were useful in a world where mathematical calculations, laboriously performed by human beings in their heads or on paper, tended to be riddled with errors. Like writing itself, mechanical aids to calculation such as the abacus may have first developed to make business easier and more profitable to transact.

By the early 1800s, with the Industrial Revolution well under way, errors in mathematical data had assumed new importance; faulty navigational tables, for example, were the cause of frequent shipwrecks. Such errors were a source of irritation to Charles Babbage (1791–1871), a young English mathematician. Convinced that a machine could do mathematical calculations faster and more accurately than humans, Babbage, in 1822, produced a small working model of what he called his difference engine. The difference engine's arithmetic was limited, but it could compile and print mathematical tables with no more human intervention than a hand to turn the handles at the top of the device. Although the British government was impressed enough to invest £17,000 in the construction of a full-scale difference engine, a sum equivalent to millions of dollars in today's money, the machine was never built. The project came to a halt in 1833 in a dispute over payments between Babbage and his workmen.

By that time, Babbage had already started to work on an improved version, the analytical engine, a programmable machine that could perform all types of arithmetic functions. The analytical engine had all the essential parts of the modern computer: a means of entering a program of instructions, a memory, a central processing unit, and a means of outputting results. For input and programming, Babbage used punched cards, an idea borrowed from French inventor Joseph Jacquard (1752–1834), who had used them in his revolutionary weaving loom in 1801.

Although the analytical engine has gone down in history as the prototype of the modern computer, a full-scale version was never built. Among the obstacles were lack of funding and manufacturing methods that lagged well behind Babbage's vision.

Fewer than 20 years after Babbage's death, an American named Herman Hollerith (1860–1929) was able to make use of a new technology, electricity, when he submitted to the United States government a plan for a machine that could compute census data. Hollerith's electromechanical device tabulated the results of the 1890 U.S. census in less than six weeks, a dramatic improvement over the seven years it had taken to tabulate the results of the 1880 census. Hollerith went on to found the company that ultimately emerged as International Business Machines Corporation (IBM).

World War II was the driving force behind the next significant stage in the evolution of the digital computer: greater complexity, greater programmability, and greater speed through the replacement of moving parts by electronic devices. These advances were made in designing the Colossus, a special-purpose electronic computer built by the British to decipher German codes; the Mark I, a gigantic electromechanical device constructed at Harvard University under the direction of U.S. mathematician Howard Aiken (1900–1973); and the ENIAC, a large, fully electronic machine that was faster than the Mark I. Built at the University of Pennsylvania under the direction of U.S. engineers John Mauchly (1907–1980) and J. Presper Eckert (1919–1995), the ENIAC employed some 18,000 vacuum tubes.

The ENIAC was general-purpose in principle, but switching from one program to another meant that part of the machine had to be disassembled and rewired. To avoid this tedious process, John von Neumann (1903–1957), a Hungarian-born American mathematician, proposed the concept of the stored program: the technique of coding the program in the same way as the stored data and keeping it in the computer's own memory for as long as needed. The computer could then be instructed to change programs, and programs could even be written to interact with one another. For coding, von Neumann proposed using the binary numbering system, which uses only 0 and 1, as opposed to the decimal system, which uses the ten digits 0 through 9. Because 0 and 1 can readily be symbolized by the "on" and "off" states of a switched electric current, computer design was greatly simplified.
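The link between decimal numerals, binary digits, and switch states can be made concrete with a short sketch. The Python fragment below is only an illustration of the idea (the function names and the eight-bit width are invented for the example), not a description of any historical machine.

    def to_binary(n, width=8):
        """Return the binary digits of a non-negative integer, most significant first."""
        return [(n >> i) & 1 for i in reversed(range(width))]

    def as_switches(bits):
        """Read each binary digit as the state of an electric switch."""
        return ["on" if b else "off" for b in bits]

    bits = to_binary(13)      # decimal 13 is 00001101 in an eight-bit binary register
    print(bits)               # [0, 0, 0, 0, 1, 1, 0, 1]
    print(as_switches(bits))  # ['off', 'off', 'off', 'off', 'on', 'on', 'off', 'on']

Because each position holds only a 0 or a 1, a row of two-state devices such as relays, vacuum tubes, or transistors can store the number directly.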

Von Neumann's concepts were incorporated into the first generation of large computers that followed in the late 1940s and 1950s. All of these machines were dinosaurs by today's standards, but in them the essential design principles on which today's billions of digital devices operate were worked out.

The digital computer is termed "digital" to distinguish it from the analog computer. Digital computers manipulate symbols (not necessarily digits, despite the name), while analog computers manipulate electronic signals or other physical phenomena that act as models, or analogs, of other phenomena or mathematical variables. Today, the word "computer" has become effectively synonymous with "digital computer" because analog computation is now rare.
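A rough way to see the distinction is that a digital machine combines exact symbols, whereas an analog machine combines continuously variable physical quantities that can only be generated and read with limited accuracy. The sketch below is a deliberately simplified caricature, assuming a made-up measurement tolerance; it is not a model of any real analog computer.

    import random

    # Digital addition: the operands are discrete symbols, so the result is exact.
    digital_sum = 2 + 3  # always exactly 5

    # Analog addition (caricature): each operand is a physical quantity, such as a
    # voltage, standing in for a number, and every reading carries a small error.
    def analog_quantity(x, tolerance=0.01):
        return x + random.uniform(-tolerance, tolerance)

    analog_sum = analog_quantity(2.0) + analog_quantity(3.0)  # close to 5, rarely exactly 5
    print(digital_sum, analog_sum)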

Although all practical computer development to date has obeyed the principles of binary logic laid down by von Neumann and the other pioneers, and these principles are sure to remain standard in digital devices for the near future, much recent research has focused on quantum computers. Such devices would exploit quantum-mechanical properties of matter that differ fundamentally from the on-off, yes-no logic of conventional digital computers.

See also Analog signals and digital signals; Computer, analog; Computer software.

Resources

BOOKS

Campbell-Kelly, Martin and William Aspray. Computer: A History of the Information Machine. Boulder, CO: Westview Press, 2004.

Hennessy, John L. and David A. Patterson. Computer Architecture: A Quantitative Approach. 4th ed. San Francisco: Morgan Kaufmann, 2006.

Laing, Gordon. Digital Retro: The Evolution and Design of the Personal Computer. San Francisco: Sybex, 2004.

Stokes, Jon. Inside the Machine: A Practical Introduction to Microprocessors and Computer Architecture. San Francisco: No Starch Press, 2006.


Computer, digital

The digital computer is a programmable electronic device that processes numbers and words accurately and at enormous speed. It comes in a variety of shapes and sizes, ranging from the familiar desktop microcomputer to the minicomputer, mainframe, and supercomputer. The supercomputer is the most powerful in this list and is used by organizations such as NASA (National Aeronautics and Space Administration) to process upwards of 100 million instructions per second.

The impact of the digital computer on society has been tremendous; in its various forms, it is used to run everything from spacecraft to factories, health-care systems to telecommunications, banks to household budgets.

The story of how the digital computer evolved is largely the story of an unending search for labor-saving devices. Its roots go back beyond the calculating machines of the 1600s to the pebbles (in Latin, calculi ) that the merchants of Rome used for counting and to the abacus of the fifth century b.c. Although none of these early devices were automatic, they were useful in a world where mathematical calculations performed by human beings were full of human error.

The Analytical Engine

By the early 1800s, with the Industrial Revolution well underway, errors in mathematical data had grave consequences. Faulty navigational tables, for example, were the cause of frequent shipwrecks. English mathematician Charles Babbage (1791–1871) believed a machine could do mathematical calculations faster and more accurately than humans. In 1822, he produced a small working model of his Difference Engine. The machine's arithmetic functioning was limited, but it could compile and print mathematical tables with no more human intervention needed than a hand to turn the handles at the top of the model.

Babbage's next invention, the Analytical Engine, had all the essential parts of the modern computer: an input device, a memory, a central processing unit, and a printer.

Although the Analytical Engine has gone down in history as the prototype of the modern computer, a full-scale version was never built. Even if the Analytical Engine had been built, it would have been powered by a steam engine, and given its purely mechanical components, its computing speed would not have been great.

In the late 1800s, American engineer Herman Hollerith (1860–1929) made use of a new technology, electricity, when he submitted to the United States government a plan for a machine that was eventually used to compute 1890 census data. Hollerith went on to found the company that ultimately became IBM.

Mammoth modern versions

World War II (1939–45) marked the next significant stage in the evolution of the digital computer. Out of it came three mammoth computers. The Colossus was a special-purpose electronic computer built by the British to decipher German codes. The Mark I was a gigantic electromechanical device constructed at Harvard University. The ENIAC was a fully electronic machine, much faster than the Mark I.

The ENIAC operated on some 18,000 vacuum tubes. If its electronic components had been laid side by side two inches apart, they would have covered a football field. With the stored-program concept proposed by Hungarian-born American mathematician John von Neumann, a computer could be instructed to change programs, and the programs themselves could even be written to interact with each other. For coding, von Neumann proposed using the binary numbering system, with only the digits 0 and 1, rather than the 0 through 9 of the decimal system. Because 0 and 1 correspond to the on and off states of an electric current, computer design was greatly simplified.

Since the ENIAC, advances in programming languages and electronics, among them the transistor, the integrated circuit, and the microprocessor, have brought about computing power in the forms we know today, ranging from the supercomputer to far more compact personal models.

Future changes to so-called "computer architecture" are directed at ever greater speed. Ultra-high-speed computers may run on superconducting circuits that operate at extremely cold temperatures. Integrated circuits that house hundreds of thousands of electronic components on one chip may become commonplace on our desktops.

[See also Computer, analog; Computer software]

digital computer

views updated Jun 27 2018

digital computer A computer that operates on discrete quantities (compare analog computer). All computation is done within a finite number system and with limited precision, associated with the number of digits in the discrete numbers. The numerical information is most often represented by the use of two-state electrical phenomena (on/off, high voltage/low voltage, current/no current, etc.) to indicate whether the value of a binary variable is a “zero” or a “one”. Usually there is automatic control or sequencing (through a program) of operations so that they can be carried through to completion without intervention. See also discrete and continuous systems.
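The phrase "finite number system ... with limited precision" can be illustrated with a brief sketch of fixed-width register arithmetic. The eight-bit width below is an arbitrary choice for the example, not something specified by the definition.

    WIDTH = 8                 # number of binary digits in the hypothetical register
    MODULUS = 2 ** WIDTH      # only 256 distinct values are representable

    def register_add(a, b):
        """Add two values the way a fixed-width register would, discarding any carry out."""
        return (a + b) % MODULUS

    print(register_add(100, 100))  # 200: still fits in eight bits
    print(register_add(200, 100))  # 44: 300 exceeds the finite range and wraps around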
