Information Processing: Historical Perspectives

Throughout history, humans have devised methods to compute and to process data more efficiently. One of the earliest computing devices was the abacus, developed in ancient Egypt in the thirteenth century. The abacus is a frame of beads strung on wires that is used to add, subtract, multiply, and divide. Although this simple device predates pencil and paper, it is still in use in the twenty-first century.

THE FIRST CALCULATING MACHINES

To increase the speed and accuracy of computing, the mathematician John Napier (1550–1617) invented logarithms, which greatly simplified arithmetic calculations. In the early 1600s he also invented "Napier's bones," a set of rods made of wood or bone inscribed with multiplication tables. In 1642 the Frenchman Blaise Pascal (1623–1662) invented the first adding machine, called the Arithmetic Machine. Gottfried Leibniz (1646–1716) expanded on Pascal's ideas and in 1671 developed the "step reckoner," which could perform addition, subtraction, multiplication, and division, as well as evaluate square roots.

In 1834 Charles Babbage (1791–1871) designed the forerunner of the computer, the mechanical Analytical Engine. It was designed to perform complicated calculations involving multiplication, division, addition, and subtraction. The Analytical Engine was never actually built, largely because its mechanical parts were extremely slow and prone to breakdown. Even so, it influenced the design of modern computers, for it included the four components of modern computing: input, storage, processing, and output. The machine accepted data input and included a storage location to hold data for processing. It also had a processor to calculate numbers and direct the tasks to be performed, as well as an output device to print results.

In 1884 Herman Hollerith (1860–1929) used electric components to devise a tabulating machine that the U.S. government used to help process data from the 1890 U.S. census. The machine read hand-fed punched cards, allowing metal pins to pass through the holes into cups filled with mercury and thereby complete an electric circuit. Hollerith later improved the design and founded the Tabulating Machine Company in 1896, which eventually became the International Business Machines (IBM) Corporation.

THE FIRST MODERN-DAY COMPUTERS

Howard Aiken (1900–1973), a Harvard professor, is credited with building the first digital computer, called the Mark I. This machine was similar in concept to Babbage's Analytical Engine and was constructed from switches and relays (metal bars surrounded by coils of wire). The 5-ton machine took five years to build, which rendered it obsolete before it was even completed.

At Iowa State University, John V. Atanasoff (1903–1995) and his graduate assistant, Clifford Berry (1918–1963), designed the first electronic digital special-purpose computer in the 1930s. The Atanasoff-Berry Computer used vacuum tubes for storage and arithmetic functions. Improving on this design, John Mauchly (1907–1980) and John Presper Eckert, Jr. (1919–1995) of the University of Pennsylvania designed the first large-scale, general-purpose electronic digital computer in 1945. Built at the University of Pennsylvania, the Electronic Numerical Integrator and Computer, or ENIAC, weighed 30 tons and spanned 1,500 square feet. This huge machine used 18,000 vacuum tubes for storage and arithmetic calculations.

Eckert and Mauchly started their own company, which was later acquired by the Remington Rand Corporation, and designed the Universal Automatic Computer (UNIVAC) in 1951. The UNIVAC became the first commercial computer made available to business and industry. It used magnetic tape to store input and output instead of the punched cards used in earlier machines. IBM capitalized on the concept of commercial applications and developed the IBM 701 and the IBM 752 computer systems. Because of their smaller size relative to the UNIVAC I, the IBM models cornered over 70 percent of the industrial computer market.

Transistors replaced vacuum tubes and sparked the evolution of second-generation computers. Transistors, invented in 1947, were less expensive than vacuum tubes, generated less heat, and made computers more reliable. Because of their smaller size, lower cost, and better reliability, transistor-based computers were in greater demand.

As the demand for computers increased, computer programmers became consumed with the tedious process of programming the machines. Programmers used machine language to give instructions to the computer. Machine language is binary code (made up of 0s and 1s) that a computer understands directly. Each computer model had its own machine language; the UNIVAC, for example, used a different machine language than the IBM 752. To ease the task of programming, machine language was replaced with assembly language. Programmers used assemblers to translate the short, English-like mnemonics of assembly language into machine language. This low-level language improved the speed at which programs could be written.
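
To make the relationship between assembly language and machine language concrete, the following is a minimal sketch of an assembler written in Python. The mnemonics, four-bit opcodes, and three-line program are invented purely for illustration; they do not correspond to any real machine discussed in this article.

    # A toy assembler: translates invented mnemonics into "machine language"
    # (binary words). The instruction set is hypothetical, for illustration only.
    OPCODES = {
        "LOAD": "0001",   # copy a value from a memory cell into the accumulator
        "ADD": "0010",    # add a value from a memory cell to the accumulator
        "STORE": "0011",  # copy the accumulator back to a memory cell
    }

    def assemble(program):
        """Translate assembly-style lines such as 'ADD 6' into binary words."""
        machine_code = []
        for line in program:
            mnemonic, operand = line.split()
            # each word is a 4-bit opcode followed by a 4-bit memory address
            machine_code.append(OPCODES[mnemonic] + format(int(operand), "04b"))
        return machine_code

    # A three-line assembly program: load cell 5, add cell 6, store the result in cell 7.
    source = ["LOAD 5", "ADD 6", "STORE 7"]
    for asm, word in zip(source, assemble(source)):
        print(f"{asm:10s} -> {word}")

Run as written, the sketch prints each assembly instruction beside its eight-bit machine-language equivalent (for example, "LOAD 5 -> 00010101"), which is the kind of translation an assembler performed on a programmer's behalf.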

The use of integrated circuits improved computer development and resulted in third-generation computers. Integrated circuits, developed in 1958, used miniaturized transistors mounted on small silicon chips about a quarter of an inch on each side. These microchips allowed scientists to develop even smaller, faster, and more reliable computers. IBM used microchips to develop its 360 series of computers. Instead of punched cards, users now interacted with their computers through keyboards, monitors, and operating systems.

High-level programming languages were also introduced during the third-generation era. As third-generation computers performed more complex data manipulation, communicating with them became more complicated as well. Programming languages such as FORTRAN and COBOL, first developed in the 1950s, made programming the computer easier. These high-level languages used compilers or interpreters to convert their English-like code into machine language.

FOURTH-GENERATION COMPUTERS

Fourth-generation computers were led by the development of the microprocessor, a processor built on a single semiconductor chip, which Intel produced in 1971. The microprocessor was a large-scale integrated circuit that contained thousands of transistors on a single chip. The development of this chip led to the invention of the first personal computer, and with it the use of computers spread from large businesses and the military to small businesses and homes.

IBM introduced its first home computer in 1981, and Apple developed the Macintosh home computer in 1984. The Intel 4004 chip had placed the central processing unit of a computer on one tiny chip, and this miniaturization eventually led to the development of handheld devices. Handheld devices are portable computers that have many of the capabilities of a desktop computer. One popular handheld device is the personal digital assistant, which allows a user to schedule and organize information.

THE INTERNET AND WORLD WIDE WEB

The powerful capability of microprocessors allowed small computers to be linked together into networks and, ultimately, the Internet. The Internet, conceptualized in the late 1960s by researchers from the Advanced Research Projects Agency of the U.S. Department of Defense, is a network of computer networks that enables communication among computer users. The Internet facilitated the use of electronic mail, which became a commonly used form of communication.

The Internet has been enhanced by the World Wide Web (WWW), which enables computer users to search, view, and disseminate information on a plethora of subjects through Web sites. The WWW was developed in 1990 by Tim Berners-Lee (1955– ). The Internet, coupled with the WWW, has profoundly changed the way industrialized nations communicate, disseminate, and process information.

FIFTH-GENERATION COMPUTERS

Fifth-generation computing devices are currently under development. The focus of this generation is on making computers behave like humans, a goal that the computer scientist John McCarthy (1927– ) named artificial intelligence in the mid-1950s. The field of artificial intelligence includes gaming, expert systems, natural languages, neural networks, and robotics. Gaming involves creating games that allow users to play against the computer. Expert systems are computer applications that perform the tasks of a human expert, such as diagnosing an illness. Natural language processing allows computers to understand human languages such as English or Chinese. Neural networks attempt to function like the brain of a human or animal. Robotics involves creating machines that can use human-like senses such as sight and hearing.

Although scientists have had great difficulty making computers behave and think like humans, there have been advances in this field. In gaming, programmers have developed computer games that can "outthink" humans. In natural language processing, voice recognition software has been developed to convert spoken words to written words: users speak to the computer, and the computer transcribes what they say into text on the screen.

Information processing, or data processing, has become synonymous with computers. The development of the computer, the Internet, and the WWW has vastly improved the way information can be processed, giving society more information-processing capability than ever before. As technology continues to evolve, more information-processing innovations are certain to follow.

see also Hardware; Information Processing; Office Technology


Ronda B. Henderson
