The History, Development, and Importance of Personal Computers



The personal computer was introduced in 1975, a development that made computing accessible to individuals. Until then, computers had been very large and expensive, operated mainly by big companies. The first modern computers were created in the 1950s and rest on a long theoretical and technical foundation. The use of computers has profoundly affected our society: the way we do business, communicate, learn, and play. Their use has spread to all literate areas of the world, as have communication networks that have few limits. The personal computer has inspired new industries and new companies, and it has made millionaires and billionaires of some of their founders. It has also changed the English language and shifted power in many businesses from those who procure the money to those who create the product.


Human beings have devised many ways to help themselves do calculations. Before the creation of the modern large computer, and its refinement into the personal computer, a number of discoveries and inventions were necessary. The decimal system, the binary number system, and Boolean algebra are all required to make computers work. The eighteenth-century study of electricity was also essential, as was the mid-nineteenth-century knowledge of how to harness it. The first mechanical calculators appeared in the seventeenth century, using wheels and gears to do arithmetic. In the early nineteenth century, Joseph Jacquard (1752-1834) invented a loom that used punch cards to tell the mechanism which threads to use in what combinations and colors; with it he wove complex patterns in cloth, still called Jacquard designs. In the same century Charles Babbage (1791-1871) designed a "Difference Engine" to calculate and print simple mathematical tables. He improved on it with his "Analytical Engine," which was to use punch cards to perform complex calculations, though he never had the funds to build one. Thus, by the end of the nineteenth century, many elements necessary to make a modern computer work were in place: punch cards, input devices, mathematical systems, storage capabilities, and power.
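The binary and Boolean foundations mentioned above can be illustrated with a short sketch in Python (the particular numbers and operations are illustrative choices, not taken from the original text):

```python
# Decimal 13 written in binary is 1101: one 8, one 4, no 2, one 1.
assert bin(13) == "0b1101"

# Boolean algebra on single bits: AND, OR, NOT.
a, b = 1, 0
assert (a & b) == 0   # AND is 1 only if both bits are 1
assert (a | b) == 1   # OR is 1 if either bit is 1
assert (1 - a) == 0   # NOT inverts a single bit

# Binary addition, the basis of a computer's arithmetic:
# 1101 (13) + 0011 (3) = 10000 (16)
assert 0b1101 + 0b0011 == 0b10000
```

Every operation a computer performs is ultimately built from such bit-level arithmetic and Boolean logic, which is why these mathematical systems were prerequisites for the machines described here.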

The genesis of the modern computing machine came in 1888, when the American inventor Herman Hollerith (1860-1929) devised a calculating machine to tabulate the U.S. Census for 1890. It was mainly a card reader, but it was the first commercially successful tabulating machine, the grandfather of modern computers, and more than 50 were built and sold. Hollerith's firm, the Tabulating Machine Company, was the start of the computer business in the United States. When Hollerith sold out in 1911, it was merged into the Computing-Tabulating-Recording Company, which in 1924 became International Business Machines Corporation (IBM). IBM dominated the office equipment industry for decades with its calculators, electric typewriters, and time clocks.

Digital computers appeared between 1939 and 1944, but they were only interim steps in computer development. These machines were huge and expensive, used by large organizations to do bookkeeping and mathematics quickly and accurately. The earliest were electromechanical, controlled by relays and switches, and all needed huge air-conditioning units to keep them cool. Because of this, and the cost of a single unit, the use of computers was very limited. The first general-purpose electronic digital computer, ENIAC, was completed in 1945. Its major components were vacuum tubes, devices that control electric currents or signals and that commonly powered radios and television sets at the time. Programming ENIAC was a long, tedious process of resetting switches and cables by hand.

A new, more advanced computer was built in 1951 by the Remington Rand Corporation. Called UNIVAC, it was the first commercially available computer; it was very expensive, very large, and still powered by vacuum tubes. IBM manufactured its first large mainframe computer in 1952 and offered it for sale to companies, governments, and the military. For nearly 30 years, IBM was the most successful company in information technology. The invention of the transistor in 1947 began the trend toward smaller computers and, eventually, the personal computer. Created by three scientists at Bell Labs, who later received the Nobel Prize for it, the transistor does the job of a vacuum tube at a fraction of its size. It is solid-state, with no moving parts, durable, and works immediately without needing to warm up as a vacuum tube does. Transistors range in size from a few centimeters across to a thousandth of a millimeter. They are smaller, lighter, cheaper to make and to run, and more reliable than tubes; they use very little power, and by the early 1960s they had replaced tubes. Transistors control all of the operations in a computer as well as its peripheral devices.

The first fully transistorized large computer was built by Control Data Corporation in 1958, and IBM unveiled its own version in 1959. These were expensive machines designed for large corporate tasks. All the essential parts of a modern personal computer had been invented by the early 1960s. A computer chip is a tiny piece of silicon, a non-metallic element, with complex electronic circuits built into it. The integrated circuit links transistors together to create a complete circuit on a single chip. A microprocessor goes further, placing a computer's entire central processing unit on one chip, so the working parts of a computer can be contained on a handful of chips. This innovation continued to shrink the size of computers. The largest machines, like those from Cray and IBM, were called mainframes; smaller, cheaper ones were called minicomputers. By the end of the 1960s many industries and businesses had come to rely on computers and computer networks, and the personal computer was just around the corner.

Besides the hardware that makes up a computer, the most important element in making it work is the program that tells it what to do. The first programmer was Ada Byron (1815-1852), daughter of the British poet Lord Byron, who devised theoretical steps for Babbage's machines. BASIC, introduced in 1964, was an early programming language simple enough for almost anyone to learn. Soon it became necessary to create more complex sets of languages and instructions, known collectively as software. Microsoft Corporation was founded in 1975 by Bill Gates (1955- ) and Paul Allen (1953- ) to create and market software for personal computers. As computers increased in power, speed, and the variety of functions they performed, the size and complexity of programs also expanded; many modern programs contain tens of millions of lines of instructions, and some are essential to running the machine and are built into it. In early computers the user had to write his or her own programs, but today almost no personal computer owner programs the machine from scratch. Software is delivered on floppy disks or compact discs, or comes already installed, and enables a user to create written documents, display pictures, play sounds and games, make charts, and gain access to the Internet.


Enormous changes have come about in the past 30 years as a result of the development of computers in general, and personal computers in particular. The personal computer ranks as one of the most important inventions of the twentieth century. Computers are used in government, law enforcement, banking, business, education, and commerce, and have become essential in scientific, political, and social research as well as in aspects of medicine and law. Everyone is affected by the manipulation and storage of data. These developments also have negative consequences: some people use computers for fraud, malicious mischief, and deception, and such activities have spawned the need for computer security and a new category of technical crime fighters.

At first, the personal computer was defined as a machine usable and programmable by one person at a time and able to fit on a desk: inexpensive, accessible, simple enough for most people to use, and small enough to be transportable. Claims to the title of first personal computer are numerous and depend on definition. One of the first small computers was a desktop model built by Hewlett-Packard in 1972. It had all the basics (a language, a memory storage device, a keyboard, and a display terminal), but because it was built for scientists and engineers, it was not sold on the general market. The first personal computer available for purchase was the Altair 8800, introduced, described, and pictured in the January 1975 issue of Popular Electronics magazine. It came as a kit to be assembled and was aimed at hobbyists who liked to build their own radios and other electronic devices.

The first personal computer that was fully assembled and offered for sale on the general market was the Apple I, built by 25-year-old college dropout Steven Wozniak (1950- ) in a garage in Sunnyvale, California. With his friend Steven Jobs (1955- ), Wozniak showed the new machine at the first Computer Show in Atlantic City in 1976. It astonished viewers with its small, compact size and its speed, but it did not sell, and Wozniak redesigned it. When the Apple II was unveiled in 1977, encased in a plastic cover and offering color graphics and BASIC, orders climbed; when the spreadsheet program VisiCalc arrived for it in 1979, they soared. No established company had been willing to invest in a machine built in a garage, so Jobs and Wozniak formed the Apple Computer Company, incorporated in 1977, moved out of the garage, and hired people to manufacture the machine.

Soon many individuals and companies leapt into the personal computer market. Some computers were designed for the knowledgeable hobbyist, while others followed Apple's lead and were made for people who wanted the computer to do something and did not care how it worked. Tandy, through its Radio Shack stores; Texas Instruments, which had built the first hand-held electronic calculator; Commodore; and other companies began to build personal computers for sale. Some prospered, some failed.

When IBM finally entered the personal computer market in 1981, it had an immediate impact, even though its machine had serious limitations: no hard disk drive, little software, and limited graphics. But it did have the magic letters on the front: IBM. Many customers felt that if IBM, already called "Big Blue," built a computer, it had to be good, and IBM's entry convinced many people that personal computers were here to stay. IBM sold 20,000 machines in the first few months and could have sold 50,000, but it was not geared up to manufacture that many. Its design and refinements have been followed by many other manufacturers, and IBM PCs and their clones now dominate the computer market.

Just as few computer owners program their machines, few transport them. For those who do, a new type of personal computer has appeared: the laptop, popular among students, researchers, and business travelers, along with the newer palm or hand-held computers.

Large mainframe computers changed the way businesses ran and kept records. Personal computers changed the way individuals did business, kept family records, did their taxes, entertained themselves, and wrote letters. Even those who fear or shun computers use them or come into contact with them every day. Anyone who uses an ATM to deposit or withdraw money is using a dedicated computer; paying for groceries or gasoline with a credit card involves a computer; the internal systems of automobiles are run by computers. Computer literacy has become a necessary skill for technical and scientific jobs and is becoming a requirement for many others, from bank tellers and salespeople to librarians and even restaurant servers who use computers as part of their daily work.

Today the definition of a personal computer has changed because of varied uses, new systems, and new connections to larger networks. A personal computer is now one used by a single operator at a business, a library, or home. Most personal home computers are used for accounting, playing games, or word processing, and they have become appliances that provide entertainment as well as information. They are affordable, and anyone can learn to use them. An increasing number of people do business at home on their own personal computers, or on one provided by an employer, and need to travel to a place of employment only a few days a week.

Personal computers are also widely used by small enterprises like restaurants, cleaning shops, motels, and repair shops, and they are often linked together in networks in larger organizations like chambers of commerce, publishing companies, and schools. These computers look and behave like personal computers even when they are linked to large computers or networks, but they are seldom programmed by their operators and seldom transported.

The importance and impact of the personal computer at the beginning of the twenty-first century rest partly on the development of the computer itself and partly on the creation of a new system of communications, the Internet, which depends on personal computers and could not have become so widespread without them. Together, computers and the Internet, with its attendant World Wide Web and e-mail, have made a huge impact on society, changing every day the way educated people all over the world communicate, shop, do business, and play.

The Internet, the World Wide Web, and e-mail are three distinct but allied and interdependent entities. The Internet is a worldwide network of computers consisting of phone lines, servers, and clients. It began during the Cold War as a communications network linking researchers at the United States Department of Defense (DOD) and military contractors; when it was built in 1969, the ability to maintain contact in the event of a nuclear attack was a vital concern. When those tensions eased, the network continued as a convenient way to communicate with research groups and companies all over the world. It was developed at the Advanced Research Projects Agency and was initially called ARPAnet.

At first ARPAnet's primary use was for electronic mail, or e-mail; the first network e-mail was sent in 1971. It took years of refinement and improved communication capabilities, such as fiber-optic telephone lines, before users could communicate with one another despite differences in computer type, operating language, or speed.

ARPAnet continued to grow, used mostly by military contractors and the DOD. In the 1970s it was opened to non-military users, mainly universities. The first host computer was installed at UCLA and the second at the Stanford Research Institute, both in California. By 1971 software had been created to let messages be sent to and from any computer, making e-mail accessible to all, and international connections were available by 1973. In 1983 ARPAnet was split into military and civilian sections, and the civilian network came to be called the Internet, now defined as the physical structure of the network: its phone lines, servers, and clients.

The World Wide Web enhances the Internet. It is a collection of sites and the information that can be accessed through them. In 1989 Tim Berners-Lee, working at CERN in Switzerland, proposed and began writing software to enable high-energy physicists to collaborate with colleagues anywhere in the world. This was the beginning of the World Wide Web, which became a public part of the Internet in 1991. The Web has multimedia capabilities, providing pictures, sound, video, and text, and is made up of a series of electronic addresses, or web sites. The Internet and the Web became easier and more useful when Web browsers were invented to locate, retrieve, and display this information in both text and pictures.

According to some estimates, there were hundreds of millions of personal computers in use as the twenty-first century dawned, many of them connected to the Internet. No business hoping to sell products to a large audience in the new century can afford to ignore personal computers or the Internet, and any individual who wants access to a wide range of information, or to buy goods and services online, will need a personal computer connected to the Internet.

Personal computers have changed the way we do business, created new businesses, and changed others. They have shifted the focus in boardrooms from the people who procure money to those who create or make decisions about new products, and they have made millionaires and billionaires of those who entered the business early. Undoubtedly, the effects of the social, economic, and cultural revolution spawned by the development of the personal computer will continue to be felt throughout the twenty-first century.


Further Reading

Kidder, Tracy. The Soul of a New Machine. New York: Avon Books, 1981.

Oakman, Robert L. The Computer Triangle: Hardware, Software and People. New York: Wiley, 1995.

Shurkin, Joel. Engines of the Mind: Evolution of Computers from Mainframe to Microprocessor. New York: W. W. Norton, 1996.

Veit, Stan. Stan Veit's History of the Personal Computer. Asheville, NC: WorldComm Press, 1993.
