Microcomputers

Before the introduction of the personal computer, or microcomputer, to the general market in the 1970s, computers were physically large, complex, often unreliable, and expensive pieces of machinery. They nonetheless operated in much the same way as they do now: loading programs from secondary storage into memory, accepting input, executing instructions, and generating output. The similarities between those early machines and the types that were to follow ended there, however. The early machines were manufactured and sold into a small market, as computers were not yet consumer items. The hardware was expensive because semiconductor devices were only just becoming mainstream, and computers of the time were often housed in large purpose-built installations that catered to their temperamental operating requirements. The machinery needed air-conditioned rooms and was maintained by squads of specialized professionals behind closed doors. As such, computers were outside the realm of the average person's experience.

The types of programs these machines typically ran contributed to their esoteric status. Programs that managed financial transactions in large batches, often taking many hours to run, held little interest for most people. As a result, computing machinery remained a somewhat mysterious phenomenon, and its role was restricted to operations principally involving electronic data processing to support financial management, administrative control in large organizations, and specialized scientific research. However, all this was soon to change.

In 1971 the Intel Corporation of Santa Clara, California, an established manufacturer of semiconductor devices, responded to a particular design problem in a spectacularly successful way. Another company, the Japanese calculator maker Busicom, was planning a new line of calculators. Calculators at the time were still somewhat primitive, incapable of anything beyond simple arithmetic operations and not programmable by the user; nonetheless, they were becoming a fashion accessory. The standard approach to building a calculator was to implement the design across several separate semiconductor devices. Although this approach was workable, it was inflexible: if the specification of the calculator was modified, a significant amount of adjustment to the hardware design was required. This generally meant costly reconstruction of prototype designs from scratch.

In response to the problem, the engineers at Intel proposed an approach in which most of the complexity of the design was shifted into a single device: a 4-bit microprocessor, designated the 4004. Developing the microprocessor meant a great deal of extra work up front, but once it was complete, its function could be modified simply by supplying it with a different program to run. Essentially, the calculator had become a restricted type of computer. The microprocessor could accommodate changes in requirements by varying the controlling program it contained, thereby changing its functionality. This avoided the delays and reworking that would otherwise have been needed had the calculator been developed using the conventional design approach of fixed-function devices. In addition, the programmability of the microprocessor meant that its use could extend well beyond the limits of a hand-held calculator.
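
The principle can be shown in miniature. The sketch below, written in Python, is purely illustrative: its three-instruction set is invented for this example and is far simpler than the 4004's actual instruction set. It models a fixed piece of "hardware" whose behavior changes only because a different program is loaded into it.

    # A toy programmable processor: the "hardware" (this interpreter)
    # never changes; only the program loaded into it does. The
    # instruction set here is invented purely for illustration.
    def run(program, value):
        """Execute a list of (opcode, operand) pairs against an accumulator."""
        acc = value
        for opcode, operand in program:
            if opcode == "ADD":
                acc += operand
            elif opcode == "SUB":
                acc -= operand
            elif opcode == "MUL":
                acc *= operand
        return acc

    # The same "chip" acts as two different calculators, depending
    # solely on the program supplied to it; no hardware redesign needed.
    program_a = [("MUL", 2), ("ADD", 3)]   # computes 2x + 3
    program_b = [("SUB", 1), ("MUL", 4)]   # computes 4(x - 1)

    print(run(program_a, 5))   # 13
    print(run(program_b, 5))   # 16

Changing the calculator's specification now means editing a program rather than redesigning circuitry, which is precisely the flexibility the fixed-function approach lacked.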

Within a few years, Intel and other manufacturers had built more powerful microprocessors using 8-bit and then 16-bit architectures. This opened the way for small-scale computers designed around mass-produced microprocessors. Within five years there was an array of different small computers available for purchase, finally putting computing machinery within consumers' reach. These computers were termed microcomputers to distinguish them from the commercially available, business-oriented minicomputers and mainframe computers.

Nearly all of these microcomputers were quite primitive, though: most had very small memory capacities, limited input and output device support, and could be programmed only in low-level languages. As such, they were mainly curiosities, targeted at technically minded hobbyists and enthusiasts: people who understood computing technology but could not afford any of the commercially available minicomputers. Steadily, as interest grew, more user-friendly machines were produced, and interpreted programming languages became available, enabling people to develop and distribute programs more easily. In the early 1980s, these small computers became useful in a practical sense, following Dan Bricklin's development of VisiCalc, the first spreadsheet program. No longer would microcomputers remain solely within the realm of the technically inclined; for the first time they offered general users a means of automating very labor-intensive tasks, and they were truly becoming a business tool.
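
What made the spreadsheet so compelling was automatic recalculation: change one input figure and every dependent figure updates itself. The following minimal sketch conveys that idea; it is hypothetical Python, does not reflect VisiCalc's actual design, and uses invented cell names and figures.

    # A minimal spreadsheet-style recalculation model. Each cell holds
    # either a constant or a formula over other cells; reading a cell
    # recomputes it from the current inputs.
    cells = {
        "A1": 120.0,                              # unit price
        "A2": 35,                                 # units sold
        "B1": lambda get: get("A1") * get("A2"),  # revenue = price * units
        "B2": lambda get: get("B1") * 0.10,       # 10 percent commission
    }

    def get(name):
        value = cells[name]
        return value(get) if callable(value) else value

    print(get("B2"))   # 420.0
    cells["A2"] = 40   # change a single input figure...
    print(get("B2"))   # ...and every dependent cell follows: 480.0

Before the spreadsheet, every such change meant reworking whole columns of figures by hand; this recalculation is the labor-intensive task the program automated.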

The IBM Corporation became interested in acquiring a portion of the microcomputer market in the early 1980s. This one step did more to establish the credibility of personal computing and microcomputer technology than all previous events combined. IBM was the epitome of corporate computing, a manufacturer of large computing systems for industrial and commercial environments. For such a company to become involved in the world of microcomputers legitimized the whole concept. From that moment onward, microcomputers were part of the mainstream.

During the first decade of microcomputer history, the hardware was hampered by operating systems that did not support multitasking (multiprogramming). This was recognized as a drawback, but it proved more difficult to remedy than first imagined. Multitasking operating systems were steadily developed for microcomputers, but they faced several difficulties, including a lack of interoperability, a requirement for significant amounts of expensive random access memory (RAM), and the need to maintain support for older legacy applications. Eventually, many of these problems were overcome as microcomputer hardware capabilities were enhanced and standards for networking and communication were adopted.
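
In essence, a multitasking system interleaves several programs on one processor by repeatedly giving each a short turn. The sketch below, which is illustrative Python modeling no particular historical system, shows the idea in its simplest cooperative form, where each task must voluntarily yield control; full multitasking operating systems instead preempt tasks on a timer, which is part of what made them harder to build.

    # A minimal cooperative round-robin scheduler (illustrative only).
    # Each task is a generator that does a little work and then yields
    # control back. A task that never yielded would freeze all the
    # others, which is why preemptive multitasking was worth the effort.
    from collections import deque

    def task(name, steps):
        for i in range(steps):
            print(f"{name}: step {i}")
            yield  # voluntarily hand the processor back

    def scheduler(tasks):
        ready = deque(tasks)
        while ready:
            current = ready.popleft()
            try:
                next(current)          # run the task until its next yield
                ready.append(current)  # give it another turn later
            except StopIteration:
                pass                   # task finished; drop it

    scheduler([task("editor", 2), task("print-spooler", 3)])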

The sheer size of the microcomputer market has meant that more research and development spending has been directed at perfecting the technology, and modern microcomputers now offer performance levels that were only dreamed of in earlier years. Features such as voice recognition and integration with Internet and television technologies promise still more for the future. Microcomputers have become indispensable parts of client/server environments in almost all areas of commercial, industrial, educational, and domestic activity.

see also Mainframes; Minicomputers; Supercomputers.

Stephen Murray
