The Development of Integrated Circuits Makes Possible the Microelectronics Revolution

Overview

True revolutions in technology are relatively rare. They mark radical departures from one way of life to another. The use of tools, the invention of movable type, and the construction of the atomic bomb are examples of developments that changed society in a fundamental way. The microelectronics revolution that followed the development of the integrated circuit in 1959 has again remade the world as we know it. In our lifetimes, it has propelled us into the era we call the information age.

Background

The year 1958 marked the tenth birthday of the transistor, a tiny device that amplifies or switches electrical signals passing through it. Transistors were slow to find wide commercial application; not until 1954 were they used to produce portable radios. Compared with the bulky vacuum tubes they were designed to replace, transistors were small and cheap, and they consumed little power.

The problem with transistors was that the more complex the system, the greater the number of transistors it required. Moreover, each transistor came with two or three wire leads that had to be attached to other electrical components, such as diodes, resistors, and capacitors, and all the connections that held the circuit together had to be made by hand. The workers, often women, who assembled the tiny pieces had to pick them up with tweezers and wire them together. A single faulty joint among the hundreds soldered into a circuit could spell ruin.
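
The arithmetic behind this fragility is worth spelling out: if every joint must work for the circuit to work, overall yield falls off exponentially with the number of connections. The short Python sketch below illustrates the point; the per-joint reliability and joint counts are invented for the example, not historical figures.

    # Illustrative sketch: assumed (not historical) reliability figures.
    # If each of n soldered joints works independently with probability p,
    # the probability that the whole circuit works is p ** n.

    def circuit_yield(p_joint: float, n_joints: int) -> float:
        """Probability that every one of n_joints independent joints is good."""
        return p_joint ** n_joints

    for n in (100, 500, 1000):
        print(f"{n:4d} joints at 99.9% each -> "
              f"{circuit_yield(0.999, n):.1%} of circuits work")

Even at 99.9 percent reliability per joint, a 1,000-joint circuit works only about a third of the time, which is why hand assembly could not scale.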

These drawbacks became increasingly apparent toward the late 1950s. The Cold War was at its height, and because the military had a particular interest in making things smaller, lighter, and more reliable, it led the drive toward miniaturization. One proposal the Air Force found attractive was to do away with individual components and wire leads altogether. That approach would entail making everything from a single crystal, that is, a single block of a substance called a semiconductor, whose electrical properties lie between those of a metal and an insulator such as glass. This concept of making a complete circuit from a single material was dubbed the "monolithic integrated circuit," from the Greek for "single stone."

The idea of a solid block with no connecting wires was first proposed in 1952 by Geoffrey A. Dummer of Britain's Royal Radar Establishment (RRE). Although the RRE awarded a contract to a company several years later to build such a device, the project never progressed very far. Dummer later blamed war-weariness for the failure of the United Kingdom and Europe to exploit such electronic applications. Only in 1960 did the RRE form a team dedicated to studying the idea. U.S. companies such as RCA and Westinghouse also attempted similar projects, usually with support from the military. But there, too, researchers worried that the individual elements of a monolithic integrated circuit would be inferior to components made separately.

In 1958, working alone in the laboratory at Texas Instruments, an electrical engineer named Jack St. Clair Kilby (1923- ) wrote in his notebook that he thought resistors, capacitors, transistors, and diodes could all be assembled into a circuit on a single silicon wafer. As Kilby saw it, he could make all the components on one side of a piece of silicon, using a technique known as batch processing that included the interconnections as part of the manufacturing process. Kilby ran the idea by his boss, who told him to test the principle by first making the circuit in the ordinary way, using separate components, but making them all out of silicon. Because suitable silicon was not available, Kilby used germanium, a greyish-white element. The result was crude, but it worked: Kilby was able to show that integrated circuits could be constructed from a single piece of semiconductor material.

Kilby next turned to improving and refining the techniques he would need to make his circuits. Then in January 1959, a rumor that RCA was planning to patent an integrated circuit of its own spurred Texas Instruments to submit an application in Kilby's name, titled "Miniaturized Electronic Circuits." One month later, the company unveiled its so-called solid circuit at a press conference at the annual Institute of Radio Engineers show. The circuit was no bigger than a pencil point and performed as well as circuits many times larger.

In the meantime, at rival company Fairchild Semiconductor in northern California, physicist Robert N. Noyce (1927-1990), a co-founder of the company, was also developing a scheme for making multiple devices on a single piece of silicon, with an eye to reducing size, weight, and cost. On July 30, 1959, Fairchild filed an application with the Patent Office in Noyce's name titled "Semiconductor Device-and-Lead Structure."

Nearby, William Shockley (1910-1989), head of Shockley Transistor Corporation, was working on his own integrated circuit. Shockley had shared the 1956 Nobel Prize in physics with Walter H. Brattain (1902-1987) and John Bardeen (1908-1991) for the invention of the transistor. He called his circuit the Shockley diode (a valve for electrical current), and he hoped AT&T would purchase it in bulk for use in electronic switching systems for its telephone network. But the components were quirky and switched from "off" to "on" unpredictably. Unlike the devices made by Noyce and Kilby, Shockley diodes were fabricated on both sides of the silicon slice, and constructing them was a painstaking process. For this reason, Noyce and others had rejected Shockley's method earlier, and events proved them right to abandon it. But Shockley persisted for another year, rejecting his staff's advice to try producing a simpler device.

Integrated circuits are made by a combination of processes that use chemicals, gases, and light to create transistors, building up thin layers of semiconductor on a silicon wafer and then etching away or adding material according to a pattern worked out in advance. In a complex chip, this cycle of steps may be carried out 20 times or more to form the three-dimensional elements of the chip's circuitry.
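
As a way of visualizing that cycle, the short Python sketch below models the repeated deposit-pattern-etch sequence as a loop that stacks named layers. It is purely illustrative: the layer and mask names are invented for the example, and it tracks only the bookkeeping of the process, not the physics.

    # Illustrative sketch: a toy model of the repeated deposit/pattern/etch
    # cycle described above. Layer and mask names are hypothetical.
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class Wafer:
        layers: List[str] = field(default_factory=list)

        def deposit(self, material: str) -> None:
            # Build up a thin film on top of the existing stack.
            self.layers.append(f"{material}, unpatterned")

        def pattern_and_etch(self, mask: str) -> None:
            # Light shone through a mask defines where material is etched
            # away; the rest of the newest layer remains.
            self.layers[-1] = self.layers[-1].replace(
                "unpatterned", f"patterned by {mask}")

    wafer = Wafer()
    for step in range(20):  # "20 times or more" for a complex chip
        wafer.deposit(f"film-{step}")
        wafer.pattern_and_etch(f"mask-{step}")
    print(f"{len(wafer.layers)} patterned layers, e.g.:", wafer.layers[0])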

Bell Laboratories, where the transistor had been invented, also turned to producing integrated circuits. Its approach was to eliminate as many components and interconnections as possible. That approach turned out not to work, but in the attempt a scientist named M. M. "John" Atalla achieved a breakthrough that had been sought since the early days of the transistor: the field-effect transistor. In 1960, taking the discovery one step further, Atalla and a colleague, Dawon Kahng, created the first metal-oxide-semiconductor, or "MOS," transistor, on which most integrated circuits and microchips are now based.

Impact

The first patent awarded for an integrated circuit went to Noyce on April 25, 1961, and Fairchild was the first company to bring the new circuits to market. But by October of that year, Texas Instruments, which had been turning out individual circuits by hand, had produced an array of silicon circuits the size of a grain of rice that contained two dozen transistors along with the other necessary components. By the year 2000, a fingernail-sized sliver of silicon could contain millions of transistors, and hundreds of chips could be made on a single wafer.

President John F. Kennedy's (1917-1963) call to put a man on the Moon by the end of the decade created a market for the integrated circuit overnight—nowhere would the advantages of miniaturization be more welcome than aboard spacecraft. Electronics companies such as Motorola and Westinghouse rushed to catch up with pioneers Fairchild and Texas Instruments. Business Week magazine announced an "impending revolution in the electronics industry."

That revolution has indeed occurred. Integrated circuits have enhanced our lives in countless ways. The microelectronics industry, to which integrated circuits gave birth, has created millions of jobs. Computers that once would have occupied a space the size of a house have become small, available, and cheap enough for almost anyone to own. Machines run more cleanly and efficiently, medical technology saves lives, and banks the world over exchange money through electronic networks, all thanks to integrated circuits. In poorer countries, technologies built on integrated circuits have decreased the cost of capital investment required for industrialization and development, allowing those countries to compete in the global marketplace.

We have adjusted very quickly to the microelectronics revolution. Washing machines, digital clocks and watches, the scoreboard in a ballpark, the bar code on your groceries, and the collar that lets only your cat go in and out of its cat flap are just a few of the mundane applications of integrated circuits that we take for granted every day.

Microelectronics is about information—ever-increasing amounts of information. And the ability of integrated circuits to store and process this information has redefined the meaning of power. Who controls information, who has access to it, how much it costs, and the uses to which people put it are questions that bear more and more on the conduct of our private and public lives.

Like the transistor from which it developed, the integrated circuit has had repercussions that were wholly unpredictable. It is argued that there are limits beyond which miniaturization cannot go, but for now, at least, devices have not stopped shrinking. One instance is microelectromechanical systems, hybrids of machines and electronics the size of a speck of dust that combine the ability of computers to think with the ability of machines to do things. Still largely at the stage of laboratory prototypes, their potential has barely been tapped. But microelectronic devices are not only growing smaller. So-called power electronics uses ever larger transistors, about the size of a postage stamp, to handle greater amounts of electrical power, for example, to control electric motors.

GISELLE WEISS

Further Reading

Riordan, Michael, and Lillian Hoddeson. Crystal Fire: The Birth of the Information Age. New York: W. W. Norton, 1997.

Ryder, John D., and Donald G. Fink. Engineers and Electrons. New York: IEEE Press, 1984.

"The Solid-State Century: The Past, Present, and Future of the Transistor." Scientific American (special issue, 1997).
