The computers of today are smaller, faster, and more powerful than their predecessors from the 1940s. The underlying philosophy of those early machines and their modern descendants, however, is exactly the same. The task remains identical: to manipulate and interpret information expressed as either a 0 or a 1. This unit of information is referred to as the binary bit. Binary-bit computers operate according to the laws of classical physics. A quantum computer instead uses the concepts of quantum physics, and so operates differently from the computers now available. The idea of quantum computing arose and was explored in the 1970s and 1980s. American physicist Richard Feynman (1918–1988) observed that the simulation of quantum systems would best be performed by quantum computers. As computer chips became smaller, with more circuitry packed onto each chip, it became apparent to some physicists and computer scientists that this trend of decreasing size would ultimately approach atomic dimensions. At such small scales, the laws of classical physics no longer apply, and a computer based on classical physics could not function.
Quantum computing refers to the still largely theoretical use of quantum physics in the processing and memory functions of computing. Certain properties of atoms or nuclei could allow processing and memory to work together: these quantum bits, or qubits, would serve as both the computer's processor and its memory. Qubits could operate at speeds far beyond what current technologies permit. Quantum computing is well suited to tasks such as cryptography, the modeling of data, and the indexing of very large databases. It is, however, not suited to everyday tasks such as word processing and electronic mail (e-mail).
Qubits operate differently from bits in the current binary system of computing. In a binary computer, each bit stores a single value, 0 or 1, at any one time; a 0 has only one value and must be read before the next piece of information. In contrast, quantum computers encode information in quantum mechanical states, such as the spin of an electron or the polarization of a photon. Rather than holding one discrete value, a qubit can exist as 0, as 1, as both at the same time, or as a weighted combination of the two. Instead of carrying a single point of information, a qubit can therefore carry many pieces of information at once. This phenomenon is referred to as superposition. A binary computer is not capable of operating in a superpositional manner.
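The idea of superposition can be sketched with a toy state-vector model. This is a classical simulation for illustration only, not how quantum hardware actually works, and the function names below are our own:

```python
import math

# Toy model: a single qubit is a pair of complex amplitudes (a, b)
# for the basis states |0> and |1>, with |a|^2 + |b|^2 = 1.

def hadamard(state):
    """Apply a Hadamard gate, which turns a definite 0 or 1
    into an equal superposition of both."""
    a, b = state
    s = 1 / math.sqrt(2)
    return (s * (a + b), s * (a - b))

def probabilities(state):
    """Probability of reading out 0 versus 1 on measurement."""
    a, b = state
    return (abs(a) ** 2, abs(b) ** 2)

qubit = (1.0, 0.0)            # definitely 0, like a classical bit
qubit = hadamard(qubit)       # now a superposition of 0 AND 1
p0, p1 = probabilities(qubit)
print(p0, p1)                 # each outcome is equally likely
```

A classical bit would have to be in one row of this table or the other; the superposed qubit holds weight on both at once until it is measured.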
Put another way, a quantum computer would be capable of carrying out a computation on many different numbers at once, then combining those results into a single answer. This property makes a quantum computer potentially much faster and more powerful than a classical computer of equivalent size. For example, in a code-breaking application of cryptography, factoring a number of 400 digits, which could be necessary to break a security code, would take a fast modern-day supercomputer millions of years; a quantum computer could complete the process in about one year. Another advantage of a quantum computer has to do with the space required to house the machine. While today's supercomputers occupy large, specially cooled and isolated rooms, scientists have calculated that a quantum computer of the same or greater computational power could theoretically be no larger than, and might actually resemble, an average coffee cup.
The orientation of the photons in a qubit may also serve another function. Scientists, including German-American physicist Albert Einstein (1879–1955), noticed that if the light-emission state of one photon is measured, the state of a second, paired photon behaves correspondingly, no matter how far away the second photon is from the first. This phenomenon is called entanglement. Entanglement effectively wires qubits together, even though no wires are physically present, and makes the transfer of information from one qubit to another conceivable. Entanglement is not yet practically usable; however, such information transfer has been demonstrated in the laboratory.
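The perfect correlation of entangled measurements can be sketched in the same toy state-vector style as before. Again this is a classical simulation for illustration, assuming a standard entangled ("Bell") state rather than anything described in the text:

```python
import random

# Two qubits together have four amplitudes, one for each joint
# outcome |00>, |01>, |10>, |11>.  The entangled Bell state
# (|00> + |11>)/sqrt(2) puts weight ONLY on the agreeing outcomes.
s = 2 ** -0.5
bell = [s, 0.0, 0.0, s]          # amplitudes for 00, 01, 10, 11

def measure(state):
    """Sample one joint readout with probability |amplitude|^2."""
    outcomes = ["00", "01", "10", "11"]
    probs = [abs(a) ** 2 for a in state]
    return random.choices(outcomes, weights=probs)[0]

samples = [measure(bell) for _ in range(1000)]
print(set(samples))   # the two readouts always agree: only 00 and 11
```

Each individual readout is random, yet the two qubits never disagree, which is the correlation-at-a-distance the text describes.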
Entanglement also imposes a great limitation on quantum computing: it is not yet known how qubits can be isolated so that they are not affected by stray external atoms. The inner workings of a quantum computer must somehow be separated from their surroundings, while at the same time remaining accessible for operations such as loading information, executing computations, and reading out results. Currently, the best approach involves exposing liquids to magnetic fields, much like the technique of nuclear magnetic resonance; atoms in the liquid orient themselves in the field, producing the entanglement behavior. Between 2005 and 2006, University of Michigan researchers built semiconductor chips that hold ions in a vacuum with an electromagnetic field, a device called an ion trap. Such devices may eventually lead scientists to working quantum computers. Research is under way in several countries, including the United States, at both government and military levels.
"Quantum Computing." The Gale Encyclopedia of Science. . Encyclopedia.com. (October 18, 2018). http://www.encyclopedia.com/science/encyclopedias-almanacs-transcripts-and-maps/quantum-computing-0
"Quantum Computing." The Gale Encyclopedia of Science. . Retrieved October 18, 2018 from Encyclopedia.com: http://www.encyclopedia.com/science/encyclopedias-almanacs-transcripts-and-maps/quantum-computing-0