
Quantum Computing


The computers of today are smaller, faster, and more powerful than their predecessors from the 1940s. The underlying philosophy of those early computers and their modern cousins, however, is exactly the same. The task remains identical: to manipulate and interpret information that is expressed as either a number 0 or a number 1. This packaging of information is referred to as the binary bit. Binary bit computers operate according to the laws of classical physics. The quantum computer uses the concepts of quantum physics to produce a machine that operates differently from the computers now available.

The concept of quantum computers arose and was explored in the 1970s and 1980s. American physicist Richard Feynman (1918-1988) observed that the simulation of quantum systems should be performed with quantum computers. As computer chips became smaller, with more circuitry packed onto a chip, it became apparent to some physicists and computer scientists that this trend of decreasing size would ultimately approach atomic dimensions. At such small sizes, the laws of classical physics no longer apply, so a computer based on classical physics could not function.

Quantum computing refers to the current theoretical use of quantum physics in the processing and memory functions of computing. Certain properties of atoms or nuclei could allow the processing and memory functions to work cooperatively. These quantum bits, or qubits, would serve as the computer's processor and memory. Qubits could, in principle, operate at speeds far beyond what current technologies permit. Quantum computing is well suited for tasks like cryptography, modeling of data, and the indexing of very large databases. It is, however, not suitable for tasks like word processing and electronic mail (e-mail).

Qubits operate differently from the current binary system of computing. In that system, the binary bit method of information storage assigns only one value, 0 or 1, at a time. For example, a 0 has only one value and must be read before the next piece of information. In contrast, quantum computers encode information according to quantum mechanical states. These states concern properties such as the spin of electrons and the position in space of photons. Rather than having a discrete value, a point of information in the quantum computer could exist as 0 or 1, as both at the same time, or as something in between 0 and 1. Thus, instead of holding one information point, the qubit can contain many pieces of information at the same time. This phenomenon is referred to as superposition. A binary computer is not capable of operating in a superpositional manner.
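The idea of superposition can be illustrated numerically. In the standard mathematical description (not specific to this article), a qubit's state is a pair of complex amplitudes over the basis values 0 and 1, and measurement probabilities are the squared magnitudes of those amplitudes. A minimal sketch in Python with NumPy:

```python
import numpy as np

# A qubit's state is a pair of complex amplitudes (a, b) over the
# basis states |0> and |1>, with |a|^2 + |b|^2 = 1.
zero = np.array([1, 0], dtype=complex)   # definitely 0
one = np.array([0, 1], dtype=complex)    # definitely 1

# An equal superposition: the qubit is, in a sense, "both 0 and 1 at once".
plus = (zero + one) / np.sqrt(2)

# Measuring collapses the state; the probability of each outcome is the
# squared magnitude of the corresponding amplitude.
probs = np.abs(plus) ** 2
print(probs)  # [0.5 0.5] -- equal chance of reading out 0 or 1
```

By varying the two amplitudes, the state can be anywhere "between" 0 and 1, which is what distinguishes a qubit from a classical bit that must be exactly one or the other.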

Put another way, a quantum computer would be capable of performing a computation on many different numbers at once, then using these results to arrive at a single answer. This property makes a quantum computer potentially much faster and more powerful than a classical computer of equivalent size. For example, in a code-breaking task like cryptography, factoring a number having 400 digits (which could be necessary to break a security code) would take a fast modern-day supercomputer millions of years. A quantum computer, however, could complete the process in about one year. Another advantage of a quantum computer has to do with the space required to house the machine. While today's supercomputers occupy specially cooled and isolated rooms, scientists have calculated that a quantum computer of the same or greater computational power would theoretically be no larger than an average coffee cup, and might actually resemble one.
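The "many numbers at once" claim rests on how quickly the quantum state space grows: a register of n qubits is described by 2^n amplitudes, all of which a classical simulation must store and update explicitly. A short illustrative calculation (my own example, not from the original article):

```python
def state_vector_size(n_qubits: int) -> int:
    """Number of complex amplitudes needed to describe n qubits."""
    return 2 ** n_qubits

# The description of a quantum register doubles with every added qubit,
# so classical simulation quickly becomes infeasible.
for n in (1, 10, 50):
    print(f"{n:2d} qubits -> {state_vector_size(n):,} amplitudes")

# 50 qubits already require 2**50 (about 10**15) amplitudes -- far more
# than an ordinary computer can hold in memory, while the quantum
# hardware itself needs only the 50 physical qubits.
```

This exponential gap between the quantum system and its classical description is the intuition behind Feynman's observation, mentioned above, that quantum systems are best simulated by quantum computers.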

The orientation of the photons in a qubit also may serve another function. Scientists, including German-American physicist Albert Einstein (1879-1955), noticed that if the pattern of light emission of one photon is measured, the light emission state of another photon behaves similarly, no matter how far away the second photon is from the first. The phenomenon is called entanglement. Entanglement effectively wires qubits together, even though no wires are physically present, and makes the transfer of information from one qubit to another conceivable. Entanglement is not yet practically usable, although such information transfer has been demonstrated in the laboratory.
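The correlation described above can be sketched numerically with a two-qubit "Bell state", the standard textbook example of entanglement (an illustration of the concept, not a description of any particular experiment). In this state the two qubits always measure the same, even though neither outcome is fixed in advance:

```python
import numpy as np

rng = np.random.default_rng(0)

# Build the Bell state (|00> + |11>)/sqrt(2): four amplitudes indexed by
# the two-bit outcomes 00, 01, 10, 11.
bell = np.zeros(4, dtype=complex)
bell[0b00] = 1 / np.sqrt(2)
bell[0b11] = 1 / np.sqrt(2)

# Sample joint measurements. Only 00 and 11 have nonzero probability,
# so reading one qubit fixes what the other qubit will read.
probs = np.abs(bell) ** 2
outcomes = rng.choice(4, size=1000, p=probs)
assert set(outcomes) <= {0b00, 0b11}   # the two qubits always agree
```

Each individual qubit looks random on its own (0 or 1 with equal probability), yet the pair is perfectly correlated, which is the sense in which entanglement "wires" qubits together without physical wires.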

The same sensitivity that gives entanglement its potential also imposes a great limitation on quantum computing. How qubits can be isolated so as not to be affected by stray external atoms is not yet known. The inner workings of a quantum computer must somehow be separated from its surroundings, while at the same time remaining accessible for operations like loading information, executing computations, and reading out results. Currently, the best approach involves the exposure of liquids to magnetic fields, much like the technique of nuclear magnetic resonance. Atoms in the liquid can orient themselves in the field, producing the entanglement behavior. Between 2005 and 2006, University of Michigan researchers built semiconductor chips that hold ions within a vacuum with the use of an electromagnetic field, a device called an ion trap. Such devices may eventually lead to practical quantum computers. Research is being performed in several countries, including the United States, at both the government and military levels.

See also Abacus; Nanotechnology.


Brian Hoyle

"Quantum Computing." The Gale Encyclopedia of Science. . Encyclopedia.com. 18 Oct. 2018 <http://www.encyclopedia.com>.
