
Physics: The Quantum Hypothesis



The quantum hypothesis, first suggested by Max Planck (1858–1947) in 1900, postulates that light energy can only be emitted and absorbed in discrete bundles called quanta. Planck came up with the idea when attempting to explain blackbody radiation, work that provided the foundation for his quantum theory.

Planck found that the vibrational energy of atoms in a solid is not continuous but takes only discrete (distinct) values. The energy of light is determined by its frequency of vibration, f. Energy, E, is described by the equation E = nhf, where n is an integer and h is Planck's constant, equal to 6.626068 × 10−34 joule-seconds (J-s).
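The relation E = nhf can be sketched in a few lines of Python. This is an illustrative calculation, not from the source; the green-light frequency (about 5.6 × 10^14 Hz) is an assumed example value.

```python
# Planck's quantum hypothesis: energy comes only in whole-number multiples of h*f.
H = 6.62607015e-34  # Planck's constant in joule-seconds (exact 2019 SI value)

def quantum_energy(frequency_hz: float, n: int = 1) -> float:
    """Energy E = n*h*f of n quanta of light at the given frequency."""
    if n < 1:
        raise ValueError("n must be a positive integer")
    return n * H * frequency_hz

# One quantum of green light (~5.6e14 Hz) carries roughly 3.7e-19 joules;
# three quanta carry exactly three times that, never anything in between.
e1 = quantum_energy(5.6e14)
e3 = quantum_energy(5.6e14, n=3)
```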

Planck published his theory, “On the Law of Distribution of Energy in the Normal Spectrum,” in the German journal Annalen der Physik, stating: “Moreover, it is necessary to interpret UN [the total energy of a blackbody radiator] not as a continuous, infinitely divisible quantity, but as a discrete quantity composed of an integral number of finite equal parts.”

Historical Background and Scientific Foundations

Early in his life as a physics and mathematics student and, later, as a theoretical physicist, Max Karl Ernst Ludwig Planck was especially interested in the law of the conservation of energy, also called the first law of thermodynamics. It states that the increase in the internal energy of a thermodynamic system is equal to the amount of heat energy input into the system, minus work done by the system on its exterior surroundings.

Planck was also interested in the entropy law, the second law of thermodynamics, which states that the entropy (disorder) of an isolated system not in equilibrium tends to increase over time, eventually approaching a maximum value at equilibrium. Entropy became the topic of his doctoral dissertation at the University of Munich and eventually led to his quantum hypothesis.

In the late 1800s, physicists were having trouble interpreting several laws of classical physics, particularly the second law of thermodynamics. Their primary question was whether entropy results from the motions of a collection of molecules, as held by Austrian physicist Ludwig Boltzmann's (1844–1906) statistical (probabilistic) interpretation, or whether it is controlled by energy and related physical quantities, as declared by German chemist Wilhelm Ostwald's (1853–1932) absolute energy interpretation. The debate led Planck to develop quantum theory, for which he won the Nobel Prize in 1918.

The Science

Planck explained entropy using Scottish theoretical physicist James Clerk Maxwell's (1831–1879) work on electrodynamics, which describes the microscopic oscillators (radiating atoms) that produce the heat radiation emitted by blackbodies. (A blackbody is defined theoretically as any object that absorbs all light falling upon it, reflects none of it, and thus appears black. When a blackbody is heated, however, it emits radiation, that is, light.) Blackbody radiation is the amount of radiant (heat) energy emitted at various frequencies for specific temperatures of a blackbody.

Many physicists had tried to explain blackbody radiation. They all failed, however, until Planck published his historic theory in 1900. Using the Boltzmann equation, he suggested that the total energy from blackbody oscillators could be divided into finite parts through a process called quantization. Planck described energy as being made of a finite number of equal parts. He included the constant h = 6.55 × 10−27 erg-second (1 erg = 10−7 joule), which he called the quantum of action, known today as Planck's constant.

In 1900 Planck proposed that heat energy E is emitted only in definite amounts called quanta. Thus, his equation became E = hf, where h = 6.626 × 10−34 J-s and f = frequency. Planck maintained that only certain specific energies could appear, and they were limited to n whole-number multiples of hf. Thus, E = nhf.

With this equation, Planck was able to explain blackbody radiation, showing that the hotter an object gets, the more radiation it produces. Since a blackbody absorbs all radiation frequencies, he believed, under the physical principles held at that time, that it should radiate equally at all frequencies. Planck found instead that blackbodies emit more energy at some frequencies and less at others. He did not understand what this meant: the radiating atom appeared to contain only discrete quanta of energy, not the continuous energy he expected. This went against classical physics.
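Planck's radiation law makes this concrete. The sketch below (an illustration, not Planck's original derivation) evaluates the spectral radiance of a blackbody at one visible-light frequency for two temperatures, showing that the hotter body radiates more:

```python
import math

H = 6.62607015e-34   # Planck's constant, J*s
C = 2.99792458e8     # speed of light, m/s
K = 1.380649e-23     # Boltzmann's constant, J/K

def spectral_radiance(f: float, t: float) -> float:
    """Planck's law: B(f, T) = (2*h*f^3 / c^2) / (exp(h*f / (k*T)) - 1)."""
    # math.expm1(x) computes exp(x) - 1 accurately for small x
    return (2 * H * f**3 / C**2) / math.expm1(H * f / (K * t))

# Radiance at 5e14 Hz (visible light) for a 4000 K and a 6000 K blackbody:
# the hotter body is brighter at this (and every) frequency.
cool = spectral_radiance(5.0e14, 4000.0)
hot = spectral_radiance(5.0e14, 6000.0)
```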

Near the end of 1900 Planck convinced himself that the second law of thermodynamics was not an absolute law. In addition, he assumed that his formulas correctly showed that blackbody oscillators radiate in only discrete amounts of energy, or quanta, not continuously. He published his quantum hypothesis in December 1900.

When Planck introduced the idea of energy quanta to the scientific community, it is unclear whether he really understood the relevance of quantum discontinuity: He was mostly interested in the accuracy displayed by his new law and its constant. Later, he called his equation “a fortuitous guess.”

Influences on Science and Society

Quantum theory, the first evidence that the tiny world of atoms could not be accurately described with classical physics, became the basis of quantum mechanics—the branch of physics that studies the emission and absorption of energy and the motion of particles at the atomic level. Quantum theory revolutionized scientific thought with respect to atomic and subatomic processes. It is held in the same regard as Albert Einstein's (1879–1955) theories of relativity, which revolutionized scientific thought with respect to space and time.

Einstein used Planck's idea of light quanta in 1905 to explain photons and the photoelectric effect mathematically, the first scientific work utilizing quantum mechanics. In 1907 Einstein showed the quantum hypothesis's wide application by using it to interpret the temperature dependence of the specific heats of solids. Two years later, in 1909, he wrote on the quantization of light and wave fluctuations, describing wave-particle duality—the theory that objects exhibit properties both of waves and particles.

In 1913 Danish physicist Niels Bohr (1885–1962) was the first to use the quantum hypothesis to explain atomic structure and spectra. He showed the association between electrons' atomic energy levels and light frequencies emitted and absorbed by atoms. In addition, Bohr postulated that an atom would not emit radiation while it was in one of its stable states, but only when it traveled between them. The frequency of this radiation would equal the difference in energy between those stable states, divided by Planck's constant. This showed that atoms could not absorb or emit radiation continuously, but only in finite steps called quantum jumps.
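Bohr's rule, f = (E2 − E1)/h, can be sketched as follows; the hydrogen energy levels used (−1.51 eV for n = 3 and −3.40 eV for n = 2) are standard textbook values chosen here for illustration:

```python
H = 6.62607015e-34    # Planck's constant, J*s
EV = 1.602176634e-19  # joules per electron-volt

def transition_frequency(e_upper_ev: float, e_lower_ev: float) -> float:
    """Frequency of the light emitted in a quantum jump: f = (E2 - E1) / h."""
    return (e_upper_ev - e_lower_ev) * EV / H

# Hydrogen n=3 -> n=2 jump (the red Balmer-alpha line), roughly 4.6e14 Hz.
f_emitted = transition_frequency(-1.51, -3.40)
```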

For about twenty years physicists worked on the mathematics of quantum theory—finally expressing it mathematically in the 1920s. In 1924, French physicist Louis de Broglie (1892–1987) proposed that all forms of radiation, not just light, exhibited wave-particle duality. He suggested that particles, such as electrons, exhibit wavelike properties in certain circumstances. The de Broglie wavelength is equal to Planck's constant divided by momentum (mass times velocity).
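The de Broglie relation, wavelength = h/(mv), can be sketched in a short example; the electron speed below is an assumed illustrative value, not from the source:

```python
H = 6.62607015e-34      # Planck's constant, J*s
M_E = 9.1093837015e-31  # electron rest mass, kg

def de_broglie_wavelength(mass_kg: float, velocity_ms: float) -> float:
    """Wavelength lambda = h / (m * v); valid for nonrelativistic speeds."""
    return H / (mass_kg * velocity_ms)

# An electron moving at 2e6 m/s has a wavelength of about 3.6e-10 m,
# comparable to atomic spacing -- which is why electrons diffract in crystals.
lam = de_broglie_wavelength(M_E, 2.0e6)
```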

Following de Broglie's work, German physicist Werner Heisenberg (1901–1976), using matrix mechanics, and Austrian physicist Erwin Schrödinger (1887–1961), using wave mechanics, used the wave function to relate the probability of finding a particle at a given point in space and time. In the late 1920s both de Broglie and Schrödinger introduced the concept of standing waves to explain why electrons exist only at discrete frequencies and, consequently, only in discrete energy states.

In 1927 Heisenberg announced his uncertainty principle, which places an absolute limit on the accuracy of certain measurements. It states that the act of measuring the position x of a particle disturbs the particle's momentum p, so that Δx × Δp is greater than or equal to ℏ/2, where Δx is the uncertainty of the particle's position along a spatial dimension, Δp is the uncertainty of its momentum, and ℏ is Planck's constant divided by 2π. This means that one cannot know position and momentum simultaneously at the atomic level, because a photon used to measure an electron alters the electron's position and momentum when it bounces off the electron.
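In its modern form the principle reads Δx × Δp ≥ ℏ/2, with ℏ = h/2π. The sketch below computes the minimum momentum uncertainty forced on a particle confined to an atom-sized region; the 10^−10 m confinement length is an assumed example value:

```python
import math

H = 6.62607015e-34        # Planck's constant, J*s
HBAR = H / (2 * math.pi)  # reduced Planck's constant, J*s

def min_momentum_uncertainty(dx_m: float) -> float:
    """Smallest momentum uncertainty allowed by dx * dp >= hbar / 2."""
    return HBAR / (2 * dx_m)

# Confining a particle to ~1e-10 m (roughly an atom's width) forces a
# momentum uncertainty of about 5.3e-25 kg*m/s.
dp = min_momentum_uncertainty(1.0e-10)
```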

Planck's quantum hypothesis spawned subfields of quantum mechanics, quantum chromodynamics (the study of the strong interaction), quantum electrodynamics (the study of the relativistic aspects of electromagnetism), quantum electronics (the study of quantum mechanics based on interactions of electrons and photons), quantum gravitation (the current attempt to unify quantum mechanics and general relativity), quantum statistics (the study of particles in statistical mechanics), and various quantum field theories.

Modern Cultural Connections

Before the quantum hypothesis was introduced, the central concepts of physics were: (1) all matter consists of discrete particles with properties of gravitational mass and electrical charge, (2) light is a continuous electromagnetic wave that travels at a constant speed, (3) continuous electromagnetic fields are created by discrete charged particles, and (4) local interactions of electromagnetic charges are limited by the velocity of electromagnetic waves.

Quantum theory changed these to: (1) all matter is composed of discrete particles and also has wave properties, (2) light has discrete particle properties, (3) discrete statistical fields exist, and (4) instantaneous nonlocal matter interactions exist, called instant action at a distance.

Modern physics is founded on general relativity and quantum mechanics, two theories that have passed many comprehensive and strenuous tests for validity. They contradict each other, however, when physicists attempt to join equations of quantum mechanics (involving the strong, weak, and electromagnetic forces) with equations of general relativity (associated with the gravitational force).

Physicists are searching for what is called a grand unified theory, or “theory of everything,” that would combine the four fundamental forces of nature into one all-inclusive force. This would form a far more complete understanding of space-time (a four-dimensional system consisting of three spatial coordinates and one coordinate of time) than is currently available. So far, none has been found.

Quantum theory explains the dynamics seen in the subatomic world. It has also contributed greatly to other sciences, such as applied chemistry and nuclear physics, and various technologies, such as the laser, electron microscope, and computer. Its complexity makes it difficult to use, however, because it contains an extraordinarily large number of subatomic particles, and requires numerous constants for its various equations.

Specifically, the concept of the quantum—the smallest unit in which energy can be measured—led to the field of particle physics. A classification of elementary particles, called the standard model, is used within particle physics. It describes the strong force, the weak force, and the electromagnetic force using gluons; W−, W+, and Z bosons; and photons, respectively. The standard model also contains 24 fundamental particles (12 particles and their 12 antiparticles).

In conclusion, some historians have written that Planck himself did not think these quantum jumps actually existed and that his quantum hypothesis was contrary to laws of classical mechanics and classical electrodynamics. Others point to Einstein as the physicist who first recognized the essence of quantum theory, arguing that Einstein was the first to identify the quantum discontinuity. Credit for the beginnings of quantum theory is generally given to Planck, with Einstein cited as the first to apply it to scientific pursuits. In reality, a field as complex as quantum theory took many scientists to develop it into a major branch of physics.

Primary Source Connection

The following article was written by Rushworth M. Kidder, a senior columnist for the Christian Science Monitor until 1990, when he founded the Institute for Global Ethics. Kidder is the author of Reinventing the Future: Global Goals for the 21st Century. Founded in 1908, the Christian Science Monitor is an international newspaper based in Boston, Massachusetts. The article, originally published in 1988, describes how quantum mechanics has the potential to change the way the world is perceived, as well as how subatomic particles behave.


CAMDEN, MAINE—CALEB THOMPSON may not realize it, but he's preparing to encounter quantum mechanics—not in some far-off never-never land, but on an ordinary Tuesday here at Camden-Rockport High School. Surrounded by the intent faces of his lab partners in Mr. Bentley's second-period honors physics class, Caleb drops a steel ball bearing down a curved aluminum track. When it collides with a second ball bearing, both drop to the floor, landing on a carbon-paper-covered sheet of typing paper. Fellow student Peter Killoran, down on all fours with pencil in hand, labels the marks as the ball bearings drop.

“We're trying to prove that momentum is conserved,” Peter explains. When enough marks have been recorded, he and his partners will compute distances and angles and analyze the vectors, or patterns. Their purpose: to demonstrate Newton's third law of motion, one of the landmarks of classical physics.

But why, in the age of quantum mechanics, string theory, and the search for a possible fifth and sixth force, are these students still learning about Newton's laws? Sitting at his cluttered desk at the front of the room, William N. Bentley explains.

“This is the bridge,” says Bentley, who has taught the highly regarded physics course here for five years. “This is the first time [the students] consider subatomic particles.”

The ball bearings, of course, are hardly subatomic. But their interactions, he notes, behave according to the same Newtonian laws that govern the collisions of everything from meteors to quarks. To bring home that point, Bentley will have his students perform another vector analysis in a few weeks. At that time, however, they'll do it on a photograph showing the tracks left by subatomic particles after they collide in a particle accelerator. And finally, at the very end of the school year, they'll arrive on the doorstep of quantum mechanics.

“All of a sudden,” Bentley says, “we'll see the limitation of Newtonian mechanics.”

Like millions of 20th-century high school students, Bentley's class has grown up in a century shaped by the Newtonian world view. From their earliest years, in and out of science classes, these students have been taught that the material world has an objective reality. They've been told that its objects possess predictable attributes unaffected by observers. And they've been assured that those attributes can best be understood by reducing matter to its fundamental constituents.

Caleb and Peter may not go on to careers in physics. But one thing is sure. The world they will inhabit—the world of the 21st century—will be profoundly affected by quantum mechanics.

Not only will they benefit from the tangible results of this science—results that will presumably extend well beyond today's solid-state electronics, laser beams, superconductivity, holography, and other cutting-edge quantum technologies. They may also be influenced by the intangibles of this radically different way of comprehending the universe.

What will be the cultural impact of those intangibles? Will quantum mechanics, as it becomes more widely discussed, taught, and understood, change their world view? Will it, like the discoveries of Copernicus, Galileo, Newton, and Einstein, have a revolutionary effect on humanity's sense of itself and its universe? Will it produce what is often described as a “paradigm shift”—a fundamental change in the patterns of thinking that shape our vision of reality? Will quantum physics begin to produce a kind of quantum metaphysics? And if so, how might that metaphysics shift mankind's personal, social, political, and theological outlook?

Teasing out the meaning

THESE days, practicing physicists are divided about the relevance of such questions. Some stick to their mathematical formulas and their laboratory experiments, choosing not to think much about the philosophical ramifications. Others insist that the discoveries of quantum mechanics have been misappropriated—especially by proponents of Eastern religions eager to prove their own theses.

But still others feel that the weirdness of quantum mechanics cannot be brushed aside. They do worry, however, that generalizations about its deeper significance have been stretched too far, too fast. We simply don't know enough, they say, to justify some of the more heady assertions about matter, mind, and the universe.

Nobel laureate Murray Gell-Mann, professor of theoretical physics at the California Institute of Technology, is particularly wary. One of the most highly regarded thinkers about quantum matters, he decries the misuse of language among those who would tease out deep meanings from quantum physics.

Physicists, he observes, often turn to everyday language for analogies in their attempts to explain what they're encountering. That's fine, according to Dr. Gell-Mann. The problem arises, he says, when nonphysicists try to make a reality of what, for the physicists, is essentially just a metaphor.

Case in point: the idea that the fundamental particles have no size (or “extension”) at all. “It's just words to say they don't have extension,” says Gell-Mann.

Fellow Nobel Prize-winning physicist Steven Weinberg, who describes himself as “the most unphilosophical of physicists,” agrees—and explains why. “One of the lessons of quantum mechanics,” says Professor Weinberg from his office at the University of Texas, “is that the ideas that are useful in describing nature at the level of ordinary life may not even be meaningful when you get down to the level of the subatomic world.”

“I don't think it's correct to say that particles have a definite size,” he says, “And I don't think it's correct to say they don't.”

The real problem, according to Nobel laureate Sheldon Glashow of Harvard University, lies in language itself, which “just doesn't work” to explain the complexities of quantum mechanics. “There's no reason to expect [our ordinary] language to have any relevance to the way things are,” he says.

That, however, is the kind of puzzle that boggles the non-scientist's mind—and causes even the physicists interviewed for this series to admit, in many cases, that they themselves don't really understand quantum mechanics.

Most, however, would agree with the assessment of a Brandeis University historian of science, Samuel Schweber. He describes quantum mechanics as “a deep revolution.” It is, he says, “so deep that—in some sense, in having affected so many different areas of thought and of intellectual life—we really have not assessed as yet the full impact of it.”

If scientists, sociologists, and historians ever do assess that impact, what will they find? What are the philosophical elements of quantum physics that could reshape mankind's world view?

Classical physics assumes that, somewhere outside ourselves, there is a fixed and objective universe. It may be immeasurably large or small. It may be inextricably entangled and complex. But it's really there. All that keeps us from knowing it in detail are the limitations of our measuring devices and our computational abilities.

Quantum mechanics takes a different view. The material world is not lumpy but wavelike, not made of things as much as of fields. Moreover, its attributes seem to vary depending on the vantage point of the observer, somewhat like the location of a rainbow.

That does not mean, however, that quantum physics denies the existence of matter. “Nobody's saying there's not a world out there,” says physicist John Ellis of the European Laboratory for Particle Physics (CERN) in Geneva. The well-defined laws governing that world may surprise us in their strangeness. But it's a world, all the same.

Princeton University physicist Robert Dicke agrees—although he emphasizes that reality depends not simply on an objective world but on our view of it. “Matter has a very real, solid existence,” says Dr. Dicke, an emeritus professor who has taught quantum mechanics to generations of students, “as far as our view of it goes. But what is out there may not be nearly so well defined and solid and uniquely characterizable as our view of it.”

What are the ramifications of this different sense of reality?

“I think what is new,” says CERN physicist John Bell, “is the idea that what I call ‘muddle’ is permanently tolerable.”

This “muddle,” or confusion, he says, is not simply “a phase in the construction of the theory that is to be transcended and replaced by some deeper models in another phase of the theory.

“Somehow we have come to the end of the human capacity to form sharp pictures of what is going on,” he adds, “and more and more we will have to rely on recipes that we don't understand.”

That tolerance for “muddle,” says Bell, is “the most characteristic feature of the orthodox school of quantum mechanics.”

This lack of clarity, in fact, has its quantum counterpart in the electron. Once thought of as solid, definite objects, electrons are now pictured more as clouds or smears surrounding the nucleus. The result: At the quantum level, the edges of things are inherently fuzzy.

Is that true at human scale as well? It must be, argues Princeton University professor David Gross—since all the objects we encounter are made of atoms, and the outsides of all atoms consist of fuzzy clouds of electrons.

And as with objects, so with people. “You're a bit smeared out on the edges,” he says with a chuckle, “but you don't notice it.”

If matter is not quite the clear, basic, and all-determining thing that the classical world view suggests, then what is real? Some physicists point to something they describe—with hesitation, and sometimes even embarrassment—as mind or consciousness.

It's a concept that arises most prominently in discussions of the “observer-created reality”—where the presence of an observer is thought to cause matter to behave in ways it does not otherwise behave. Professor Gross puts the point most simply. “There isn't a real world out there that you can observe,” he says, “without changing it.”

That's a far cry, however, from saying that only the observer is real—or that everything is only as real as you make it. In fact, what physicists mean by an observer-created reality has to do with probabilities and measurement theory. The fact that the world shifts when it is measured means simply that you cannot make precise predictions about it. That's because the world is composed of probabilities rather than definite, fixed states.

Why does the world change when you observe or measure it? A physicist explains it by saying that a measurement causes this so-called “probability amplitude”—a mathematical way of picturing the probability as though it were a wavy line on a graph—to collapse into a much straighter line.

But measuring implies a measurer. Hence the observer becomes vitally important. As the presence of so many different interpretations of quantum physics suggests, however, the extent of the observer's importance remains a source of considerable discomfort for physicists.

“What is the meaning of the probability amplitude?” muses theoretical physicist Freeman Dyson of the Institute for Advanced Study in Princeton, N.J. “Does it just describe our state of ignorance, or does it describe something real?”

“Those questions really aren't answered,” he continues. “And so I think in a certain sense that it's true to say that it has brought mind and consciousness back to the center of things—that you won't really understand quantum mechanics deeply unless you also understand the nature of mind.”

Then is man, as the possessor of that mind, somehow restored to a central role in the universe—a role from which he was first banished when Copernicus found that the earth was not the center of the solar system? For most physicists, that's stretching things. Quantum cosmology, for example, points to what astrophysicist George Helou calls “the ongoing decentralization, a deemphasis of the importance of man.” In an expanding universe, one can argue that every point is at the center—in only a tiny fraction of which stands an observer.

“We're not only not the center,” says University of Chicago astronomer David Schramm. “We're just all part of the whole, riding with the wave of the big bang. We're certainly not [in] a significant role.”

Questions about man's role

BUT questions about mankind's place in the universe continue to reverberate. “If it could ever be shown that the human mind is somehow a quantum mechanical effect like a semiconductor or a laser,” says Nick Herbert, a physicist and science writer, “and that these concepts not only apply as metaphors but as direct descriptions of the way we are inside, then I think that would be a tremendous revolution.”

“My guess is that the mind is somehow connected with these ideas,” he adds.

Why can matter be described only in terms of probabilities? Because, physicists say, it's governed by random activity.

“Quantum mechanics asserts that at the very foundations of existence there is an essential randomness,” says Rockefeller University physicist Heinz Pagels.

And that's a sharp break from classical physics.

In the world of Newtonian mechanics, the paths of particles and their velocities are the determining causes. Discover the initial paths and velocities of each particle, and you hold the key to every subsequent action and reaction in the universe: You could, in theory, explain everything.

But even Newton, who occupied himself with theological studies in his later years, recognized that an ultimate first cause must have produced those initial conditions. To be sure, actions might appear random. But trace them back far enough and you discover the will of the creator.

Quantum mechanics takes a different view. Trace actions back to their sources, and you discover not God but randomness. Even the big bang is viewed as a random event. There was a tiny but non-negligible probability that it might happen. And all of a sudden, in an entirely random way, it did.

Such probabilities challenge a commonplace of classical physics: the idea that the laws governing matter as we know it are the only conceivable laws. Some physicists point out that if a few of the basic constants of the physical universe—the ratio of the mass of the proton to that of the electron, for instance—were just slightly different, the universe would be a resoundingly different place.

Yet are those the only conceivable constants? That's a question that puzzles John Preskill, a California Institute of Technology physicist. “Is it possible,” he asks, “that [such constants] really were, at some stage in evolution of the universe, sort of picked out from some probability distribution which was determined by the very early quantum state of the universe?” If so, then there is a probability that different constants could also have arisen. If the many-world's thesis holds, in fact, they might already be in place in other universes.

So is there another universe out there where, say, the fine structure constant—a number based on electrical charge, Planck's constant, and the speed of light that is central to many calculations in physics—is something other than 1/137.0365? If there is, most physicists agree, it won't matter to us: We'll never be in a position to observe it.

In another way, however, randomness does affect us: when a particle, for no other reason than that it has a probability of doing so, suddenly breaks loose from an atom and flies away, as happens in radioactive decay. Once the particle flies away, its motions are governed by the strictly ordered laws that characterize the visible universe.

But how can such order grow up out of randomness? The analogy of an opinion poll helps. On a nationwide scale, one can discover, say, that 80 percent of the public approves of a particular presidential candidate. That fact tells you nothing about your next-door neighbor, however: He may be 100 percent against the candidate. And so on up and down your street: Ask resident after resident, and you get what may appear to be a chaotic distribution. Only when you take the nation as a whole, or at least a representative sample of it (as most polls do), can you get an accurate assessment.

In a similar way, randomness tells you nothing about how a particular particle will behave. It may have a 20 percent chance of suddenly disappearing from its present location and showing up elsewhere—even though it would have to pass through an apparently insurmountable “wall” of energy to get there.

That effect may be random. But it's by no means insignificant. It's known as “quantum tunneling,” and it underlies the operation of semiconductors. It's the principle, in other words, upon which transistors and computer chips are constructed.

Why do some particles “tunnel” and others don't? The choice, it seems, is purely random. Yet the result, reined in by technology, produces the extreme precision of the modern-day computer.

If a single particle can tunnel through such a barrier, why can't a collection of them—a whole tennis ball, for example, or a whole human body?

Again, it's a matter of probabilities. “If I threw a ball against the wall,” says astronomer Schramm, “the probability is much, much higher that the ball will bounce back than that the ball will go penetrating through the wall. But the probability of it going through the wall is not absolutely zero.”

As with balls, so with people. “In principle,” says Dr. Gross, gesturing across the dining room at the Princeton University Faculty Club, “there is some non-negligible probability that I'm located on the other side of the room. But that probability is so small that one would have to wait many lifetimes of the universe for me to jump over there and back again.”

When Gross says “jump,” he's choosing his words carefully. One of the basic ideas of quantum mechanics is that energy comes in unbreakable packets. A particle, in going from one state to another, does not move continuously between them. It's instantly translated, “jumping” to its next state. Hence the metaphor of a quantum leap. “Quantum mechanics means things are discrete,” says Fermilab cosmologist Michael S. Turner, adding that “nothing in nature is continuous.”

“If there's anything that mathematical logic says,” asserts John A. Wheeler, a Princeton emeritus professor, “it's that you can't have a continuum. It says that that's an illusion, a myth, an idealization.”

So are space and time also “discrete”—coming in chunks, rather than in a steady flow? Is life not a thread but a collection of blips?

If so, says Dr. Turner, then “we live on a lattice, not on a continuum. As you get down to the most fundamental level, you can either be here or there, but nowhere in between.” Why? Because the ability to move from state to state is determined not only by the particle itself but by the parameters of the entire system.

Not surprisingly, then, various sciences are beginning to study systems in their entirety. These days, says Harvard physicist Roy Schwitters, “people are learning how to deal mathematically with assemblies of matter on a large scale: They can do physics in very complex and chaotic systems.”

That has practical significance, he says, in allowing research into such things as superconductivity. But it also has metaphysical ramifications—as Stephen Toulmin, a Northwestern University historian of science, points out.

“The world as we in fact encounter it does not consist of atoms each of which behaves in an entirely independent way,” he says. Instead, it's a world of “food chains, ecological systems, organs, organisms, families, communities.” To understand the world properly requires what he calls “chain thinking.”

It's no accident, says Dr. Toulmin, that the more interdisciplinary sciences like anthropology and ecology are currently held in much higher regard than in earlier periods, when classical physics reigned. Why? Because, rather than breaking the world into isolated parts—looking for the basic building blocks of matter, for instance, as physics has done for so many decades—these sciences attempt to study a complex system as a whole.

What effect beyond the sciences?

What effect could such thinking have beyond the sciences? One impact might be political. Historian Toulmin predicts that “the sense of every individual political nation as having its own absolute sovereignty will [give] way to a much deeper and more universal appreciation of the interdependence of the social units.”

But how much do these ideas owe to quantum physics?

“The changes that are specifically associated with quantum mechanics are very difficult to disentangle from the larger changes of which they're a part,” Toulmin says. “I don't want to talk about the wisdom of the East—I don't want to talk claptrap phrases about the end of linear causality. All the same, there is a general sense that we learn as much about the world from ecology as we do from breaking it up into tiny little bits.”

That might be expected in studying biology, he observes. But Toulmin extends the idea to physics as well. “We understand how the actions of different parts of the universe are intelligible,” he says, “only if we recognize their interdependence.”

“The question is how we see all these different levels of organization in relation to one another—not how do we reduce them all to a single so-called fundamental level,” he adds. “Fundamental always means totally fragmented.”

For Toulmin, the distinction between “chain thinking” and “fundamentalism” even extends into religious phenomena. “Fundamentalism as it exists in the religious life of America,” he notes, “is based on the idea that every individual is saved separately.”

By contrast, he says, “the general view in the history of Christianity was that it was always communities that were Christian—and it is only within the congregation that the individual has any hope of salvation.”

So as scientific thought turns away from a preoccupation with fundamental particles and toward complex systems, will it provide a world view in which communities of all sorts—ecological, social, and religious—are seen to operate as wholes?

And will that view, ultimately, extend to the concept of mind itself—seen not as a collection of brain cells or even hemispheres, but as an integrated whole?

“I hope that some generation of humanity will find that the laws of physics and the laws of psychology begin to overlap,” says Freeman Dyson, at Princeton. But, like so many physicists interviewed for this series, he urges caution. At present, he says, that possibility is remote.

“When I see people saying that this is already the case,” notes Professor Dyson, “I see that they're using what for me are very flimsy arguments. And I react against that.”

Where, then, does this leave Caleb and Peter and Mr. Bentley's ball bearings? These young men will be well into their chosen careers when quantum mechanics turns 100 years old in the year 2025. By then, it will have had a chance to sink more deeply into public thought. What will the world be like?

It may be a world where muddle is accepted as inherent in nature. If so, it could be a world which, in the words of John Bell at CERN, will “encourage the people who are not very disciplined in thinking to feel that a kind of free fantasy is the right approach to intellectual affairs.”

On the other hand, it could be a world in which rigid determinism of all sorts—hereditary, economic, racial, medical, educational—is replaced by an insistence that the observer, and especially the self, has a much larger role to play. If the universe—including, perhaps, the human body—is seen as not a ticking clock but a forum for limitless possibilities, so mankind may also be seen that way.

That could, on the one hand, lead to an intensity of individualism that would end in irresponsibility—a point raised by University of Texas physicist Joe Polchinski. If the many-worlds interpretation takes hold, he says, “people may say, ‘Well, it really doesn't matter what I do, since there's another universe where I made the opposite choice—so if I can't change anything, I may as well do what I please.’”

He characterizes such a view as “baloney”—a gross distortion of the implications of quantum mechanics. But he worries, nevertheless, about the cultural consequences of that distortion.

On the other hand, the perception of infinite possibilities could free mankind from the limits of fatalism and encourage greater individual responsibility. If reality, after all, is in some way shaped by the observer, the individual may discover that he or she is not simply a spectator but a participant.

Finally, if humanity is seen to be organized into complex systems, this fact could have profound effects on thinking. On one hand, it could engender despair over the loss of a private, exclusive, and ordered self. On the other hand, however, it could mean that wholeness and completeness, rather than analysis and fragmentation, would become the standard pattern for high-order thinking.

However the view changes, it may well have the effect of breaking the hammerlock of naive materialism. In the 21st century, says West German physicist Herwig Schopper, director general of CERN, “the changing world view will be in a way to abandon materialism and to go to what's more abstract.”

For Caleb and Peter, the intellectual impact could be enormous—much like that described by Fermilab director Leon Lederman when he first encountered the world of quantum mechanics. “The goose pimples had goose pimples,” recalls Dr. Lederman, sitting in his office beside a stuffed, two-foot-tall Albert Einstein doll.

But however heady the ideas, Caleb and Peter will also have to negotiate what will still appear to be a “real” world in the 21st century. It will still be a world of cars as well as quarks, bridges as well as bosons.

“When I'm driving over a bridge,” says Fermilab cosmologist Edward W. Kolb, “I don't worry about the probability that I'm going to tunnel through the bridge rather than drive over.” However much you ponder quantum mechanics, he says, you still have an ordinary life to lead.
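Kolb's confidence can be given a rough number with the standard WKB tunneling estimate, T ≈ exp(−2κd) with κ = √(2mΔE)/ħ. This is a back-of-the-envelope sketch, not anything from the article: the car's mass, the barrier "height" ΔE, and its width d are all invented illustration values.

```python
import math

# Hedged WKB-style estimate of a car tunneling through a barrier:
# T ~ exp(-2*kappa*d), kappa = sqrt(2*m*dE)/hbar.
# All three physical inputs below are assumed illustration values.

hbar = 1.055e-34   # reduced Planck's constant, J*s
m = 1000.0         # mass of a small car, kg (assumed)
dE = 1000.0        # energy deficit of the barrier, J (assumed)
d = 10.0           # barrier width, m (assumed)

kappa = math.sqrt(2 * m * dE) / hbar
exponent = 2 * kappa * d            # ~1e38; exp(-exponent) underflows
log10_T = -exponent / math.log(10)  # so report the base-10 exponent

print(f"tunneling probability ~ 10^({log10_T:.3e})")
```

The probability comes out around 10 to the power of minus 10³⁸: zero for every practical purpose, which is why bridges, like lawns, can be treated classically.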

“You have to mow your lawn,” he says with a sigh, “whether you think you know the origin of the universe or not.”

Rushworth M. Kidder

Kidder, Rushworth M. “How Might Quantum Thinking Change Us?” Christian Science Monitor (June 16, 1988).

See Also Physics: Heisenberg Uncertainty Principle; Physics: Maxwell's Equations, Light and the Electromagnetic Spectrum; Physics: The Standard Model, String Theory, and Emerging Models of Fundamental Physics; Physics: Wave-Particle Duality.






William Arthur Atkins
