Philosophy of Physics
The philosophy of physics investigates the logical, conceptual, metaphysical, and epistemological foundations of the physical sciences, especially fundamental physics. It is concerned both with general issues, such as the subject matter and aims of physics, the nature of physical laws, and the direction of time, and with issues specific to particular theories, such as the measurement problem in quantum mechanics and the status of the second law in thermodynamics. The philosophy of physics is enormously relevant to traditional metaphysics because it addresses the implications of physical theories for fundamental ontology, the natures of time and space, laws, causal relations, counterfactuals, and natural kinds. This encyclopedia includes entries on general, specific, and metaphysical issues in the philosophy of physics, so this entry serves mainly as a guide to the main problems in the philosophy of physics, directing the reader to more specific articles.
The best short characterization of physics derives from Aristotle's view that physics is the science of motion and the causes of motion of material bodies; paradigmatically, the motions of planets, projectiles, and pointers. The primary aim of fundamental physics has been to find a true theory (or theories) that specifies a fundamental ontology, spatiotemporal structure, and laws, and that provides a complete (or as complete as possible) account of the motions of such material bodies. Many natural phenomena (e.g., the tides, the weather, rainbows, the growth of plants, the movements of animals, light, and even mental phenomena) either involve the motions of material bodies or are the causes of motions of material bodies. It follows that the scope of physics includes almost everything. A true theory that accounted for the motions of all material bodies would be a theory of everything, or at least of everything capable of making a difference to the positions and motions of material bodies.
The possibility of there being a complete physical theory was given a tremendous boost by the development of Newtonian or classical mechanics (see the "Classical Mechanics, Philosophy of" entry). The ontology of classical mechanics consists of dimensionless particles that possess inertial mass and certain other intrinsic properties and that move in a three-dimensional space in accordance with certain laws. Macroscopic material bodies are identified with more or less stable configurations of dimensionless particles, and the motion of a particle is described in terms of the change of its spatial position (or relative spatial position) over time.
The motion of a particle (and so the motions of material bodies) is determined by the forces acting on it via the single dynamical law F = m(p)a, where F is the total force (the vector sum of all forces acting on particle p) on p, m(p) is the inertial mass of p, and a is p's acceleration. A free particle (one on which the total force is 0) moves at a constant velocity. Newtonian forces are determined by the intrinsic natures of particles (their masses, charges, and so on) and their relative positions. For example, the attractive gravitational force two particles exert on one another is given by F = Gm₁m₂/r², where m₁ and m₂ are the gravitational masses of the two particles and r is the distance between them. Classical mechanics was enormously successful in accounting for the motions of material bodies in circumstances where the total force on a body could be (approximately) determined, as in the motions of the planets, comets, projectiles, and so on.
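As a minimal numerical sketch (the figures for G and the Earth-Moon system are rounded textbook values, not taken from the text), the gravitational force law and F = ma together yield the Moon's acceleration toward the Earth:

```python
# Newton's law of gravitation, F = G*m1*m2 / r^2, with rounded textbook values.
G = 6.674e-11        # gravitational constant, N m^2 / kg^2
m_earth = 5.97e24    # mass of the Earth, kg
m_moon = 7.35e22     # mass of the Moon, kg
r = 3.84e8           # mean Earth-Moon distance, m

# Magnitude of the attractive force each body exerts on the other
F = G * m_earth * m_moon / r**2

# By F = m*a, the Moon's acceleration toward the Earth
a_moon = F / m_moon

print(f"F = {F:.2e} N, a_moon = {a_moon:.2e} m/s^2")
```

The computed acceleration (roughly 2.7 × 10⁻³ m/s²) matches the Moon's observed centripetal acceleration; agreement of this kind is what made classical mechanics so successful.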
Classical mechanics is usually understood as supporting determinism. This means, roughly, that the state of the universe at time t together with the dynamical laws determines the state of the universe at any other time. Pierre Simon de Laplace made this vivid by imagining a supreme intelligence that ascertains the state of the universe at one time and then, knowing the laws of mechanics, is able to predict the state of the universe at any other time. There are subtle issues concerning the relations between determinism and prediction and also concerning whether classical mechanics is genuinely deterministic: it is deterministic only given certain qualifications concerning the nature of the force laws and the assumption that the system is isolated (see the "Determinism and Indeterminism" entry). Many philosophers think that the issue of the truth of determinism has significant implications for issues concerning free will (see the "Determinism and Freedom" entry) and, more generally, the place of mind in nature.
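Laplacean determinism can be illustrated with a toy simulation. The oscillator force law F = -kx and all numerical values here are hypothetical choices for illustration, not from the text; the point is only that the state (position and velocity) at one time, together with the dynamics, fixes the state at every later time, so two evolutions from the same initial state agree exactly.

```python
# A toy deterministic Newtonian system: a 1-D harmonic oscillator (F = -k*x)
# stepped forward by Euler's method. Identical initial states plus identical
# laws yield identical later states.

def evolve(x, v, k=1.0, m=1.0, dt=0.001, steps=5000):
    """Return (position, velocity) after integrating F = m*a with F = -k*x."""
    for _ in range(steps):
        a = -k * x / m   # acceleration from the force law
        x += v * dt
        v += a * dt
    return x, v

state1 = evolve(x=1.0, v=0.0)
state2 = evolve(x=1.0, v=0.0)  # Laplace's intelligence re-running the universe
print(state1 == state2)        # prints True: the dynamics leaves no slack
```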
During the latter half of the nineteenth century classical mechanics was extended to include light and other electromagnetic phenomena. This involved introducing electromagnetic fields and dynamical equations (Maxwell's equations) that described the dynamics of electromagnetic fields and the interactions between the motions of charged particles and fields. Light was understood as a kind of wave disturbance in the aether—a posited substance that was supposed to fill all space and provide the ground for electromagnetic fields. Also, toward the end of the nineteenth century it became increasingly plausible that matter is composed of atoms of various kinds and that these can be identified with Newtonian particles. By the last decade of the nineteenth century the package of Newtonian mechanics, Maxwellian electromagnetic theory, and the atomic theory of matter looked like a good candidate for the sought-after complete theory of the motions of material bodies. Of course this turned out not to be so.
One of the main philosophical discussions inspired by classical mechanics concerns the natures of space and time. Isaac Newton thought of space as a kind of arena in which particles move. Newtonian space is absolute in that its existence and nature are independent of the particles it contains. It is three-dimensional, infinite in each dimension, homogeneous, and Euclidean. The positions of, and so distances between, particles are defined in terms of their locations in space. It follows that for Newton there are distinct possible universes in which all the distances among particles are identical but the positions are translated. Famously, G. W. Leibniz argued that it is difficult to make sense of absolute space. He observed that God would have no reason to place the material contents of space in one region of absolute space rather than another and concluded that absolute space violates the principle of sufficient reason.
The dispute between Newton and Leibniz blossomed into a debate between those (absolutists) who think of space as an independent entity that provides spatial structure and those (relationists) who think of spatial relations between particles as primary. There are famous arguments on both sides. Relationists observe that by the lights of Newtonian physics, absolute position and motion are empirically inaccessible. Empiricist considerations suggest to them that we should not believe that absolute space exists. Absolutists respond that although we cannot determine absolute motion, absolute space is required to provide an adequate explanatory theory including the explanation of possible distance relations and of rotations. Relationists reply by arguing that the empirical content of Newtonian theory is that trajectories are physically possible only if they can be embedded into absolute space and satisfy the Newtonian laws, but that reference to absolute space is merely a convenient fiction. Only spatial relations are real. This debate has survived the demise of Newtonian mechanics and continues in discussions of the interpretations of relativity theories (see the entries "Space in Physical Theories" and "Relativity Theory").
Newton also thought of time as absolute. He suggests that time flows throughout the universe at a constant rate. It is assumed that a free particle traverses equal absolute distances in equal intervals of absolute time, and so free motion measures absolute time. There is also an absolutist/relationist issue concerning the nature of time. Relationists observe that Newtonian theory provides no empirical access to absolute temporal locations. Again, empiricist considerations suggest that physics can do without Newtonian time.
Some relationists claim that the empirical content of classical mechanics involves only facts about temporal sequences of interparticle distances. On a sophisticated relationist account, the laws of classical mechanics specify which sequences of interparticle distances are physically possible (see Julian Barbour's The End of Time and the "Time in Physics" entry for further detail). Exactly how far one can go in dispensing with apparent spatial and temporal structure in favor of spatial and temporal relations while maintaining the empirical core of classical mechanics—or relativity and quantum mechanics—remains a lively topic of discussion.
Another issue concerning time in classical mechanics involves the apparent direction or arrows of time. Many apparently lawful processes, in particular those associated with thermodynamics, are temporally directed. For example, gases diffuse, ice in warm water melts, electromagnetic waves emanate from accelerating charged particles, and the entropy of isolated systems never decreases. In addition, causation, counterfactuals, memory, decision, and so forth are temporally directed. However, the dynamical laws of classical mechanics are temporally symmetric in that for any sequence of particle positions that is in accord with those laws (i.e., is physically possible), the reversed sequence of positions is also physically possible. Where then does the arrow of time come from? Newton seems to have thought of time as possessing an intrinsic direction of flow. But it is hard to see how this flow—whatever "flow" amounts to—can account for the temporal asymmetries. It seems that the solution must lie in physical laws or conditions rather than in the metaphysical nature of time.
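The temporal symmetry of the dynamics can be made explicit; in standard notation (assumed here, not used in the text), for forces that depend only on particle positions, the reverse of any solution of Newton's law is also a solution:

```latex
\text{If } x(t) \text{ solves } m\,\ddot{x} = F(x),
\text{ then so does } \tilde{x}(t) := x(-t), \text{ since}
\qquad
m\,\frac{d^{2}\tilde{x}}{dt^{2}}(t) = m\,\frac{d^{2}x}{dt^{2}}(-t) = F\big(x(-t)\big) = F\big(\tilde{x}(t)\big).
```

Velocities reverse sign under the transformation t → −t, but accelerations and position-dependent forces do not; that is why running any physically possible sequence of positions backward yields another physically possible sequence.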
There has been much work within physics on the problem of reconciling temporally asymmetric processes, in particular those of thermodynamics, with temporally symmetric fundamental laws (see the "Philosophy of Statistical Mechanics" entry). Ludwig Boltzmann observed that most of the micro states compatible with, say, a block of ice floating in warm water are ones that evolve toward the future in accordance with the dynamical laws to ones in which the ice block melts. "Most" here is determined relative to a natural measure on the set of micro states, and Boltzmann understood this to mean that it is very likely that the ice block will melt. However, it turns out that, relative to the same probability measure, it is very likely that the ice block evolved from one that was more melted in the past! This follows from the temporal symmetry of the laws. One response to this problem is that the explanation of temporal asymmetries lies in the macro state of the very early universe. It is posited that this state was one of enormously low entropy (and satisfies certain further conditions) and that there is a probability distribution over the micro states that realize this state. This is called "the past hypothesis" (Albert 2000).
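Boltzmann's measure-based reasoning is standardly summarized by his entropy formula; the notation here is the usual one and is not drawn from the text:

```latex
S_{B}(M) = k_{B}\,\log \lvert \Gamma_{M} \rvert ,
```

where Γ_M is the region of micro states that realize macro state M, |Γ_M| is its volume under the natural measure, and k_B is Boltzmann's constant. "Most" micro states is judged relative to this volume measure: higher-entropy macro states (the melted ice) occupy overwhelmingly larger volumes than lower-entropy ones (the unmelted block).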
It has been argued that it follows from the past hypothesis and the dynamical laws that macroscopic systems that become approximately energetically isolated (e.g., an ice cube in warm water) will satisfy (the appropriate statistical versions of) the laws of thermodynamics. Some philosophers have pursued this idea further and claimed that all of the temporal arrows are ultimately derivable from the past hypothesis and the dynamics. The foundations of statistical mechanics and the relations between fundamental laws of physics and special science laws (see the "Special Sciences" entry) remain controversial philosophical issues.
The idea that the package of classical mechanics, electromagnetic theory, the atomic theory of matter, and statistical mechanics constitutes the complete theory of motion was undermined during the first decades of the twentieth century as it became clear that these theories are incompatible with one another and inadequate as a theory of the very small—atomic structure—and the very big—cosmology. One big problem is that in Maxwell's equations the speed of light appears as a constant of nature. It was thought that this speed is relative to the aether. This suggests that it ought to be possible to measure the absolute velocity of the Earth relative to the aether by sending light rays in various directions. However, experiments designed to measure the velocity of the Earth relative to the aether yielded null results. It appeared that measurements of the speed of light yield the same result no matter the velocity of the source or receiver. Obviously, some modification of classical mechanics/electromagnetic theory was required.
H. A. Lorentz proposed a modification of the Newtonian laws on which clocks and measuring rods that are in motion with respect to absolute space systematically slow down and shorten. As a consequence, although there are facts about the velocities of bodies with respect to absolute space, it also turns out that those velocities cannot be detected. Albert Einstein's special theory of relativity (STR) makes a quite different and revolutionary proposal. It rejects absolute Newtonian space-time as the framework for the motions of matter in favor of Minkowski space-time.
In Minkowski space-time the fundamental notion is that of the space-time interval between events. Einstein posited that the interval between any two events connected by a light ray is 0. This has the consequence that there are no absolute (frame-independent) facts about the elapsed time or spatial distance between two events. It also follows that there are pairs of events (events that cannot be connected by a light ray or by any slower signal) for which there are no absolute facts about their temporal order. Einstein's proposal entails the same phenomena as Lorentz's, but as a result of changing the underlying spatiotemporal structure.
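In standard coordinates (notation assumed here, not given in the text), the interval between events separated by (Δt, Δx, Δy, Δz) is:

```latex
I^{2} \;=\; c^{2}(\Delta t)^{2} - (\Delta x)^{2} - (\Delta y)^{2} - (\Delta z)^{2} .
```

All inertial frames agree on I²: it is 0 for events connectible by a light ray, positive (timelike separation) when a slower-than-light signal can connect the events, and negative (spacelike separation) when nothing can; only in the spacelike case does the temporal order of the events vary from frame to frame.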
The change from Newtonian space and time to Minkowski space-time suggested to Einstein the possibility of accounting for gravitation not as a force between bodies, but rather as a feature of space-time itself. He succeeded in doing this in the general theory of relativity (GTR). The main idea of the general theory is that space-time itself has a geometrical structure that is not Euclidean (i.e., flat) but, rather, depends locally on the distribution of matter and energy. According to GTR, freely falling bodies move on geodesics (the shortest paths between points in space-time), and what counts as a geodesic is given by the geometry. Because gravitation is an effect of space-time in the GTR, not a force as in classical mechanics, it follows that it acts on all bodies in the same way. This is quite different from Newtonian mechanics, in which gravitation is a force that acts the same on all bodies only because inertial mass and gravitational mass are equal. Where in Newtonian mechanics space is an inert arena, in the GTR space-time is a dynamical entity that changes over time through interactions with matter. Both the STR and the GTR are spectacularly successful in their empirical predictions (see the "Relativity Theory" entry).
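The local dependence of geometry on matter and energy is standardly expressed by Einstein's field equations, given here (without the cosmological constant) as a sketch in conventional notation rather than as part of the original text:

```latex
G_{\mu\nu} \;=\; \frac{8\pi G}{c^{4}}\, T_{\mu\nu} ,
```

where G_{μν} encodes the curvature of space-time and T_{μν} the local distribution of matter and energy. Matter thus shapes the geometry, and the geometry in turn fixes the geodesics along which freely falling bodies move.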
The STR and the GTR have been the objects of much discussion in the philosophy of physics. Among the main issues are: paradoxical scenarios, for example, the twin paradox and the possibility of closed causal loops and apparent time travel; the extent to which the metric of space-time is a real fact or is, to some extent, conventional (see the entry on "Conventionalism"); descendants of the absolutist/relationist dispute within relativistic frameworks; the formulation and viability of determinism within relativity theory (see the "Hole Argument" entry); and the compatibility of relativity and quantum mechanics.
The other major failure of classical mechanics/electromagnetic theory concerned its inadequacy as an account of atomic structure and of the interaction between atoms and light. According to these theories, atoms should be unstable. Further, it was found, contrary to these theories, that matter emits radiation only with certain specific frequencies, that light behaves in particle-like as well as wave-like ways, and that electrons behave in wave-like as well as particle-like ways. Over the first third of the twentieth century a novel theory—quantum mechanics—was developed to account for these and many other phenomena. In quantum mechanics the state of a system at t is characterized by a wave function Ψ(t).
Ψ(t) specifies the values of certain "observables" (position, momentum, spin, and so on) and the probabilities of obtaining various measurement results. A novel feature of quantum mechanics is that Ψ(t) specifies the values of only some observables; for example, if it specifies the value of x-spin, say spin up (in which case it is said to be an eigenstate of x-spin with value spin up), it specifies no value for other spin observables (e.g., y-spin). This is an instance of Werner Heisenberg's uncertainty principle. Ψ also specifies the probabilities of the results of measurements of other spin observables. If Ψ₁ is an eigenstate of observable O with value v₁ and Ψ₂ is an eigenstate of O with value v₂, then the superposition c₁Ψ₁ + c₂Ψ₂ is a well-defined state that specifies no value for O but says that the probability of a measurement of O yielding value v₁ is |c₁|². Ψ(t) evolves deterministically by Schrödinger's law except when measured. When measured, Ψ collapses probabilistically to an eigenstate of the measured observable.
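The probability rule described here is the Born rule. In standard notation (assumed, not drawn from the text):

```latex
\Psi = c_{1}\Psi_{1} + c_{2}\Psi_{2}, \qquad
\Pr(O = v_{i}) = \lvert c_{i} \rvert^{2}, \qquad
\lvert c_{1} \rvert^{2} + \lvert c_{2} \rvert^{2} = 1 .
```

So a measurement of O on the superposition yields v₁ with probability |c₁|² and v₂ with probability |c₂|², even though the superposed state itself assigns O no value.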
Quantum mechanics is beset with puzzles. The dominant way of thinking about quantum mechanics—the Copenhagen interpretation—holds that an observable possesses a determinate value only when the state is an eigenstate of that observable. What does it mean for an electron to possess a position but no determinate momentum (or the other way round) and yet for there to be a probability of a measurement yielding a particular value? It turns out that, in typical (nonmeasurement) interactions, a macroscopic system will evolve into a state that is not an eigenstate of ordinary properties. This is the situation of Erwin Schrödinger's cat, which ends up in a state that is not an eigenstate specifying whether it is alive or dead (see the "Quantum Mechanics" entry). What can that mean?
Further, that measurement appears in the fundamental laws is immensely implausible and completely unsatisfactory without a precise characterization of measurement. There is also the novel feature that typical quantum states are nonlocal. As Einstein observed and John Bell demonstrated (see the entries on "Einstein, Albert"; "Bell, John, and Bell's Theorem"; and "Non-locality"), there are quantum states involving pairs of particles for which a measurement on one of the pair instantaneously changes the probabilities of certain measurement results for the other particle. This appears to be a kind of influence at a distance that seems incompatible with special relativity's apparent prohibition on superluminal causal influences. Whether or not the conflict is genuine is a subtle issue.
For most of its history and up until the present, these problems encouraged an instrumentalist construal of quantum mechanics (see the entries on "Scientific Realism" and "Copenhagen Interpretation"). Instrumentalism amounts to giving up the ambition of a complete true theory of motion. However, in the last few decades a number of realist interpretations of quantum mechanics have been proposed. These include Bohmian mechanics, many worlds/minds theories, and spontaneous collapse theories. Each of these interpretations specifies an explicit ontology (that interprets the wave function realistically and may include other items) and laws governing that ontology that yield results matching (or approximately matching) the predictions of orthodox quantum theory. In some, such as Bohmian mechanics, the dynamical laws are completely deterministic, whereas in others, such as the GRW collapse theory, they are probabilistic. Because these interpretations are empirically equivalent (or approximately empirically equivalent), they provide an interesting real example of theory underdetermination (see the entry on "Underdetermination Thesis, Duhem-Quine Thesis").
Among the notable features of realist interpretations of quantum mechanics is the difficulty of squaring them with relativity theories. Currently, there is no satisfactory quantum version (realist or not) of general relativity. Producing such an account is one of the urgent problems of contemporary physics. Less often appreciated is the difficulty of reconciling quantum mechanics with Einstein's Minkowski formulation of special relativity. A realist understanding of the wave function seems to require (because of nonlocal states) more space-time structure than Minkowski space-time provides. Interpretations of quantum theory and their connections with relativity will be of central concern in the philosophy of physics in the twenty-first century.
See also Bell, John, and Bell's Theorem; Classical Mechanics, Philosophy of; Conventionalism; Copenhagen Interpretation; Determinism and Freedom; Determinism and Indeterminism; Einstein, Albert; Hole Argument; Non-locality; Philosophy of Statistical Mechanics; Quantum Mechanics; Relativity Theory; Scientific Realism; Space in Physical Theories; Special Sciences; Time in Physics; Underdetermination Thesis, Duhem-Quine Thesis.
Albert, David Z. Quantum Mechanics and Experience. Cambridge, MA: Harvard University Press, 1992.
Albert, David Z. Time and Chance. Cambridge, MA: Harvard University Press, 2000.
Maudlin, Tim. Quantum Non-locality and Relativity. Oxford, U.K.; Cambridge, MA: Blackwell, 1994.
Barry Loewer (2005)