Particle Physics, Elementary
Elementary particle physics is the investigation of nature at a level below current understanding. Its driving questions are as follows: What are the basic constituents needed to build everything that is observable? How do these constituents interact with each other? What is the relationship among the constituents and the interactions? This quest has progressed from everyday objects to molecules, molecules to atoms, atoms to electrons and nuclei, nuclei to protons and neutrons, and protons and neutrons to quarks. Does this progression to smaller and smaller components go on forever, or is there in the end a single fundamental particle? Current understanding of elementary particle physics is expressed in what is called the Standard Model. The universe consists of quarks and particles related to electrons called leptons. These basic constituents interact under the influence of only four forces: the strong force binding quarks to make protons, the electromagnetic force holding electrons to protons to give atoms, the weak force responsible for radioactive decay, and the gravitational force tying the Earth to the Sun. Why do these particular forces and constituents exist? Are these constituents fundamental, or are they made of more basic objects? Are the four forces fundamental, or are they different attributes of a more basic force? Are the constituents and interactions observed only different patterns in the basic geometry of space and time? The ultimate goal of elementary particle physics is to have one single explanation for everything.
The process of asking and answering questions is the most basic human endeavor. People design experiments and decide which measurements to make. People determine the implications of those measurements. People imagine theories that they connect to reality through their interpretation of these measurements. In the ensuing competition among ideas, the theory accepted as being closest to underlying reality is the simplest theory that explains existing measurements, predicts the results of new measurements, and suggests experiments to test those predictions. Since elementary particle physics is an expression of the human desire to understand and control nature, its record is as old as recorded human history. In the past, elementary particle physicists have been called philosophers, natural philosophers, chemists, or physicists. These scientists became elementary particle physicists by searching for a new level of reality underlying the complex behavior of matter at the frontiers of their knowledge.
Because the fundamental questions remain the same, history provides a framework that helps to understand the current perspective of elementary particle physics.
Illustrations of Elementary Particle Physics from the Far Past
Ancient Greece: Seeking Unification
The desire to understand the universe in terms of its constituent particles, its fundamental interactions, or the geometry of space and time is illustrated by three rival elementary particle theories from ancient Greece. One focused on the constituents and held that everything could be explained by atoms (the fundamental constituents) and the space between them, the void. The interactions of objects could be explained in terms of the innate properties of the atoms such as their shape or smoothness. Another theory made four basic interactions of nature fundamental. Those interactions had the properties exemplified by Earth, Air, Fire, and Water. The objects were different because of their combination of these interactions. The third theory held that geometry determined the fundamental nature of both the objects and their interactions.
From the Renaissance to the Nineteenth Century: The Basics
Progress in mathematics, technology, observational techniques, and intellectual rigor over the next 2000 years led to the work of physicists such as Nicolaus Copernicus, René Descartes, Galileo Galilei, Johannes Kepler, and Isaac Newton in the sixteenth and seventeenth centuries. Their investigations resulted in a general theory of force as a description of the interaction between any objects, together with a specific mathematical description of the gravitational force. The concept of mass was invented to characterize both a property of objects and the strength of the gravitational interaction. The search for the fundamental constituents of objects, called atoms or elements, was begun by the next generation of chemists exemplified by John Dalton, Joseph Priestly, and Antoine-Laurent Lavoisier. It took another 200 years for physicists such as Benjamin Franklin, Charles-Augustin Coulomb, Michael Faraday, and James Clerk Maxwell to characterize the electric and magnetic forces well enough to combine them into a unified theory of electromagnetism. Along the way, the concept of charge was invented to characterize both a property of objects and the strength of the electric interaction.
Nineteenth Century: Domination and Puzzles
By the end of the nineteenth century, physicists had developed a powerful theory of the universe. All of the esoteric experiments and theoretical work had paid off handsomely for society. The successful synthesis of classical mechanics had given rise to the first Industrial Revolution that was still in full swing. Civil engineers were building larger and more useful structures, railroads and steamships made the large-scale movement of goods and people possible, personal transportation by bicycle and automobile contributed to a growing sense of freedom, and soon people would be able to fly. Atoms were a theoretical construct of debatable reality, but the atomic theory of elements and their classification in the periodic table gave rise to a thriving chemical industry. Synthetic substances were being constructed and manufactured. The profound effects of the equally successful synthesis of electromagnetism had just initiated the second Industrial Revolution. Messages could be sent across long distances first by telegraph and then by telephone. Cities and even individual homes could be illuminated by electric lighting. Large electrical systems turned mechanical energy into electrical energy to run machines. Even the very abstract concept of electromagnetic waves would find practical application in radio.
There were some anomalies that worried the elementary particle physicists. The theory of classical mechanics was fundamentally inconsistent with electromagnetism. Both theories were very successful, but both could not be correct. The nature of light was also difficult to understand completely. Electromagnetic theory predicted that it must be a wave, yet its behavior corresponded fully to neither a wave nor a particle. Furthermore, when an element was heated, it emitted light of only certain colors; another element would emit different colors. This was useful for identifying elements, but no one had shown that the theory explained this behavior. It was clear that matter contained two types of electric charge, but no stable configuration of positive and negative charges could be constructed that allowed this to happen. At an even more fundamental level, what were charge and mass? Were they related? Could a particle have any amount, or was there a smallest possible unit? It was also known that not all matter was stable. What caused radioactive decay? The Sun was a puzzle: geological time was very long, and no known energy source could keep the Sun burning that long. The structure of the universe was also a mystery. What kept the stars distributed in the sky when gravitational force should be pulling them together? The big question was whether all these anomalies could be explained with a better understanding of mechanics and electromagnetism, or whether they were the result of other interactions in nature that required a new theory. The most radical possibility was that both mechanics and electromagnetism were not correct, but only approximations of nature. If that were the case, the observed anomalies could never be explained without building a new theory from the ground up.
First Third of the Twentieth Century: A New Framework
The resolution of the inconsistency of mechanics and electromagnetism required a redefinition of the concepts of space and time—the special theory of relativity. Special relativity also allowed the conversion of matter into energy, which explained how the Sun could keep shining for billions of years. Going further, Albert Einstein and his colleagues were able to determine the nature of the gravitational force from the geometry of space-time—the general theory of relativity. Geometry seemed to be the key to understanding everything. Meanwhile the experiments of J. J. Thomson and Ernest Rutherford showed that the many different atoms that made up the chemical elements were not fundamental particles. Atoms consisted of a small, dense, positively charged nucleus surrounded by very light negatively charged electrons. The existing framework of physics based on classical mechanics, electromagnetism, and special relativity predicted that an atomic structure of this type could not be stable. The negative electrons would quickly spiral into the positive protons. A new theory, quantum mechanics (invented by Niels Bohr, Max Born, Werner Heisenberg, Erwin Schrödinger, and others), avoided this catastrophe. Even the weirder predictions of the new theories were confirmed by experiments. Rapidly moving particles did live longer than those at rest, light was bent by gravity, and electrons did exhibit interference patterns. Soon experiments showed that the nucleus of an atom was itself made of positively charged protons and electrically neutral neutrons.
During the first third of the twentieth century, elementary particle physics was radically different than it had been just 30 years before. The universe could be understood in terms of a simple and satisfying model. There were two fundamental constituents, the electron and proton. There were two fundamental interactions, the gravitational and electromagnetic. The constituents of this theory were characterized by their mass and their charge. There was one particle of each kind of charge: positive and negative. The neutron was thought to be made of a positive proton and a negative electron because it decayed into a proton and electron with a lifetime of about 15 minutes. The elements were the atoms of all possible configurations of electrons, protons, and neutrons held together by electromagnetic forces obeying quantum mechanics and relativity. This theory gave stable atoms that could emit only certain colors of light when heated. Light was neither a classical wave nor a classical particle. Light was a particle, called a photon, that behaved as predicted by quantum mechanics. All particles really behaved this way, but with light the behavior was obvious because it was massless. The new formulation of elementary particle physics not only explained many of the old anomalies, it predicted new phenomena. The anti-electron (positron) was discovered in cosmic ray interactions as predicted from the symmetry of relativistic quantum mechanics. The discovery of the antiproton was just around the corner. The spectra from star light were shifted to the red, as predicted by relativity, if those stars were moving away from the Earth and each other. The universe was not collapsing due to gravitation, it was expanding. A fundamental theory was not yet constructed, but its formulation was surely based on quantum mechanics and relativity.
There were details that needed to be explained. For example, the proton and electron had the same magnitude of charge, but the proton was about 2,000 times more massive than the electron. Why should the masses of the fundamental constituents be so different and their charges be exactly the same? If a nucleus consisted of protons and neutrons, the positive protons should repel and tear it apart. How does nature prevent this nuclear catastrophe? Some nuclei were observed to decay with lifetimes ranging from seconds to thousands of years, yet others seemed absolutely stable. This range of lifetimes is allowed by quantum mechanics, but careful measurements of the products of those nuclear decays could not account for all the energy or momentum. Some was missing. Could the very successful principles of conservation of energy, conservation of momentum, and special relativity be incorrect? One way out was to invent a new invisible particle, the neutrino, that carried off the missing energy and momentum. Meanwhile another hypothetical particle, the meson, was invented to hold nuclei together. Unlike the neutrino, the meson was charged and interacted strongly with matter. It might be detected in cosmic rays or even produced in the more powerful versions of the exciting new invention, particle accelerators. It was obvious that the rate of expansion of the universe had to be slowing down due to gravitational attraction, but was the mass of the universe enough to pull it back together, or would it keep expanding? Was there a Big Bang that started the expansion? If so, how did the energy released in the Big Bang result in the galaxies, stars, planets, and particles that make up everything?
The mystery of cosmic rays needed to be investigated. These very-high-energy particles interacted in the Earth's atmosphere. What were they, where did they come from, and how did they acquire such large energies? One component of the cosmic rays was unusual, a charged particle that did not interact as strongly as either a proton or an electron. It was not the expected hypothetical meson because its interaction was not strong enough to hold nuclei together. Further investigation showed that this new particle, called a muon, was just like an electron except it was about 200 times more massive. How did it fit into a framework of elementary particle physics?
Despite this long list of questions, elementary particle physicists were optimistic. To tie together the loose ends, all that was needed was a single theory that unified the forces of electromagnetism and gravity within a framework that encompassed quantum mechanics and general relativity. This would be a unified field theory of everything. Within this theory, it was hoped that the symmetry of space and time would explain how the elementary particles fit together. How could a family that included protons, neutrons, electrons, neutrinos, and mesons also include the particles of light, photons, and this new particle that no one wanted, the muon?
Second Third of the Twentieth Century: Satisfaction Then Confusion
The second third of the twentieth century started well. Experiments had found the predicted hypothetical particles: the neutrino, meson, and antiparticles. Electromagnetism, special relativity, and quantum mechanics were unified in the theory of quantum electrodynamics (QED). This theory explained electromagnetic interactions as the exchange of photons between charged particles. In this theory, elementary particles determined the behavior of the universe. Properties of space were determined by pairs of particles and antiparticles that were everywhere. Called virtual particles, they could not be directly observed, but they did affect the behavior of observable particles in a way that could be predicted and measured. Using QED, physicists calculated precisely the properties of the electric and magnetic interactions of particles. There were certain terms that gave infinity, but they just had to be ignored. Over the next half-century it was learned that those terms cancelled out in the theory, so they did not really exist in nature.
Following the lead of QED, proton and neutron interactions, called the strong interaction, were formulated in terms of the exchange of mesons. If fundamental particles determined everything, a quantum theory of gravity would require the exchange of a new hypothetical particle, the graviton. Although theorists struggled without much success to put these parts together, experimenters were using the new particle accelerators to make additional elementary particles whose existence was not predicted. Other experiments showed that fundamental interactions were not as symmetric as expected. As the second third of the twentieth century progressed, elementary particle physics was beginning to look very complicated indeed.
New Particles: Who Needs Them?
Newer and larger particle accelerators gave protons and electrons ever higher energies. When they smashed into the stationary nuclei of ordinary atoms, new particles emerged. These new particles were not just pieces of the original projectiles or target particles because they were often heavier than either. It was as if a bullet were shot into a wall and created a car. This conversion from energy to mass is exactly what special relativity predicts in its famous equation, E = mc². These new particles could be distinguished primarily by their different masses. There were soon too many of these elementary particles to remember, but they could be classified into groups. Some of these new particles were related to the proton and neutron and were called baryons (heavy particles). They were more massive than the proton and survived less than a nanosecond. When they finished decaying, either a proton or neutron was left.
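The bullet-into-a-car analogy can be grounded in a quick calculation. As a hedged sketch (the numerical constants are standard reference values, not figures taken from this article), the rest energy locked in a single proton follows directly from E = mc²:

```python
# Sketch: energy equivalent of mass via E = mc^2.
# The constants below are standard reference values (an assumption of this
# illustration, not data from the article).

c = 2.998e8               # speed of light, m/s
proton_mass = 1.673e-27   # proton mass, kg
eV = 1.602e-19            # one electron volt, in joules

# Rest energy of a single proton.
rest_energy_J = proton_mass * c ** 2
rest_energy_eV = rest_energy_J / eV

print(f"proton rest energy ~ {rest_energy_eV / 1e6:.0f} MeV")
```

When a collision delivers many times this energy, special relativity allows some of it to condense into particles heavier than either the projectile or the target.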
Some of these new baryons were electrically positive, some were negative, and some were neutral. They all had an obscure property called spin. The concept of spin was invented in the first third of the twentieth century to explain the behavior of electrons in atoms. It was called spin because its behavior is like that of a spinning top when described by quantum mechanics. Baryon spin always came in half-integer units such as 1/2 or 3/2, never 0 or 1. Careful measurement of the lifetimes of these new baryons revealed that some decayed more than a billion times more rapidly than others.
The other type of new particle that was produced by accelerators was related to the first meson, now called a pi meson. These new mesons were more massive than the pi meson. Their spin came in integer units such as 0 or 1, never 1/2 or 3/2. Again, careful measurement revealed that some of these mesons decayed more than a billion times more rapidly than others.
For every particle that was discovered, its anti-particle was also found. The particle and antiparticle had the same mass but the opposite charges. Neutrino interactions had finally been detected using the intense neutrino fluxes generated by nuclear reactors. These interactions were so rare that most of the neutrinos would pass through the Earth without being stopped. The behavior of neutrinos required an interaction that was much weaker than electromagnetism but much stronger than gravitation. This interaction was called the weak interaction. It was seen to be the mechanism of the slower decays of baryons and mesons. To make the situation even more complicated, experiments determined that there were two distinct types of neutrinos. They had the same properties such as charge, mass, and spin. One produced electrons, but never muons, when it interacted and was called an electron neutrino. The other produced muons, but never electrons, and was called a muon neutrino.
Since there were hundreds of particles, it was hard to believe that they were all fundamental. They could be classified into three major groups: baryons, mesons, and leptons. The leptons included the electron, the muon, the electron neutrino, and the muon neutrino. The photon did not belong to any of these categories. Although the photon and the neutrino were both shown to be massless, electrically neutral, and stable, they did not seem to be related. The neutrino had spin ½, while the photon had spin 1. The photon easily interacted with matter, while the neutrino rarely did. Even though there were hundreds of elementary particles, there were only four fundamental interactions. Perhaps it was the four interactions that were fundamental, and the particles were all possible results of those interactions. On the other hand, it was possible that there were so many particles because they were made of more basic constituents that were the real elementary particles.
It had been assumed that a theory of fundamental interactions would be symmetric. For example, if space were uniform, it should make no difference if an interaction occurred in San Francisco or Minneapolis. Indeed, that type of symmetry agrees with experimental results. It also seemed reasonable that a reaction would give the same results if all particles were swapped for antiparticles, the directions of all particles were reversed, or the reaction were run backward in time. Contrary to common sense, experiments showed that nature does have preferences in each of those cases. For example, as a neutrino raced away from the interaction that created it, its spin was always in the opposite direction to its velocity. Antineutrino spin, on the other hand, was always in the same direction as its motion. Reversing the direction of the neutrino would mean that its spin would now be in the same direction as its velocity, something that does not occur. Three ways of changing interactions were special because a combination of all of them—switching all particles for antiparticles (called C symmetry), then switching the directions of all particles (called P symmetry), and running the reaction in the opposite direction (called T symmetry)—was mathematically shown to give the same result for all interactions that satisfied a few reasonable criteria. One of these criteria was that the theory included special relativity. Another was that particles could only be affected by interactions at their location. Measurements showed that C and P symmetries were violated only for the weak interaction. One mystery was that a combination of C and P symmetry for an interaction (i.e., swapping particles for antiparticles as well as swapping the direction of particles) gave almost the same results even in the weak interaction. The CP symmetry violation existed but was very small. It seemed reasonable that nature might respect a symmetry or not, but what could cause an interaction to violate a symmetry only a little?
Bigger Is Better
Particle accelerators were the primary tool used to produce these new particles and study their interactions. It was hoped that the extensive investigation of particle properties would uncover an underlying simplicity in this complex situation. More powerful accelerators were built to obtain higher-energy proton or electron projectiles that could produce more and heavier particles. These accelerators grew from the size of a machine that could be built on a table by a few people to machines that would fill a large aircraft hangar, requiring dozens of people to build and operate. Experiments to analyze the properties of particles also got larger. Soon the apparatus would fill several rooms and require ten or twenty people to operate and determine the results. Elementary particle physics had become too large and complex for a single scientist and a few students. It now required a team of scientists and students working together in a collaboration that spanned several different universities.
The Last Third of the Twentieth Century: Consolidation and Puzzles
A More Fundamental Constituent: Quarks
As the second third of the twentieth century drew to a close, it was proposed that all baryons and mesons were made of more basic particles called quarks. The hundreds of known baryons and mesons could be reproduced with only three quarks, called up (u), down (d), and strange (s). A baryon was a combination of three quarks, and a meson was a combination of a quark and an antiquark. The spin of a quark was ½ unit. In units of the proton charge, the up quark had charge +⅔, the down quark -⅓, and the strange quark -⅓. Thus a proton was made of two up quarks and a down quark (uud), a neutron of two down quarks and an up quark (ddu), and a positive pi meson of an up quark and an antidown quark.
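The charge bookkeeping described above can be verified by simple arithmetic. The short sketch below (the helper function is invented here for illustration) totals quark charges in units of the proton charge, with antiquarks flipping sign:

```python
# Check the quark-model charge assignments, in units of the proton charge.
from fractions import Fraction

charge = {"u": Fraction(2, 3), "d": Fraction(-1, 3), "s": Fraction(-1, 3)}

def total(quarks, antiquarks=()):
    """Total electric charge of a combination; antiquarks contribute -charge."""
    return sum(charge[q] for q in quarks) - sum(charge[q] for q in antiquarks)

print(total("uud"))     # proton:  two up + one down       -> 1
print(total("ddu"))     # neutron: two down + one up       -> 0
print(total("u", "d"))  # positive pi meson: up + antidown -> 1
```

Exact fractions are used so the thirds cancel precisely rather than accumulating floating-point error.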
The quark theory also explained a puzzling set of measurements. When a particle has a spin and a charge, it can be affected by a magnetic field. The strength of this interaction is called its magnetic moment. The magnetic moment of an elementary particle depends on its electric charge, spin, and mass. Measuring the magnetic moment of nuclei in the body is the principle behind the medical diagnostic tool of magnetic resonance imaging (MRI). When the proton's magnetic moment was measured, it was too large for an elementary particle. Since the neutron is electrically neutral, it should have no magnetic moment. However, measurements showed that it also had a large magnetic moment with the opposite sign of the proton. If protons and neutrons were made up of quarks, their magnetic moments would be the sum of the magnetic moments of their quarks. Thus, the neutron would have a magnetic moment. When baryon magnetic moments were measured, they all agreed reasonably well, but not perfectly, with the quark model.
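The statement that the nucleon moments are "the sum of the magnetic moments of their quarks" is slightly simplified: in the standard static quark model the quark moments combine with fixed weights set by the spin wavefunctions. As a hedged illustration using those textbook formulas (which this article does not spell out), the predicted neutron-to-proton ratio comes out close to, but not exactly, the measured value:

```python
# Standard static-quark-model estimate (an assumption of this sketch, not
# derived in the article): with equal up and down quark masses, a quark's
# magnetic moment is proportional to its charge, so mu_u = -2 * mu_d.
# The usual spin wavefunctions then give:
#   mu_p = (4*mu_u - mu_d) / 3
#   mu_n = (4*mu_d - mu_u) / 3

mu_d = -1.0           # arbitrary units; only the ratio matters
mu_u = -2.0 * mu_d

mu_p = (4 * mu_u - mu_d) / 3
mu_n = (4 * mu_d - mu_u) / 3

print(f"predicted mu_n / mu_p = {mu_n / mu_p:.3f}")   # -0.667
# The measured ratio is about -0.685: "reasonably well, but not perfectly."
```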
The quarks were bound together to form either baryons or mesons via a force strong enough to overcome the electric repulsion between like charges. This was the real strong force. The force binding protons and neutrons in the nucleus was a remnant of the force between quarks. The force between neutrons and protons was similar to the force between two magnets. In a single magnet, there is a magnetic force between the north and south poles. Since the magnet has both a north and a south pole, it is magnetically neutral. Nevertheless, because the poles are separated, another magnet experiences a reduced amount of that force. Experiments were launched to find free quarks, but none were found. The first concrete manifestation of quarks appeared when high-energy electrons were used to probe inside a proton. The experiment revealed that there were smaller particles inside. It would take some time to develop a theory that explained how the strong force prevented the existence of free quarks.
Tools, Technology, and Discovery
The final third of the twentieth century saw a synthesis of elementary particle physics into what is called the Standard Model. Developing and testing this theory required still larger accelerators producing higher-energy particles and probing smaller distances. These particle accelerators would no longer fit into a building. Their sizes were measured in miles. National laboratories with staffs of hundreds were required to build and operate these machines. Experimental teams grew to include hundreds of physicists from around the world. New and faster communication was needed to exchange data and other information between the accelerator site and the scientists at their home institutions. This need pushed the development of the Internet and motivated researchers at the European Laboratory for Particle Physics (CERN) to invent an easy way for elementary particle physicists to use it. This invention, known as the World Wide Web, is the most recent example of the huge effect that fundamental scientific investigations can have on everyday life. Even particle accelerators became everyday tools used in treating disease and designing integrated circuits. Toward the end of the century, elementary particle experiments were limited more by economics than by desire, ideas, and technology. Building an experiment now costs about the same as a military aircraft. For the cost of a new accelerator, a medium-size city could operate its school system for a few years.
Soon particle accelerators reached higher energies by colliding beams of particles instead of smashing them onto a fixed target of ordinary material. Beams of antiparticles were created and stored so that they could be collided with beams of particles inside the same accelerator. An unexpected new particle was discovered almost simultaneously by teams at two different accelerators: a new electron-positron collider at Stanford Linear Accelerator Center (SLAC), and an older proton accelerator using a fixed target at Brookhaven National Laboratory (BNL). The properties of the new meson could only be explained by the existence of a fourth quark, called the charmed quark (c ). Interestingly, the existence of this quark had been postulated by some theorists to explain why the heavier strange quark did not easily decay to the lighter down quark. As is often true, what does not happen is at least as important as what does.
Consolidation: Particles Are Basic
Quarks were finally fit into a complete theory modeled after quantum electrodynamics. In electrodynamics the strength of the interaction is determined by the charge of the object. Similarly, the strong force would need a property of the quark that acted like a charge and determined the strength of the interaction. The electric interaction could be described with two charges, positive and negative. The strong interaction, on the other hand, needed three "charges." In electrodynamics, combining all the different kinds of charges gives a neutral charge (a plus charge and a minus charge give a neutral charge). Similarly, combining a charge with its anticharge also gives a neutral charge since the anticharge of plus is minus. With the strong force one would have to combine its three charges to get neutral. In an analogy with mixing colored light, where combining the three primary colors gives white or neutral, the term "color" was used for the strong charge. This theory was called quantum chromodynamics (QCD).
Now it was clear why there were baryons and mesons. A system of three quarks, each with a different color, would be neutral and thus not attract any more quarks. These were the baryons. A quark and an antiquark would also be neutral as a combination of a color and an anticolor. These were the mesons. The particles that were exchanged between the quarks to give the strong force were called gluons, since they glued the quarks together. Like the photon, the gluon had no mass or electric charge. However, gluons did have color, which meant that they themselves interacted strongly. This theory predicted that the strong force holding quarks together in a proton would behave differently than the electromagnetic force holding an electron and a proton in an atom. Unlike the electric force, which becomes weaker with increasing distance, the strong force did not diminish as the distance between quarks became greater. It was not possible to isolate a single quark.
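One way to picture the color bookkeeping is to represent the three color charges as arrows at 120° in a plane, with anticolors pointing the opposite way; a combination is "neutral" when the arrows cancel. This toy picture is an illustration invented here (real QCD uses the group SU(3)), but it reproduces which combinations are allowed:

```python
import math

# Toy model (not QCD's actual mathematics): three color charges as unit
# vectors at 120-degree angles; an anticolor is the opposite vector.
def vec(angle_deg):
    a = math.radians(angle_deg)
    return (math.cos(a), math.sin(a))

color = {"r": vec(90), "g": vec(210), "b": vec(330)}
anticolor = {k: (-x, -y) for k, (x, y) in color.items()}

def is_neutral(colors, anticolors=()):
    """A combination is color neutral when its vectors sum to zero."""
    x = sum(color[c][0] for c in colors) + sum(anticolor[c][0] for c in anticolors)
    y = sum(color[c][1] for c in colors) + sum(anticolor[c][1] for c in anticolors)
    return math.hypot(x, y) < 1e-9

print(is_neutral("rgb"))     # baryon: three different colors -> True
print(is_neutral("r", "r"))  # meson: a color + its anticolor -> True
print(is_neutral("rg"))      # two quarks alone               -> False
```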
Another theoretical breakthrough unified quantum electrodynamics (QED) and the weak interactions into an electroweak theory. This theory confidently predicted the mass of the particles exchanged to give the weak force, called the W+, W-, and Z0, at about eighty times the mass of a proton. A new type of accelerator colliding protons and antiprotons was built at CERN to reach this energy, and two experiments were built to surround the collision regions. The particles appeared as predicted, and the electroweak theory appeared to be on firm ground. One consequence of this theory was that it predicted the existence of a new hypothetical particle, the Higgs. The Higgs was responsible for the masses of the W+, W-, and Z0 particles exchanged in weak interactions.
By the end of the twentieth century, the new accelerators in the United States had unearthed two more quarks, the bottom (b) and the top (t), at Fermi National Accelerator Laboratory (Fermilab), and two new leptons, the tau (τ) at SLAC and the tau neutrino at Fermilab. The Standard Model was almost complete. There were six quarks and six leptons, each arranged in three families of two. Each quark family consisted of a quark with electric charge +⅔ and one with charge -⅓: (u, d), (c, s), (t, b). Each lepton family consisted of a lepton with charge -1 and one with charge 0: (e, νe), (μ, νμ), (τ, ντ). Even the ⅓ charges of the quarks no longer appeared so odd. If the proton charge were redefined as 3 units of charge instead of 1, the electric charges of the fundamental particles would be 0, 1, 2, and 3.
In the Standard Model the photon was responsible for the electromagnetic interaction, the W+, W-, Z0 for the weak interaction, and the gluons for the strong interaction. The W+, W-, Z0 had been produced by particle accelerators. Their properties
were measured to be in agreement with electroweak theory. Precision measurements of Z0 decay showed that there were only three massless (or very light) neutrinos, implying that all the families of Standard Model particles had been found. According to quantum chromodynamics (QCD), gluons, like quarks, could not be produced in an isolated state, but their effects in high-energy interactions were distinctive, and those patterns were observed. Only two particles in the Standard Model had not yet been discovered: the Higgs required by electroweak unification and the graviton needed for the still unformulated quantum theory of gravity. This was a great simplification over the hundreds of baryons and mesons that were previously called elementary particles. However, many elementary particle physicists thought that there were still too many particles and too many interactions for either to be truly fundamental.
The obvious similarity of having three quark families and three lepton families made it seem natural to place quarks and leptons into a single theory that would unify the electroweak and strong interactions. These grand unified theories (GUTs) were constructed, and they predicted that quarks would decay into leptons. This meant that protons could not be stable but must decay into leptons and mesons. The lifetime of such decays would have to be long, since protons last long enough to form stars and populate a universe about 10 billion (10¹⁰) years old. Unification of the strong and the electroweak interaction was supported by results from particle accelerators reaching higher and higher energies. The higher the energy of the interaction, the closer the interacting particles approached each other. Experiments showed that the strengths of the strong and the electroweak forces changed as the particles came closer, becoming more similar with increasing energy. At the energy at which the strengths became the same, the interactions would merge into a single interaction. Using these data, grand unified theories predicted that the lifetime of the proton was about 10³⁰ years. This is very stable on the scale of the universe but within reach of experiment. Very sensitive experiments were built kilometers underground to shield them from cosmic rays. In these experiments detectors watched thousands of tons of material for several years. No proton decays were found, and it was determined that the lifetime of the proton was at least 100 times longer than the most straightforward grand unified theories predicted.
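Why watching thousands of tons of material can test a lifetime of 10³⁰ years follows from simple counting: with enough protons in view, even an enormously long lifetime yields a measurable decay rate. The sketch below uses illustrative round numbers (a one-kiloton water detector and the 10³⁰-year GUT lifetime quoted above), not the parameters of any actual experiment.

```python
# Back-of-the-envelope rate of proton decays in a water detector,
# assuming a proton lifetime of ~1e30 years (the GUT-scale prediction).
AVOGADRO = 6.022e23
protons_per_gram_water = AVOGADRO * 10 / 18  # H2O: 10 protons per 18 g/mol
mass_g = 1e9                                 # one kiloton of water, in grams
n_protons = protons_per_gram_water * mass_g  # ~3.3e32 protons being watched
lifetime_years = 1e30
decays_per_year = n_protons / lifetime_years
print(f"{decays_per_year:.0f} decays/year")  # roughly a few hundred
```

Seeing no decays at all in such a detector over several years is therefore enough to push the lifetime limit well beyond 10³⁰ years.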
The Beginning of the Twenty-first Century: Puzzle and Promise
The end of the twentieth century found elementary particle physics in a situation similar to that at the end of the nineteenth century. There was a very successful framework of basic constituents and their interactions, the Standard Model. Unfortunately, the framework was incomplete. Just as there was no known way to unify classical mechanics and electromagnetism at the end of the nineteenth century, there was no known way to unify the electroweak, strong, and gravitational theories within the Standard Model. At the end of the nineteenth century, the structure of the periodic table was known but could not be explained. At the end of the twentieth century, the organization of quarks and leptons into three families was not explained. Many questions about the nature of the elementary particles also remained unanswered. What is mass? What causes the quarks and leptons to have such a wide range of masses? Of all the quarks and leptons, what causes only the neutrinos to have zero mass? Why are there three families and not just one? Why not four? Why do quarks and leptons have separate families? What is charge? Why do the quarks and leptons have electric charges of 0, 1, 2, and 3? Why not charge 4? Why don't they all have the same charge? What is spin? Why do the particles that are the constituents of matter, the quarks and leptons, have ½ unit of spin, whereas the particles responsible for the forces have integer spin? Why don't all elementary particles have the same spin? Why do interactions obey some symmetries and not others? Why is CP violation so small? If a symmetry is violated, why not maximum violation? How does nature determine a property that is between zero and maximum?
From the Smallest to the Largest
Elementary particle physics was getting closer to explaining the origin of the universe. It was clear that the universe is expanding. All known space was once very small but, for some unknown reason, exploded. The energy released in that Big Bang eventually became the galaxies, stars, and planets. Photons left over from the Big Bang, the Big Flash, filled space, having cooled to a temperature within 3 degrees of absolute zero. Measurements determined that these photons outnumber the particles of matter by about a billion to one. Elementary particle physics had satisfied the ancient human desire to describe the birth of everything. The universe started with the Big Bang. Space was very small but contained a great deal of energy. Space expanded, and the energy created matter (quarks and leptons) and antimatter (antiquarks and antileptons) that annihilated and turned back into energy. As space continued to expand, the surviving quarks cooled and combined into baryons. This process continued, finally leaving protons and electrons along with a few simple nuclei. These nuclei were gravitationally attracted into large objects. When the objects were large enough, the gravitational force squeezed the nuclei together, igniting the thermonuclear reactions that created stars. The light elements such as hydrogen, deuterium, and helium were made in the early universe and became part of stars. Measurements of the ratios of these light elements agreed with the Standard Model predictions. Nuclear reactions inside stars made more complex nuclei, and eventually some stars exploded as supernovas, spreading the nuclei through space. The gravitational force pulled the complex nuclei together, making new stars and planets. Meanwhile some stars had enough mass to collapse into black holes. These black holes have been detected and seem to sit at the centers of galaxies.
This is a coherent picture, but there are some obvious questions to be asked. When a particle accelerator turns energy into matter, equal amounts of matter and antimatter are made. That should have happened in the Big Bang. Why did the antimatter not annihilate the matter in the very early universe, leaving nothing but energy? Since our existence shows that at least some quarks were left over to form protons and neutrons, what happened to the equal number of antiquarks that would make antiprotons and antineutrons? After decades of searching, there is no evidence of an equivalent amount of antimatter in the universe. To have only matter left over from the furnace of creation that was the early universe, quark and antiquark formation and annihilation reactions must have occurred at different rates. Within the framework of the Standard Model, such a rate difference requires both that baryon number not be strictly conserved, which implies proton decay, and that CP symmetry be violated. CP symmetry violation has been observed in the decay of some mesons. Proton decay, however, has not been observed. Either the proton lifetime is so long that it will take much larger experiments to detect it, or something is wrong with the Standard Model.
Other measurements of the properties of the universe also provide a challenge for elementary particle physics. The amount of matter in space can be determined by measuring the strength of the gravitational force on stars. Since stars are held in galaxies by gravity, measurements of their orbital velocities determine how the matter is distributed. The surprising result of these measurements is that about 90 percent of the mass of the universe is not accounted for by visible objects. Just what is this dark matter? Does it indicate the existence of new fundamental particles that are too massive to be made in current particle accelerators? At least some of it could be due to neutrinos if they had even a very small mass. A more shocking result comes from measurements of the velocities of supernovas at different distances, and thus different times, in the history of the universe. These measurements indicate that the expansion of the universe is not slowing down but speeding up. This is possible if space has a stored-up energy, like a compressed spring. What fundamental interaction allows the fabric of space to store this energy?
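The galaxy-weighing argument above can be sketched with Newtonian mechanics: for a star on a circular orbit, the gravitational pull supplies the centripetal force, so v² = GM/r and the enclosed mass follows from the measured orbital velocity. The velocity and radius below are illustrative round numbers, roughly appropriate for the Sun's orbit in the Milky Way.

```python
# Estimating the mass enclosed within an orbit from the orbital velocity.
# Circular orbit: G * M / r**2 = v**2 / r  =>  M = v**2 * r / G
G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2
v = 220e3      # m/s, roughly the Sun's orbital speed around the galaxy
r = 2.5e20     # m, roughly the Sun's distance from the galactic center
M = v**2 * r / G
print(f"enclosed mass ~ {M:.1e} kg")  # ~1.8e41 kg, about 10^11 solar masses
```

When velocities measured farther out in a galaxy stay high instead of falling off, this same formula implies far more enclosed mass than the visible stars provide, which is the dark matter puzzle.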
Neutrinos Give Hints
Closer to home, the Sun is the star with the most direct impact on humans. Probing its inner workings required a way of looking deep within its core, and neutrinos could provide the means. Because the neutrino interaction is weak, neutrinos produced in the Sun's core can escape and reach the Earth, carrying information about the nuclear reactions that made them. Measuring them, however, demanded very sensitive detectors operated kilometers underground, often in mines or tunnels through mountains, to shield them from the cosmic rays that bombard the surface of the Earth. When the rate of electron neutrinos coming from the Sun was measured, it was much too low. The differences between the predictions from nuclear reactions and the measurements were too big to be explained by uncertainties in the details of the Sun's structure. As technology improved, larger and even more sensitive underground experiments determined that, although there were too few electron-type neutrinos, other types of neutrinos were arriving from the Sun. The total number of neutrinos of all types equaled the predicted number of electron neutrinos.
The solar neutrino puzzle that spanned almost half a century now had an explanation. Electron neutrinos originating in the Sun's nuclear reactions change into a different type of neutrino before they reach Earth. It is as if a dog walking across the yard became a cat by the time it reached the other side. This identity confusion is actually possible in quantum mechanics, because an object's behavior is determined by probability. In quantum mechanics, a particle can be created with a probability of being each of two or more types. Only one of the types will be detected, but which type it is will depend on how the mixture evolves with time. Identity changing requires that the two identities have a small mass difference and the same charge. This strange behavior was originally observed in the decays of mesons that contain strange quarks and later in those that contain bottom quarks.
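The probability that a neutrino has changed identity by the time it is detected is conventionally written with the standard two-flavor oscillation formula, which the text does not state explicitly; this sketch assumes that convention, with the mass-squared difference in eV², the distance traveled in kilometers, and the energy in GeV, and with illustrative parameter values.

```python
import math

def oscillation_probability(theta, dm2_ev2, L_km, E_GeV):
    """Standard two-flavor probability that a neutrino created as one type
    is detected as the other: P = sin^2(2*theta) * sin^2(1.267 * dm2 * L / E)."""
    return math.sin(2 * theta) ** 2 * math.sin(1.267 * dm2_ev2 * L_km / E_GeV) ** 2

# At the production point (L = 0) the neutrino has not yet changed identity.
print(oscillation_probability(math.pi / 4, 2.5e-3, 0.0, 1.0))  # 0.0
```

The formula shows why identity changing requires a mass difference: if dm2_ev2 were zero, the probability would be zero at every distance.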
Meanwhile another neutrino anomaly surfaced. When the protons and nuclei that make up the cosmic rays strike the Earth's atmosphere, they make mesons, baryons, and leptons in a manner that can be replicated at particle accelerators for all but the highest-energy cosmic rays. The decays of the produced particles have all been measured, and some of the decay products are neutrinos. The resulting ratio of electron neutrinos to muon neutrinos can be calculated, and it is not sensitive to the details of the cosmic ray interactions. When large underground detectors measure this ratio, they obtain a number that differs from the prediction by approximately a factor of 2. Careful measurements, primarily from a large underground water detector in Japan, show that there are the predicted number of electron neutrinos but too few muon neutrinos. This result has been verified by a large underground iron detector in the United States. The cosmic ray neutrino anomaly can also be explained if some of the muon neutrinos produced in the atmosphere change identity to tau neutrinos as they travel toward the detector. Changing identity is only possible if the neutrinos have mass. An accelerator in Japan has verified this result by producing a muon neutrino beam and shooting it toward the same large underground water detector.
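The predicted ratio comes from simple bookkeeping of the dominant decay chain of the mesons made in the atmosphere. A toy count makes it explicit; the chain shown is the standard pion-muon cascade, and for this counting a neutrino and its antineutrino are lumped together as one type.

```python
# Neutrinos from the dominant atmospheric decay chain:
#   pi+ -> mu+ + nu_mu, followed by mu+ -> e+ + nu_e + anti-nu_mu.
# Each chain therefore yields two muon-type neutrinos per electron-type one.
from collections import Counter

chain = [
    "nu_mu",           # from the pion decay
    "nu_e", "nu_mu",   # from the subsequent muon decay
]
counts = Counter(chain)
print(counts["nu_mu"] / counts["nu_e"])  # 2.0 -> the predicted ratio
```

Finding roughly half the expected muon neutrinos, while the electron neutrino count comes out as predicted, is what points to the muon neutrinos changing identity in flight.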
It is now clear that the Standard Model is not the fundamental theory of elementary particles. New theories are waiting in the wings. A theory of supersymmetry would unify the spin-½ constituent particles, the quarks and leptons, with the spin-1 interaction particles. This theory predicts that another set of particles mirrors those of the Standard Model but at a mass too high for existing particle accelerators to produce. Each spin-½ quark and lepton would have a hypothetical spin-0 partner. Each spin-1 interaction particle would also have a hypothetical spin-½ partner. Supersymmetry has the added advantage of reducing the predicted rate of proton decay to a level that experiments would not yet have tested. If supersymmetry is correct, there are many elementary particles waiting to be discovered.
Another alternative to the Standard Model goes back to the fundamental importance of the geometry of space and time. Perhaps the many particles of the Standard Model are not fundamental after all. The particles could be minute but regular vibrations of the fabric of space itself. In this type of theory, the fundamental structure of space and time allows only a certain set of vibrations. This is like the musical notes of a violin string, which are determined by where the ends of the string are held, the density of the string, and how tightly the string is stretched. In this theory the string would be the underlying structure of space, and the notes would be the elementary particles. Such a theory, called superstring theory, has been formulated. Its mathematical structure allows the possibility of unifying the strong, electroweak, and gravitational interactions into a single framework for the first time. However, superstring theory predicts that space has at least ten dimensions. It is hard to visualize more than the usual four dimensions of height, width, depth, and time. The six extra dimensions do provide the flexibility of having neutrinos with mass, supersymmetric partners, and an energy density for space. If these dimensions exist at every point in space, why haven't they been noticed?
The theory is still in its early stages and so far has not made any definitive predictions that could be tested with an experiment. It is possible that the extra dimensions are curled up so tightly that they have little effect on everyday life. Perhaps only very precise experiments will reveal their influence. For example, experiments are underway to determine whether the gravitational force changes behavior at distances of less than 1 mm. It is even possible that the extra dimensions have large effects that were mistakenly attributed to a different cause or were not noticed because no one looked for them. There might, for example, be a violation of CPT symmetry because interactions in the four-dimensional world are influenced by the other dimensions. Examples of such symmetry violations would be different masses or lifetimes for particles and antiparticles. Of course, superstring theory may just be an interesting mathematical diversion with nothing to do with reality. Maybe history will repeat itself, and the Standard Model particles will be found to be made of a smaller number of more fundamental particles that have not yet been found.
On the Horizon
On the experimental front, a new and more powerful colliding beam accelerator is being constructed at CERN to deliver a high enough energy to produce the hypothetical Higgs particle needed for electroweak unification. This will be a crucial test of that theory. Experiments at the lower-energy colliding beam accelerator at Fermilab have already begun to determine if the Higgs is at the lower end of its possible range of masses. These experiments may find the Higgs and begin the task of understanding it by measuring its properties. On the other hand, they may not find it but instead open the door to a new and more fundamental level of matter by finding unexpected particles or interactions. Meanwhile the United States, Europe, and Japan are all building powerful neutrino beams to be shot at huge detectors hundreds of miles away. These experiments will probe the nature of mass by investigating the identity-changing properties of the neutrino and making precision measurements of the mass differences of the neutrino types.
As the twenty-first century begins, elementary particle physics is on the brink of a new understanding of the fundamental workings of nature. Will investigations of CP violation, together with new and larger proton-decay experiments, finally show why the universe is made of matter? Not all the hidden dark matter can be neutrinos. What is the rest of it? It is clear that the Standard Model is at best incomplete. Will superstring theory emerge to explain the masses, families, and charges of the quarks and leptons? Perhaps results from larger and more sensitive underground experiments or larger and more powerful accelerators will send theory in completely different directions. Will there be a new level of particles below those known? Is humanity about to learn that it lives in a universe with more dimensions than imagined? Will the interactions of the Standard Model be unified, or are they only approximations of a different set of fundamental interactions? Elementary particle physics continues its quest to explain everything. In the coming decades, which view of the universe will take center stage: that particles are fundamental, that interactions are fundamental, or that the geometry of space is fundamental? It is even possible that a new and original paradigm will emerge. How will these new ways of viewing the universe affect everyday life? One lesson learned from elementary particle physics is that the universe is both stranger and simpler than imagined. Even its most abstract discoveries tend to find practical application. The search for the fundamental components of the universe is certainly never boring.
Kenneth J. Heller