Probability: History, Interpretation, and Application

It is often said that something is "probably the case" or "probably not the case." The word probable comes from the Latin probabilis, meaning commendable, which itself derives from probare, to prove. Indeed, the English probable and provable have the same etymological origin. The scientific study of probability takes the everyday notions of recommending and approving and gives them strict definitions and systematic analysis, narrowing their focus while enhancing their power to inform. Insight into these matters is essential in advanced technological societies, where experts regularly give technical advice to a public that must then decide whether or not to accept it. This may involve the development of new government policies or actions to be taken by individuals, such as submitting to a new medical treatment.

But there are other complex issues to consider. It is generally understood that probability has something to do with chance, a concept of enduring fascination throughout history. While philosophers explore alternative interpretations of probability that lead to different modes of induction in science, there remains the enigma of the role of chance in the world. Given the theories of quantum physics and evolutionary biology proclaiming a universe of chance, how do these impact the fundamental questions of philosophy that sooner or later confront every thinking person: Who am I? Why am I here? How should I live my life?

Reflecting in search of insight, it is important to distinguish between what is science and what is philosophy, and to differentiate between the speculations of philosophers—traditionally fraught with controversy—and the daily activities of practicing scientists. There is a need to understand the role of probability in science and technology, as well as its relation to the perennial questions of human existence. After a brief sketch of the history of probability, the present entry offers some thoughts on this vast and profound subject, concluding with a discussion of the applications of probability at the start of the twenty-first century.


Highlights of History

This quick survey of the history of probability is presented in two sections, beginning with the evolution of mathematical concepts and then turning to their use in philosophical speculation.


THE RISE OF MATHEMATICAL PROBABILITY. There are earlier records of mathematics applied to games of chance, but the beginning of the theory of probability is generally identified with the 1654 correspondence between the two French mathematicians Blaise Pascal (1623–1662) and Pierre de Fermat (1601–1665) concerning the so-called problem of points in gambling. The question was how to divide the stakes between two players who part before completing the game. To arrive at the solution, Pascal introduced the binomial distribution for p = .5 and found the coefficients by means of the arithmetical triangle, a curious numerical structure now named after him. In 1657 the Dutch mathematician Christiaan Huygens (1629–1695) published his monograph De Ratiociniis in Ludo Aleae (Reasoning on games of chance), the first printed mathematical treatment of games of chance. In these games the assumption of equally likely outcomes, such as the six faces of a balanced die, led to the classical definition of probability. The first major work devoted to probability theory was Ars Conjectandi (The art of conjecturing) by the Swiss mathematician Jakob (Jacques) Bernoulli (1654–1705), published posthumously in 1713. It contained the first form of the law of large numbers.
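
To make Pascal's solution concrete, the following is a minimal sketch in Python (the example numbers are illustrative, not taken from the correspondence) of the division of stakes between two equally skilled players: each player's fair share is the probability of winning the required remaining points, computed from the binomial coefficients of the arithmetical triangle.

```python
from math import comb

def stake_shares(a_needs, b_needs):
    """Problem of points for two equally skilled players (p = .5).
    Player A still needs a_needs points, player B needs b_needs;
    at most a_needs + b_needs - 1 further rounds settle the game."""
    n = a_needs + b_needs - 1
    # A wins the stakes by taking at least a_needs of the n remaining rounds.
    p_a = sum(comb(n, k) for k in range(a_needs, n + 1)) / 2 ** n
    return p_a, 1 - p_a

# Illustration: A needs 2 more points, B needs 3; A's fair share is 11/16.
print(stake_shares(2, 3))  # (0.6875, 0.3125)
```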

About this time in England, attention focused on the by then established systematic recording of births and deaths and related practical issues of insurance and annuities. Relative frequency was applied to mortality data by the merchant John Graunt (1620–1674), whose Natural and Political Observations ... Made upon the Bills of Mortality (1662) marked the beginning of actuarial science. The stability of observed ratios suggested the second, the statistical or frequentist, definition of probability. William Petty (1623–1687), physician and mathematician, coined the term political arithmetic in his quantitative analysis of social phenomena that would become the foundation of modern economics. Also working in England, the French mathematician Abraham de Moivre (1667–1754) wrote The Doctrine of Chances; or, A Method of Calculating the Probabilities of Events in Play (1718, 1738, 1756), another landmark in the history of probability. The second and third editions of the book include his discovery of the normal curve as the limit of the binomial distribution.

Important advances were made in the first part of the nineteenth century. The normal distribution, applied to measurement variations in astronomy, was studied by the French mathematician Pierre-Simon de Laplace (1749–1827), author of the first comprehensive work on probability, Théorie analytique des probabilités (1812; Analytic theory of probability). Laplace discovered and proved the earliest general form of the central limit theorem. The normal curve is also called the Gaussian distribution, after the German mathematician Carl Friedrich Gauss (1777–1855), who developed it as the law of errors of observations, in conjunction with the principle of least squares, in which it plays a key role. Least squares, a method for combining observations to estimate parameters by minimizing the squared deviations of the observations from expected values involving the parameters, became a basic tool in astronomy, geodesy, and a wide range of other areas.
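
As a brief illustration of the method of least squares described above, the following Python sketch (using NumPy, with simulated data standing in for astronomical observations) estimates the two parameters of a straight line by minimizing the squared deviations of the observations from their expected values.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated observations: a linear law observed with small measurement errors.
x = np.linspace(0.0, 10.0, 20)
y = 3.0 + 0.5 * x + rng.normal(scale=0.2, size=x.size)

# Least squares: choose (intercept, slope) minimizing the sum of squared
# deviations between the observations y and the expected values a + b*x.
A = np.column_stack([np.ones_like(x), x])
coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)
print("intercept, slope:", coeffs)  # close to the true values (3.0, 0.5)
```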

Probability came to be used to analyze variation in its own right, not merely as error to be eliminated, in the social sciences as well as in physics and biology. The intense study of heredity triggered by Charles Darwin's (1809–1882) theory of evolution, spearheaded by his cousin Francis Galton (1822–1911), would lead to the new field of mathematical statistics around the turn of the twentieth century. The axiomatic foundation of the modern theory of probability was the work of the Russian mathematician Andrei N. Kolmogorov (1903–1987), published in 1933.

PROBABILITY AND PHILOSOPHY. The notion of probability dates back to antiquity, and beyond games of chance to questions of philosophy, of permanence and change, of truth and uncertainty, of knowledge and belief. The revival of interest in the thought of the ancients during the Renaissance brought about an interplay of intellectual currents with scientific discoveries that energized a renewed search for explanation and meaning. The role of chance was at the core of developments from the start.

Pascal posed a challenge to skeptics of his day in the famous "Wager" of his Pensées, published posthumously in 1670, in which the question of God's existence was to be answered as if by the toss of a coin at the end of life. Presenting arguments for betting that God exists, Pascal developed basic elements of decision theory concerning courses of action in the face of uncertainty.

The work of Isaac Newton (1642–1727), with his universal law of gravitation and his synthesis of cause and effect explained by laws of physics in a fully determined universe, launched the era of modern science. Since then, reports of scientific advances have been at the forefront of public consciousness, dominant factors to be integrated into any cohesive worldview. Newton's system involved his concept of an omnipresent deity who maintains the motion of heavenly bodies, and this led to a lively natural theology (part of philosophy, as it does not have recourse to Revelation) in the eighteenth century. In contrast to the observed regularity of planetary orbits there was variability in human affairs, but here too the stable patterns of long-run frequencies seemed to imply design and purpose. The constant excess of males among the newborn was a recurring example.

In 1710 John Arbuthnot (1667–1735), physician and scholar, published an influential essay titled "An Argument for Divine Providence, Taken from the Constant Regularity Observed in the Births of Both Sexes." He found that in the eighty-two consecutive years on record more boys than girls had been born in London. He reasoned that because boys were at greater risk of dying young as a result of their duties in the world, there was a need in a monogamous society for more boys to be born, and this was wisely arranged by Providence. His article contained the earliest example of a test of a statistical hypothesis, concluding that the observed result would be highly unlikely if in fact the true probability of a boy was one-half.
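
Arbuthnot's reasoning can be restated as a small calculation, sketched here in Python: under the hypothesis of blind chance (a male excess or a female excess being equally likely each year, ties ignored), the probability of a male excess in all eighty-two recorded years is (1/2) raised to the eighty-second power.

```python
from fractions import Fraction

# Probability of boys outnumbering girls in all 82 years if each year
# were an even chance -- the basis of Arbuthnot's rejection of blind chance.
p = Fraction(1, 2) ** 82
print(p)         # 1/4835703278458516698824704
print(float(p))  # roughly 2e-25
```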

De Moivre aimed to show that probability had more consequential objects than the frivolous pastime of gambling, and in the second and third editions of The Doctrine of Chances argued for its serious mission in proving the existence of God. While chance produces irregularities, he wrote, it is evident that these are governed by laws according to which events happen, and the laws serve to preserve the order of the universe. We are thus led "to the acknowledgment of the great MAKER and GOVERNOUR of all; Himself all-wise, all-powerful, and good" (1756, p. 252).


One of the most famous documents in the history of science is "An Essay towards Solving a Problem in the Doctrine of Chances," by Thomas Bayes (1702–1761), an English clergyman also interested in probability. It is the first expression in precise, quantitative terms of one of the chief modes of inductive inference. The essay contains what is now called Bayes's theorem, which is central to the approaches known as Bayesian inference. The manuscript was published posthumously in 1763, with an introduction by the Reverend Richard Price (1723–1791). In delineating the importance of Bayes's achievement, Price suggested that his method of using the probabilities of observed events to compare the plausibility of hypotheses that could explain them is a stronger argument for an intelligent cause than the appeal to laws obtained from chance events proposed by de Moivre. More generally, as asserted by Price and explored by modern scholarship, Bayes's method in a sense evades the problem of direct induction posed by the Scottish philosopher David Hume (1711–1776), who denied that inductive inference could be rationally justified. A Bayesian does not claim to justify any set of beliefs as uniquely rational. But for someone whose belief structure satisfies the axioms of probability, an earlier personal probability (degree of belief) can be updated by new evidence in a coherent, reasonable manner. Bayes's method, the argument goes, provides a uniquely rational way to learn from experience.
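
The following is a minimal sketch, in Python with purely hypothetical numbers, of the mode of inference the essay introduced: the probability of the observed evidence under each competing hypothesis is combined with a prior probability for that hypothesis, and Bayes's theorem yields the posterior probabilities.

```python
def bayes_posterior(priors, likelihoods):
    """Bayes's theorem for competing hypotheses:
    posterior_i is proportional to prior_i * P(evidence | hypothesis_i)."""
    joint = [p * l for p, l in zip(priors, likelihoods)]
    total = sum(joint)
    return [j / total for j in joint]

# Hypothetical case: two hypotheses judged equally plausible beforehand;
# the observed evidence is four times as probable under the second.
print(bayes_posterior([0.5, 0.5], [0.1, 0.4]))  # [0.2, 0.8]
```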

In Germany, using results from England as well as his own extensive collection of data, Johann Peter Süssmilch (1707–1767), military chaplain and mathematician, wrote the first analytic theory of population, Die göttliche Ordnung in den Veränderungen des menschlichen Geschlechts, aus der Geburt, dem Tode, und der Fortpflanzung desselben erwiesen (1741; The divine order in the fluctuations of the human race, shown by the births, deaths, and propagation of the same). Through his pioneering work in demography Süssmilch sought to discern in the detected patterns of population trends, in this natural order, the eternal laws of God.

As the use of probability expanded in the nineteenth century, so did philosophical concern with the problem of chance in a deterministic universe, with questions of causality, proof, natural law, free will. Speculation entered a new phase with the theory of evolution, when chance assumed a dominant role, to be enhanced by quantum theory in the early twentieth century. The debate continues with renewed vigor, in the light of new developments in cosmology, evolutionary biology, and other related disciplines.


Interpretation: A Commentary

The following discussion of various aspects of probability does not aim to be comprehensive or exhaustive. Rather, it offers some comments to stimulate thought and further exploration of this deep, complex subject.


OBJECTIVE VERSUS SUBJECTIVE PROBABILITY. Probability has a dual nature, recognized since its emergence in the seventeenth century. It may be aleatory (frequentist, from the Latin alea, a game of dice) or epistemic (pertaining to knowledge), also called objective or subjective probability. Objective probability takes a sort of Platonic view, assuming the existence of idealized states, represented by a mathematical model and estimated by observed relative frequency. Subjective probability is degree of belief, and it involves personal judgment.

Both interpretations are common in everyday use. The probability that a newborn child is a boy, which is .5 according to Mendelian genetics and .51 as observed relative frequency, provides two examples of objective or frequency-type probability. The subjective or belief-type may refer to any statements expressing some belief or opinion. It can be illustrated by the high-profile Terri Schiavo case of early 2005. A severely brain-damaged woman, on artificial nutrition and hydration for years, had her feeding tube removed by court order at the request of her husband but against the strong objections of her parents. There were many conflicting reports in the media concerning important aspects of the case, so that no one not directly involved could possibly know the facts for sure. In the absence of a living will, a key factor was the husband's claim, challenged by others, that prior to being stricken fifteen years earlier the young woman had clearly stated her wishes not to be kept alive under these circumstances. The diverse opinions expressed in public and private debates were examples of subjective probability, not determined by objective information, but reflecting the division in American society on a host of related issues.

The precise interpretation of probability in science has been of special concern to philosophers. The theory of subjective probability is a theory of the coherence of a body of opinion, guided by its conformity to the axioms of probability, which both types must obey, with probability expressed as a number between zero and one. There are several approaches to subjective probability, explained and illustrated with simple examples in Ian Hacking's 2001 textbook An Introduction to Probability and Inductive Logic.


The subjective probability of a proposition may be defined as the value to the user of a unit benefit contingent on the truth of the proposition. The concept of personal value or utility is central to decision theory in economics and the behavioral sciences. But in general statistical inference the two interpretations of probability are in direct opposition, with no resolution likely in the foreseeable future. The subjective approach, usually called Bayesian, involves combining one's prior probability, based on a qualitative assessment of the situation, with new information to obtain the posterior probability. A key controversial issue is the subjective choice of the prior probability. Critics of objective probability counter that relative frequency itself involves subjective choices, because it depends on the reference class used as denominator, and they ask what to do in situations where long-run repeated experimentation under identical conditions is not possible, even in principle. And so it goes. But any system of logic has its intrinsic limitations. There are no right or wrong answers to the debates of philosophers; probability and chance are among the primitive concepts that remain open to analysis, like knowledge, cause, and truth.
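
The prior-to-posterior step, and its sensitivity to the choice of prior, can be made concrete with a standard conjugate example, sketched below in Python with hypothetical data: a Beta prior for an unknown proportion, combined with binomial observations, yields a Beta posterior, and different priors lead to different conclusions from the same data.

```python
def beta_binomial_update(alpha, beta, successes, failures):
    """Beta(alpha, beta) prior plus binomial data gives a Beta posterior;
    returns the updated parameters and the posterior mean of the proportion."""
    a, b = alpha + successes, beta + failures
    return a, b, a / (a + b)

data = (7, 3)  # hypothetical observations: 7 successes, 3 failures
for prior in [(1, 1), (10, 10)]:  # a flat prior versus a prior concentrated at .5
    print(prior, beta_binomial_update(*prior, *data))
# Flat prior: posterior mean 8/12 = 0.667; concentrated prior: 17/30 = 0.567.
```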

Some points to remember: Unless otherwise indicated in the title of a published report, the "default" method of analysis is based on objective probability and the classical (Neyman-Pearson) theory of statistical inference. From the viewpoint of communicating scientific results to the public, often in media sound bites, objective probability seems to be the more suitable method. In any case, under many conditions the results are similar. But discoveries are not made by formula. Creative scientists know what is happening in their own field and entertain ideas in the context of their own views. Out of this may emerge something new after years of search and many blind alleys. Ethical concerns pertain to violation of the codes of research conduct and false reporting of results, whatever the claimed method of confirmation.

CHANCE AT THE HEART OF REALITY? From the great Aristotelian synthesis of antiquity to the late nineteenth century, physical determinism with strict causality was a basic assumption of science and philosophy. Chance was taken as a measure of ignorance, a lack of knowledge of the complex interaction of unknown causes. This changed with the theory of evolution, involving random mutation and natural selection, and was followed in the early twentieth century by the discovery of quantum mechanics and indeterminism at the fundamental level. According to Heisenberg's uncertainty principle, the position and momentum of an elementary particle cannot both be determined with arbitrary precision; they can be described only in terms of probabilities. These theories endow chance with a distinct identity, as an explanatory principle of effects without a cause.

Is chance then an intrinsic part of nature, a feature of reality? That was the Copenhagen interpretation of quantum theory, accepted by the majority of physicists, although acceptance never became unanimous. Albert Einstein expressed his opposition in the famous statement: "God does not play dice with the universe." An alternative view is to differentiate between interaction in nature and the level of measurability in physics (Jaki 1986). But the acceptance of chance in quantum mechanics does not imply a lawless universe; the probabilities of the different states can be precisely measured, and on a macroscopic scale nature appears to follow deterministic laws. There is also the concept of contingent order: Events that may be random still obey a larger law; an example would be random mutation in biology, operating within the structure of Mendelian genetics.

Again, some points to consider: Training in physics at the doctoral level is required to appreciate the implications of quantum mechanics. The subject has no intuitive meaning for nonspecialists, and there is continued disagreement among physicists. Speculation on the nature of reality belongs to philosophy, even if done by physicists. Intrinsic to the intellectual motivation of working scientists is a philosophy of realism, the belief in an external world of order that is accessible to human inquiry. In this context chance remains a measure of uncertainty, and that is the relevant interpretation for the applied sciences and technology.

OBSERVING RANDOMNESS. The word random cannot be defined precisely; one can say only what it is not. In textbooks of probability and statistics it is generally an undefined term, like point in geometry. The random numbers generated by computer and used in many research applications are in fact produced by given rules and as such are not random; pseudorandom is the proper technical term. There is much ongoing research on the concept of randomness. The simplest common example of a random experiment, the flipping of a coin, has been analyzed in terms of Newton's laws of physics, with upward velocity and rate of spin of the coin determining the outcome. Similar analyses hold for dice and roulette wheels.
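
A simple way to see the point about pseudorandomness is a linear congruential generator, one of the classic textbook constructions, sketched here in Python with commonly used constants: the sequence is produced by a fixed arithmetic rule and is therefore fully determined by its seed, yet it passes many statistical tests of randomness.

```python
def lcg(seed, a=1664525, c=1013904223, m=2**32):
    """Linear congruential generator: x_{n+1} = (a * x_n + c) mod m.
    Entirely rule-governed, hence pseudorandom rather than random."""
    x = seed
    while True:
        x = (a * x + c) % m
        yield x / m  # scaled to the interval [0, 1)

gen = lcg(seed=42)
print([round(next(gen), 3) for _ in range(5)])
# The same seed always reproduces exactly the same "random-looking" sequence.
```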

Chaos theory has shown that very little complexity in a deterministic system is needed to bring about highly complex phenomena, often unpredictable, "chaotic" behavior. Almost imperceptible differences in the initial conditions can result in widely diverging outcomes. First noted in a computer simulation of a weather system, this has become known as the "butterfly effect," the image being that of a butterfly flapping its wings and causing a hurricane somewhere across the globe. The phenomenon has been observed in a variety of fields, and the theory being developed has applications in a wide range of disciplines, including hydrodynamics, biology, physiology, psychology, economics, ecology, and engineering. The important observation is that many phenomena adequately covered by deterministic theories of classical physics prove to be chaotic, suggesting that there are real limitations on what can be learned about physical systems.
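
A standard illustration of this sensitivity, sketched below in Python, is the logistic map, a one-line deterministic rule whose trajectories from two almost identical starting values soon bear no resemblance to each other.

```python
def logistic_trajectory(x0, r=4.0, steps=50):
    """Iterate the deterministic logistic map x -> r * x * (1 - x)."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

a = logistic_trajectory(0.2)
b = logistic_trajectory(0.2000001)  # an almost imperceptible difference at the start
for t in (0, 10, 30, 50):
    print(t, round(a[t], 6), round(b[t], 6))
# After roughly thirty steps the two trajectories have completely diverged.
```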

Clearly here scientific determinism does not imply epistemological determinism (meaning that results can be established with certainty). The phenomena appear random and need to be addressed in terms of probabilities. These discoveries should teach caution in expectations for the claimed effects of various aggressively promoted economic and social policies for giant systems such as the United States and other nations.


FREE WILL AND THE LAWS OF PROBABILITY. As a simple example, consider a local telephone calling region where the length of a call does not affect its cost. Residents can call anyone in the region they wish, at any time they wish, and talk as long as they wish, for one unit charge per call. Then the probability distribution of call durations for any given time period will be an exponential distribution. The number of calls arriving at an exchange during a fixed time interval will follow a Poisson distribution, with higher means for busy periods of telephone traffic. These precisely defined laws make possible the efficient design of communications systems. From the engineering viewpoint the calls, initiated by the free will of large numbers of individuals, are random, following known probability laws with parameters that are estimated from observations.
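
A short simulation, sketched in Python with NumPy and with hypothetical traffic figures, illustrates the two laws named here: exponentially distributed call durations and Poisson-distributed call counts per interval, the kind of model used in designing communications systems.

```python
import numpy as np

rng = np.random.default_rng(1)
mean_duration_min = 4.0   # hypothetical average call length in minutes
calls_per_hour = 120.0    # hypothetical arrival rate at the exchange

# Exponential law for the durations of individual calls.
durations = rng.exponential(scale=mean_duration_min, size=100_000)

# Poisson law for the number of calls arriving in each one-hour interval.
hourly_counts = rng.poisson(lam=calls_per_hour, size=100_000)

print(round(durations.mean(), 2))       # close to 4.0
print(round(hourly_counts.mean(), 1),   # mean and variance both close to 120,
      round(hourly_counts.var(), 1))    # a signature of the Poisson law
```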


PURPOSE IN THE UNIVERSE? The evolution controversy is often presented to the public as the conflict between two diametrically opposed fundamentalist views: Strict Darwinism, according to which chance variation and natural selection are sufficient to explain the origin of all life on Earth, and so-called creationism, which accepts a literal interpretation of the Book of Genesis of the Old Testament. In fact the situation is more complex.

Some evolutionary biologists hold that further structures beyond strict Darwinism are needed to account for the complexity of living systems. They are naturalists, whose explorations use the latest scientific advances to seek better explanations in the natural order. Many mainstream believers accept the fact of evolution, and those interested in science also question the mechanism of evolution. They are creationists in the sense that they believe in Creation, but they seek to learn what science has to say about how the world came into being. They believe that there is purpose in the universe, and see no problem with considering intelligent design as one of the explanatory hypotheses. Because the aim is to understand all of life and human experience, they do not think it rational to exclude any viable hypotheses.

Working along these lines are the American researchers Michael J. Behe, William A. Dembski, and Stephen C. Meyer, who argue that the complex specified information found in the universe, including irreducibly complex biochemical systems, cannot be the product of chance mechanisms and thus provides evidence of intelligent design (Behe, Dembski, and Meyer 2000). In cosmology the big bang theory of the origin of the universe and the anthropic principle concerning conditions necessary for the existence of life may be used in speculations of natural theology. Any emerging results that show consistency of science with the tenets of belief should be discussed openly, along with everything else. Submit it all to the test of time.


THE RELEVANCE OF PASCAL. The work of Pascal, of enduring interest for 300 years, was the subject of books by two prominent thinkers of the twentieth century—the Hungarian mathematician Alfréd Rényi (1921–1970) and the Italian-German theologian and philosopher of religion Romano Guardini (1885–1968), who held the philosophy chair "Christliche Weltanschauung" (Christian worldview) at the University of Munich.

Letters on Probability (Rényi 1972) is a series of four fictitious letters by Pascal to Fermat, assumed to be part of the lost correspondence between the two mathematicians. Addressed to the general reader, it is a witty and charming exploration of the notion of chance and probability, in the cultural context of the seventeenth century that shows the timelessness of the subject. In the last letter Pascal reports on a dialogue he had with a friend concerning the merits of objective and subjective probability. They discussed De rerum natura (On the nature of things), by the Roman poet-philosopher Lucretius (fl. first century b.c.e.), in which he described the Greek atomistic philosophy of Democritus (c. 460–c. 370 b.c.e.) and Epicurus (341–270 b.c.e.); they wondered what the ancients might have meant by chance and random events. In its images of whirling atoms the poem conveys a striking picture of Brownian motion. Pascal is here an advocate of objective probability, reflecting the views of the author.

Pascal for Our Time (Guardini 1966) is a biography placing an immensely gifted believer at the point in the history of ideas when the scientific consciousness of the modern age had fully emerged, but that of the previous era had not yet faded. Pascal is presented as a human being who—simultaneously endowed with keen insight in science, psychology, and philosophy—seeks with reflection to justify his existence at every moment. Guardini shows Pascal's relevance at the intellectual and cultural watershed reached by the twentieth century.

For Pascal thinking was the basis of morality, and a reasoned search the way to proceed to find meaning. Human longing far surpasses what this life has to offer: "Man infinitely transcends man" (Pascal 1995, #131; the numbering refers to the fragments in this edition of the Pensées). A totally committed search is the only option of reason. But the search is feebleminded if it stops before reaching the absolute limits of reason: "Reason's last step is the recognition that there are an infinite number of things which are beyond it. It is merely feeble if it does not go as far as to realize that" (#188). Faith offers more knowledge, but it has to be consistent with the evidence of sense experience: "Faith certainly tells us what the senses do not, but not the contrary of what they see; it is above, not against them" (#185).

The ultimate limits of human reason, perceived by Pascal, were established in the twentieth century with Kurt Gödel's incompleteness theorem in mathematics. The search Pascal so strongly urged was taken up by the natural theologians, among others, and it continues into the twenty-first century. And for thoughtful believers there still cannot be a conflict between faith and science.


THE ETHICS OF EVIDENCE. The comments shared above fit into a proposed framework for dealing with uncertainty, the Ethics of Evidence (Miké 2000). The Ethics of Evidence calls for developing and using the best evidence for decision-making in human affairs, while recognizing that there will always be uncertainty—scientific as well as existential uncertainty. It calls for synthesis of the findings of all relevant fields, and taking personal responsibility for committed action. Philosophical questions such as the nature of reality and purpose in the universe cannot be decided by the latest findings of a particular science. The French philosopher Étienne Gilson (1884–1978) argued in his book The Unity of Philosophical Experience (1999 [1937]) that this age has been going through the last phase of the current cycle of twenty-five centuries of Western philosophy. A new philosophical synthesis is needed, with a first principle that integrates the accumulating insights of science and other disciplines.


Application of Probability

Since the 1960s much historical scholarship has focused on what Gerd Gigerenzer and colleagues (1989) aptly described as The Empire of Chance: How Probability Changed Science and Everyday Life. There are encyclopedias devoted to the subject, with probability as an integral component of the field of statistics. Probability is the basis of theories of sampling, estimation of parameters, hypothesis testing, and other modes of inference, in a multitude of complex designs for the simultaneous study of variables of interest.

Reminiscent of the beginnings with games of chance, the Hungarian mathematician John von Neumann (1903–1957) published a seminal essay in 1928 on the theory of games of strategy, opening up entirely new paths for mathematical economics. He collaborated with the Austrian economist Oskar Morgenstern (1902–1977), by then both in the United States, on their classic work Theory of Games and Economic Behavior (1944). The theory of games provides models for economic and social phenomena, including political and military contexts, in which participants strive for their own advantage but do not control or know the probability distribution of all the variables on which the outcome of their acts depends. An important extension is noncooperative game theory, which excludes binding agreements and is based on the concept of Nash equilibrium, used to make predictions about the outcome of strategic interaction. It is named after its originator, the American mathematician John F. Nash (b. 1928). Game theory is inference in the form of decision-making.
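
The idea of a Nash equilibrium can be sketched in a few lines of Python for an illustrative two-player, two-strategy game (the payoff numbers below are the standard prisoner's dilemma, used only as an example): a pair of strategies is an equilibrium if neither player can gain by deviating unilaterally.

```python
# Payoffs (row player, column player); strategies: 0 = cooperate, 1 = defect.
payoffs = {
    (0, 0): (3, 3), (0, 1): (0, 5),
    (1, 0): (5, 0), (1, 1): (1, 1),
}

def is_nash(r, c):
    """A pure-strategy Nash equilibrium: no player can do better by
    switching strategies while the other player's choice stays fixed."""
    row_ok = all(payoffs[(r, c)][0] >= payoffs[(r2, c)][0] for r2 in (0, 1))
    col_ok = all(payoffs[(r, c)][1] >= payoffs[(r, c2)][1] for c2 in (0, 1))
    return row_ok and col_ok

print([(r, c) for r in (0, 1) for c in (0, 1) if is_nash(r, c)])  # [(1, 1)]
```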

More generally, there are stochastic processes, in what has been called the probability theory of movement: systems that pass through a succession of states, usually over time, as distinct from settings in which a constant mechanism generates data that are assumed to be independent. Areas of application include epidemic theory, the study of complex networks, finance theory, genetic epidemiology, hydrology, and the foundations of quantum theory.
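
The simplest example of such a process is a Markov chain, in which the next state depends probabilistically on the current one rather than being drawn independently; the following Python sketch uses hypothetical transition probabilities for a two-state system.

```python
import random

# Hypothetical two-state chain, e.g. daily weather described as "dry" or "wet".
transition = {
    "dry": {"dry": 0.9, "wet": 0.1},
    "wet": {"dry": 0.5, "wet": 0.5},
}

def simulate(start, steps, seed=0):
    """Generate a trajectory in which each state depends only on its predecessor."""
    rng = random.Random(seed)
    state, path = start, [start]
    for _ in range(steps):
        state = "dry" if rng.random() < transition[state]["dry"] else "wet"
        path.append(state)
    return path

print(simulate("dry", 10))
```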

Ethical aspects of probability pertain to knowing and using the proper techniques to clarify and help resolve problems in science and technology, with close attention to remaining uncertainties. If mechanisms of action are fully understood, as in many engineering systems, careful design and built-in redundancies will result in reliable performance within specified probabilities. But in most areas of interest, such as medical, social, and economic phenomena, the number of variables is large and the mechanisms often unknown or at best poorly understood. Thus only a selection of potentially relevant factors can be studied in any one tentative model, amid vast uncertainties. Misuse of such limited results makes the public vulnerable to manipulation by state, market, and a multitude of interest groups. It seems impossible to overstate the importance of awareness and education concerning these issues.


VALERIE MIKÉ

SEE ALSO Pascal, Blaise; Risk: Overview; Statistics; Uncertainty.

BIBLIOGRAPHY

Arbuthnot, John. (1710). "An Argument for Divine Providence, Taken from the Constant Regularity Observed in the Births of Both Sexes." Philosophical Transactions of the Royal Society of London 27: 186–190. Reprinted in Studies in the History of Statistics and Probability, Vol. 2., ed. Maurice G. Kendall and R. L. Plackett. New York: Macmillan, 1977.

Bayes, Thomas. (1763). "An Essay towards Solving a Problem in the Doctrine of Chances." Philosophical Transactions of the Royal Society of London 53: 370–418. Reprinted in Studies in the History of Statistics and Probability, Vol. 1, ed. Egon S. Pearson and Maurice G. Kendall. London: Griffin, 1970.

Behe, Michael J.; William A. Dembski; and Stephen C. Meyer. (2000). Science and Evidence for Design in the Universe. San Francisco: Ignatius Press. Authors are trained in biochemistry, mathematics, and philosophy.

Bernoulli, Jacques. (1713). Ars Conjectandi [The art of conjecturing]. Basel: Impensis Thurnisiorum.

David, Florence N. (1962). Games, Gods, and Gambling: The Origins and History of Probability and Statistical Ideas from the Earliest Times to the Newtonian Era. London: Charles Griffin. Illustrated story of the prehistory of probability and its early development. Assessment of Pascal's contribution questioned by other scholars, such as Rényi (1972).

Edwards, A. W. F. (2002). Pascal's Arithmetical Triangle: The Story of a Mathematical Idea. Baltimore: Johns Hopkins University Press.

Eisenhart, Churchill, and Allan Birnbaum. (1967). "Tercentennials of Arbuthnot and de Moivre." American Statistician 21(3): 22–29.

Gigerenzer, Gerd; Zeno Swijtink; Theodore Porter; et al. (1989). The Empire of Chance: How Probability Changed Science and Everyday Life. Cambridge, UK: Cambridge University Press. Summary of a two-volume work by a team of historians and philosophers of science, written for a general audience.

Gilson, Étienne. (1999 [1937]). The Unity of Philosophical Experience. San Francisco: Ignatius Press. Analysis of the history of Western philosophy with a proposed new philosophical synthesis.

Graunt, John. (1662). Natural and Political Observations Mentioned in a Following Index, and Made upon the Bills of Mortality. London. Reprinted in Natural and Political Observations Made upon the Bills of Mortality, ed. Walter F. Willcox. Baltimore: Johns Hopkins University Press, 1939.

Guardini, Romano. (1966). Pascal for Our Time, trans. Brian Thompson. New York: Herder and Herder. Translation of Christliches Bewußtsein: Versuche über Pascal, 1935.

Hacking, Ian. (1975). The Emergence of Probability: A Philosophical Study of Early Ideas about Probability, Induction, and Statistical Inference. London: Cambridge University Press. Includes speculation on the dual nature of probability.

Hacking, Ian. (1990). The Taming of Chance. Cambridge, UK: Cambridge University Press. Continuation of 1975 work (see above), exploring the development of probability to the beginning of the twentieth century.

Hacking, Ian. (2001). An Introduction to Probability and Inductive Logic. Cambridge, UK: Cambridge University Press. Introductory textbook for students of philosophy, with many examples.

Huygens, Christiaan. (1657). De Ratiociniis in Ludo Aleae [Reasoning on games of chance]. In Exercitationum Mathematicarum Libri Quinque [Five books of mathematical exercises], ed. Frans van Schooten. Leiden, Netherlands: Johannis Elsevirii.

Jaki, Stanley L. (1986). "Chance or Reality: Interaction in Nature versus Measurement in Physics." In his Chance or Reality and Other Essays. Lanham, MD: University Press of America. Analysis of the controversy over the interpretation of quantum mechanics by a noted historian of science.

Kolmogorov, Andrei N. (1956). Foundations of the Theory of Probability, 2nd English edition, trans. Nathan Morrison. New York: Chelsea Publishing. Translation of Grundbegriffe der Wahrscheinlichkeitsrechnung, 1933. The original work on the axiomatic basis of probability theory.

Kotz, Samuel; Norman L. Johnson; and Campbell B. Read, eds. (1982–1999). Encyclopedia of Statistical Sciences. 9 vols. plus supp. and 3 update vols. New York: Wiley.

Kruskal, William H., and Judith M. Tanur, eds. (1978). International Encyclopedia of Statistics. 2 vols. New York: Free Press.

Laplace, Pierre-Simon de. (1812). Théorie analytique des probabilités [Analytic theory of probability]. Paris: Courcier. First systematic treatment of probability theory.

Laplace, Pierre-Simon de. (1951). A Philosophical Essay on Probabilities, trans. Frederick Wilson Truscott and Frederick Lincoln Emory from the 6th French edition. New York: Dover. Translation of Essai philosophique sur les probabilités, 1819. Addressed to the general public; included as the introduction to the third edition (1820) of the work listed above.

Miké, Valerie. (2000). "Seeking the Truth in a World of Chance." Technology in Society 22(3): 353–360. Discusses the work of Pascal in a contemporary cultural context.

Moivre, Abraham de. (1718). The Doctrine of Chances; or, A Method of Calculating the Probabilities of Events in Play. London: W. Pearson. 2nd edition, London: Woodfall, 1738. 3rd edition, London: Millar, 1756. Reprinted: New York: Chelsea Publishing, 1967; Providence, RI: American Mathematical Society, 2000.

Pascal, Blaise. (1995). Pensées [Thoughts], trans. A. J. Krailsheimer. London: Penguin. Originally published in French, 1670. Fine modern English translation, with an introduction by the translator.

Peterson, Ivars. (1990). Islands of Truth: A Mathematical Mystery Cruise. New York: Freeman. One in a series of richly illustrated books by a science writer on new ideas in mathematics, addressed to the lay reader; includes chaos theory.

Rényi, Alfréd. (1972). Letters on Probability, trans. László Vekerdi. Detroit, MI: Wayne State University Press. Translation of Levelek a valószínűségről, 1969. Incisive and witty exploration of the notion of probability, in the form of fictitious letters assumed to be part of a lost correspondence between Pascal and Fermat. Written for the general reader.

Stigler, Stephen M. (1990). The History of Statistics: The Measurement of Uncertainty before 1900. Cambridge, MA: Harvard University Press, Belknap Press. A comprehensive history, tracing the interplay of mathematical concepts with the needs of several applied sciences that gave rise to the field of statistics.

Süssmilch, Johann Peter. (1741). Die göttliche Ordnung in den Veränderungen des menschlichen Geschlechts, aus der Geburt, dem Tode, und der Fortpflanzung desselben erwiesen. [The divine order in the fluctuations of the human race ...]. Later enlarged ed. reprinted, Augsburg, Germany: Verlag-Cromm, 1988. First analytic theory of population by a founder of modern demography.

von Neumann, John, and Oskar Morgenstern. (1944). Theory of Games and Economic Behavior. Princeton, NJ: Princeton University Press.
