A modern version of formal logic, referred to variously as logistic, mathematical logic, and the algebra of logic; it may be described generally as the set of logical theories elaborated since the mid-19th century with the aid of symbolic notation and a rigorous method of deduction. Symbolic logic differs from traditional logic in its extensive use of symbols similar to those used in mathematics, in its lack of concern with the psychology and epistemology of knowledge, and in its formalism. It is concerned mainly with the analysis of the correctness of logical laws, such as the law of contradiction, that of the hypothetical syllogism, and so on. Symbolic logicians attempt to deduce logical laws from the smallest possible number of principles, i.e., axioms and rules of inference, and to do this with no hidden assumptions or unexpressed steps in the deductive process (see axiomatic system). This article provides a brief survey of the history of the discipline and discusses its basic concepts and principal divisions, viz., propositional logic, the logic of predicates and of classes, and the logic of relations.
History. G. W. Leibniz is usually regarded as the forerunner of symbolic logic, largely for his attempt to formulate a mathesis universalis and for his discovery of several theorems that later assumed importance. Historians of symbolic logic, mainly of the Polish school (J. Lukasiewicz, J. Salamucha, I. M. Bocheński), have pointed out that the principal concepts utilized in the new logic are to be found in the works of Aristotle, who introduced variables and the idea of the deductive system. Similarly, they have shown that the logic of propositions was extensively treated by the Stoics and by the later scholastics, and that even some aspects of the problem of antinomies had their counterparts in the medieval concern with insolubilia. Yet it was not until the mid-19th century, with the work of G. Boole and A. De Morgan, that systems of symbolic logic similar to those used in the 20th century were developed. The history of this development may be conveniently divided into three periods, the first (1847–90) dominated by the work of Boole, the second (1890–1930) principally under the influence of G. Frege, and the third (1930–60s) devoted largely to metalogical considerations.
Boolean logic had two characteristics: it was a logic of classes and it was developed using a rigorous mathematical method. It was Boole's intention, in fact, to apply the method of algebra to logic—whence the designation of his system as "the algebra of logic." De Morgan furthered the development, discovering some new laws, doing work on the syllogism, and making a pioneer study of the logic of relations. C. S. Peirce likewise belongs to this period. The most ample development of logic according to Boole's method, however, is to be found in the work of E. Schröder, Vorlesungen über die Algebra der Logik (3 v. Leipzig 1890–1905).
The Fregean period was characterized by a more formal development of the new discipline. Frege himself discovered a new logic of propositions and developed the first axiomatic system for such a logic; this has been regarded as a fundamental work on the foundations of mathematics. Improving on Frege's symbolism, G. Peano invented a form of symbolic writing that was later adopted by B. Russell and A. N. Whitehead in their Principia Mathematica (3 v. Cambridge, England 1910–13). Another notational advance was made by the Polish logician J. Lukasiewicz, who also invented polyvalent or many-valued logics and did research in the history of formal logic. Also worthy of note, although extending somewhat beyond this period, is the work of the German logicians D. Hilbert and P. Bernays on the foundations of mathematics (Grundlagen der Mathematik, 2 v. Berlin 1934–39).
The metalogical period was inaugurated by K. Gödel, who showed that many propositions in the Principia Mathematica and in equivalent systems were formally undecidable, i.e., that their truth or falsity could not be proved within the formal structure of the system. Noteworthy in this period is the work of A. Tarski on the semantic definition of truth and that of K. Popper and R. Carnap on the methodology of the exact sciences. Additional applications of the methods of mathematical logic have been made in theology (Bocheński, I. Thomas), in analytical philosophy (A. Church, N. Goodman, W. V. O. Quine, C. G. Hempel), in physics (H. Reichenbach, C. E. Shannon), in biology (J. H. Woodger), and in economics (J. von Neumann, O. Morgenstern). See logic, history of.
Basic Concepts. A fundamental distinction in symbolic logic is that between constants and variables. Variables are symbols (usually the letters x, y, z) that can be replaced by constants (usually the letters a, b, c) or by complex formulas. If a constant is replaced by a variable in a sentence, or proposition, the result is a function; this is a schema for a sentence, or proposition, and in itself is neither true nor false. Thus, "x is a student" is a function and is neither true nor false, whereas "a is a student" and "John is a student" are sentences and may be true or false. Functions may be transformed back into sentences, or propositions, by prefixing a quantifier to them. There are two types of quantifiers: universal quantifiers, of which an example would be "for all x, …" [written (x)]; and existential quantifiers, of which an example would be "there is at least one x such that …" [written (∃x)].
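The contrast between a function (neither true nor false until its variable is bound) and a quantified sentence can be sketched in Python over a small finite domain; the names and the domain here are invented for illustration:

```python
# A propositional function: "x is a student" is neither true nor false by itself.
def is_student(x):
    return x in {"John", "Mary"}  # hypothetical set of students

domain = ["John", "Mary", "Peter"]  # hypothetical universe of discourse

# Universal quantifier (x): the sentence is true iff the function holds of every element.
universally_true = all(is_student(x) for x in domain)

# Existential quantifier (∃x): true iff the function holds of at least one element.
existentially_true = any(is_student(x) for x in domain)

print(universally_true)    # False: Peter is not a student
print(existentially_true)  # True: John is
```

The quantifiers yield definite truth values here only because the domain is finite and explicitly listed.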
Symbols are generally divided into basic categories and functor, or predicate, categories. The basic categories are either names (substantives) or sentences. Functors, or predicates, are symbols (usually designated by the Greek letters φ, ψ, χ, or by specially invented characters) that determine other symbols, which are referred to as arguments. Thus, "Peter" is the argument of the functor "walks" in the sentence "Peter walks," which may be written "φa," where "a" stands for "Peter" and "φ" stands for "walks." Functors are divided in three different ways, each based on a different principle of division. (1) First there is the division into sentence-forming and name-forming functors. Thus, "walks" is sentence-forming because "Peter walks" is a sentence, whereas "brilliant" is name-forming because "brilliant student" is a name. (2) A second division is that into name-determining and sentence-determining functors. Thus, "walks" is a name-determining functor, as in the example "Peter walks"; on the other hand, "it is not the case that" is a sentence-determining functor, as in the example "It is not the case that Peter walks." (3) Finally, functors are distinguished according to the number of arguments that they determine into one-place, two-place, three-place, or, in general, n-place functors. An example of a one-place functor is "walks" in the sentence "Peter walks"—"walks" here determines only one argument, viz., "Peter." An example of a two-place functor is "loves" in the sentence "Paul loves Joan"—here "loves" determines two arguments, viz., "Paul" and "Joan." An example of a three-place functor is "gives" in the sentence "Paul gives Joan a ring"—here "gives" determines three arguments, viz., "Paul," "Joan," and "ring." And so on.
In accordance with these principles of division, symbolic logic may be seen as divided into three main parts: (1) propositional logic, in which all functors are sentence-determining; (2) the logic of predicates and of classes, which treats of name-determining functors; and (3) the logic of relations, which is concerned with special properties of functors that determine two or more arguments.
Propositional Logic. Propositional logic is concerned exclusively with sentences, or propositions, that may be constructed by means of so-called truth functors. Truth functors are sentence-forming, sentence-determining, generally one- and two-place functors that can be used to form sentences whose truth value depends exclusively on the truth value of their arguments and not upon their meanings. Truth value in propositional logic—which is a two-valued logic—is twofold: it may be either the value of truth (usually written T or 1) or the value of falsity (usually written F or 0). An example of a truth functor is negation, since the value of a negated true sentence is falsity and the value of a negated false sentence is truth, and this independently of the sentences' meanings. The most widely employed truth functors are negation ("it is not the case that …," usually written ∼), the logical sum ("either … or …" in the sense of "either or both"), the logical product ("… and …," usually symbolized by a period or dot), material implication ("if …, then …," usually written ⊃), equivalence ("if and only if …, then …," usually written ≡), and disjunction ("either … or …" in the sense of "not both … and …," usually written |).
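These truth functors can be sketched as Python functions on the two truth values; this encoding is illustrative and not part of the original text:

```python
# Each functor's value depends only on the truth values of its arguments.
def negation(p):
    return not p

def logical_sum(p, q):       # inclusive "either ... or ..." (either or both)
    return p or q

def logical_product(p, q):   # "... and ..."
    return p and q

def implication(p, q):       # material "if ..., then ..."
    return (not p) or q

def equivalence(p, q):       # "if and only if ..., then ..."
    return p == q

def disjunction(p, q):       # exclusive sense: "not both ... and ..." (the stroke |)
    return not (p and q)

# The functors ignore meaning entirely; only truth values matter:
assert implication(False, True) is True   # false antecedent, true consequent: true
assert disjunction(True, True) is False   # both true, so "not both" fails
```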
The truth functor known as material implication is most important for understanding how symbolic logic differs from traditional formal logic. Although material implication is taken to mean "if … then …," it has a different significance from the conditional compound of ordinary discourse. Because of its ordination to a truth-value type of verification, material implication abstracts from, ignores, or leaves behind some of the ordinary elements of meaning of the conditional compound. Some authors (e.g., H. Veatch) make this abstraction the central point of their evaluation of material implication, arguing that it cannot express the intentional character of the conditional, which must lie in the relation of meaning between the component propositions, viz., the antecedent and the consequent. Other authors, while recognizing differences between the ordinary conditional compound and material implication, attempt to point out an element common to both. Thus I. M. Copi argues that material implication expresses a partial meaning of the conditional. Every conditional whose antecedent is true and whose consequent is false must be considered a false proposition; it is this element of the conditional that is expressed by material implication. Since material implication has a "weaker" meaning than the conditional compound, material implication can always be asserted when a strict conditional obtains, although the converse is not true. The essential value of material implication appears to lie in its permitting one to state that if the antecedent proposition has been assigned the value of truth, the consequent proposition must also be assigned the same value; this makes possible a purely mechanical operation that resembles a deductive process based on the recognition of meanings of what is stated in the antecedent and the consequent.
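Copi's point about the "partial meaning" of the conditional can be made concrete: of the four truth-value combinations, material implication rejects exactly the one that every ordinary conditional must also reject. A small illustrative check in Python:

```python
# Material implication p ⊃ q, defined as "not p, or q".
def implies(p, q):
    return (not p) or q

# Survey all four truth-value combinations.
table = {(p, q): implies(p, q) for p in (True, False) for q in (True, False)}

# The only combination assigned falsity is a true antecedent with a false
# consequent, precisely the case every ordinary conditional must reject.
false_rows = [pq for pq, value in table.items() if not value]
print(false_rows)  # [(True, False)]
```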
Using the concept of deduction thus associated with material implication, one may derive all the sentences, or propositions, of propositional logic from very few axioms and rules. Propositional logic is the most completely developed part of symbolic logic; it is regarded by mathematical logicians as the simplest and most basic part of their science, which provides the framework, so to speak, for all other types of logical analysis and deduction.
Logic of Predicates and of Classes. The second branch of symbolic logic falls into two divisions: the logic of predicates, which gives an intensional interpretation of its formulas, and the logic of classes, which gives an extensional interpretation.
In the logic of predicates the sentence is analyzed into a sentence-forming, name-determining functor (usually written φ, ψ, or χ) and a name (usually written as a variable or as a constant). An example of the basic formula would be φx. Formulas of this type are combined by means of sentence-determining functors, i.e., truth functors, and are transformed into sentences by means of quantifiers. Thus the universal proposition "All φ is ψ" may be replaced by the expression "(x). φx ⊃ ψx," and the particular proposition "Some φ is ψ," or "There is a φ that is ψ," may be replaced by the expression "(∃x). φx . ψx." Use of these modes of writing and the deductive methods of the logic of propositions has led to a considerable extension of Aristotelian syllogistics.
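Over a finite domain, these two quantified forms can be evaluated directly; the predicates below (evenness and divisibility by 4) are invented for illustration:

```python
domain = range(1, 11)            # a small, hypothetical universe
phi = lambda x: x % 2 == 0       # φ: "is even"
psi = lambda x: x % 4 == 0       # ψ: "is divisible by 4"

# "All φ is ψ":  (x). φx ⊃ ψx
all_phi_is_psi = all((not phi(x)) or psi(x) for x in domain)

# "Some φ is ψ": (∃x). φx . ψx
some_phi_is_psi = any(phi(x) and psi(x) for x in domain)

print(all_phi_is_psi)    # False: 2 is even but not divisible by 4
print(some_phi_is_psi)   # True: 4 is both
```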
The logic of classes is the extensional counterpart of the logic of one-place functors or predicates. A class or set (generally designated by the Greek letters α, β, or γ) is always defined by a predicate; it is the set of all objects that possess a given property. For example, the class of human beings consists of all objects to which the predicate "is a man" can be attributed. The most important concept of the logic of classes is that of class membership, "x ε α," which is usually read "x is a member of α" or "x belongs to α." Another concept—one that has caused considerable controversy among philosophers—is that of the null class, i.e., the class that contains no elements. On the basis of the definition of class and the theorems of the logic of predicates, as well as those of propositional logic, various combinations of classes can be effected and the relationships between them ascertained.
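Python's sets give a direct, if informal, model of these class-theoretic notions; the domain and the predicate below are invented for illustration:

```python
domain = ["Socrates", "Plato", "Fido"]    # hypothetical universe
def is_man(x):
    return x in ("Socrates", "Plato")     # hypothetical predicate

# A class is the set of all objects possessing the defining property.
men = {x for x in domain if is_man(x)}

# Class membership, x ε α:
assert "Socrates" in men
assert "Fido" not in men

# The null class: defined by a predicate that nothing satisfies.
null_class = {x for x in domain if x != x}
print(null_class == set())  # True: it contains no elements
```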
Logic of Relations. The logic of relations may be described as an extensional counterpart of the logic of predicates (or functors) that determine two or more arguments, just as the logic of classes may be regarded as an extensional counterpart of the logic of predicates that determine one argument. The reason for this is that relations can hold only between two or more arguments. In this branch of symbolic logic, relations are conceived extensionally, i.e., as relating to groups of objects. A relation, in a manner completely analogous to the defining procedure for a class, may be defined by a two-place predicate. Thus one may define the relation "in love with" as "the set of pairs of persons who love each other." The symbol usually employed is R, which is generally written between the two variables it relates, e.g., xRy. Every relation may be conceived as having a converse; thus "to the right of" is the converse of "to the left of," and "the author of" is the converse of "the work of." It is common also to distinguish various relational descriptions: (1) individual, e.g., the husband of the Queen of England; (2) plural, e.g., the authors of the New Catholic Encyclopedia; (3) double plural, e.g., the authors of English poems; and (4) the domain, which is the most general type of relational description, e.g., all authors. Of considerable importance are the concepts used for the purposes of compounding several relations, such as the relative product (e.g., the square of the half, the brother of the mother) and the relative power (e.g., the father of the father, or father "squared"). Another group of useful concepts is provided by the properties of relations: some are reflexive, i.e., xRx; others are symmetrical, i.e., if xRy then yRx; and still others are transitive, i.e., if xRy and yRz, then xRz. A concept of great use in the investigation of series is that of the ancestral relation (the union of R, R², R³, etc.).
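Treating a relation extensionally as a set of ordered pairs, the converse, the relative product, and the property checks can be sketched as follows; the family data are invented for illustration:

```python
# "parent of", extensionally: a set of ordered pairs.
parent = {("Ann", "Bob"), ("Bob", "Carl")}

def converse(R):
    # The converse relation: every pair reversed.
    return {(y, x) for (x, y) in R}

def relative_product(R, S):
    # x(R|S)z iff there is some y with xRy and ySz.
    return {(x, z) for (x, y1) in R for (y2, z) in S if y1 == y2}

child = converse(parent)                         # "child of"
grandparent = relative_product(parent, parent)   # relative power: parent "squared"

def is_symmetric(R):
    return all((y, x) in R for (x, y) in R)

def is_transitive(R):
    return all((x, z) in R for (x, y1) in R for (y2, z) in R if y1 == y2)

print(grandparent)           # {('Ann', 'Carl')}
print(is_symmetric(parent))  # False: Bob is not a parent of Ann
```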
See Also: antinomy; mathematics, philosophy of; semantics.
Bibliography: a. church, "A Bibliography of Symbolic Logic," Journal of Symbolic Logic 1 (1936) 121–218; 3 (1938) 178–192, continued in subsequent issues; "A Brief Bibliography of Formal Logic," Proceedings of the American Academy of Arts and Sciences 80 (1952) 155–172. History. w. and m. kneale, The Development of Logic (Oxford 1962). h. scholz, Concise History of Logic, tr. k. f. leidecker (New York 1961). i. m. bocheński, A Précis of Mathematical Logic, tr. o. bird (Dordrecht, Netherlands 1959), select bibliog.; A History of Formal Logic, ed. and tr. i. thomas (Notre Dame, IN 1961); Ancient Formal Logic (Amsterdam 1951). j. lukasiewicz, Aristotle's Syllogistic from the Standpoint of Modern Formal Logic (2d ed. enl. Oxford 1957). p. boehner, Medieval Logic: An Outline of Its Development from 1250–c. 1400 (Chicago 1952). e. a. moody, Truth and Consequence in Mediaeval Logic (Amsterdam 1953). Studies. w. v. o. quine, Mathematical Logic (rev. ed. Cambridge, MA 1958); From a Logical Point of View (2d ed. rev. Cambridge, MA 1961). i. m. copi, Symbolic Logic (New York 1958). r. carnap, Introduction to Symbolic Logic and Its Applications (New York 1958). s. k. langer, An Introduction to Symbolic Logic (2d ed. New York 1953). i. m. bocheński et al., The Problem of Universals: A Symposium (Notre Dame, IN 1956). j. a. ladrière, Les Limitations internes des formalismes (Louvain 1957). h. veatch, "Aristotelian and Mathematical Logic," Thomist 13 (1950) 50–96; 14 (1951) 238–258; 15 (1952) 624–641.
[w. a. wallace]
Symbolic logic sits at the intersection of philosophy, mathematics, linguistics, and computer science. It deals with the structure of reasoning and the formal features of information. Work in symbolic logic has almost exclusively treated the deductive validity of arguments: those arguments for which it is impossible for the premises to be true and the conclusion false. However, techniques from twentieth-century logic have found a place in the study of inductive or probabilistic reasoning, in which premises need not render their conclusions certain.
The historical roots of logic go back to the work of Aristotle (384–322 BCE), whose syllogistic reasoning was the standard account of the validity of arguments. Syllogistic reasoning treats arguments of a limited form: They have two premises and a single conclusion, and each judgment has a form like “all people are mortal,” “some Australian is poor,” or “no politician is popular.”
The discipline of symbolic logic expanded rapidly as techniques of algebra were applied to issues of logic in the work of George Boole (1815–1864), Augustus de Morgan (1806–1871), Charles Sanders Peirce (1839–1914), and Ernst Schröder (1841–1902) in the nineteenth century (see Ewald 1996). They applied the techniques of mathematics to represent propositions in arguments algebraically, treating the validity of arguments like equations in applied mathematics. This tradition survives in the work of contemporary algebraic logicians.
Connections between mathematics and logic developed into the twentieth century with the work of Gottlob Frege (1848–1925) and Bertrand Russell (1872–1970), who used techniques in logic to study mathematics. Their goal was to use the newfound precision in logical vocabulary to give detailed accounts of the structure of mathematical reasoning, to clarify the definitions that are used, and to make fully explicit the commitments of mathematical reasoning. Russell and Alfred North Whitehead’s (1861–1947) Principia Mathematica (1910–1913) is the apogee of this project of logicism.
With the development of these logical tools came the desire to use them in different fields. In the early part of the twentieth century, the logical positivists attempted to put all of science on a firm foundation by formalizing it: by showing how rich theoretical claims bear on the simple observations of experience. The best example of this is the project of Rudolf Carnap (1891–1970), who attempted to show how the logical structure of experience and physical, psychological, and social theory could be built up out of an elementary field of perception (Carnap 1967). This revival of empiricism was made possible by developments in logic, which allowed a richer repertoire of modes of construction or composition of conceptual content. On an Aristotelian picture, all judgments have a particularly simple form. The new logic of Frege and Russell was able to encompass much more complex kinds of logical structure, and so with it, theorists were able to attempt much more (Coffa 1991).
However, the work of the logical positivists is not the enduring success of the work in logic in the twentieth century. The radical empiricism of the logical positivists failed, not because of external criticism, but because logic itself is more subtle than the positivists had expected. We see this in the work of the two great logicians of the mid-twentieth century. There are two ways to understand logic. One is proof-theoretic: we describe the language of logic and single out the valid arguments by giving an account of proofs. The other is model-theoretic: we survey the models of a logical language and take a valid argument to be one for which there is no model in which the premises are true and the conclusion false. Alfred Tarski (1902–1983) clarified the notion of a model, and he showed how one could rigorously define the notion of truth in a language, relative to these models (Tarski 1956). The other great logician of the twentieth century, Kurt Gödel (1906–1978), showed that these two views of logic (proof theory and model theory) can agree: in the standard picture of logic, validity defined by proofs and validity defined by models coincide (see van Heijenoort 1967).
Gödel’s most famous and most misunderstood result is his incompleteness theorem. This result showed that any account of proof for a mathematical theory such as arithmetic must be either intractable (we can never effectively list all of the rules of proof), or incomplete (it does not provide an answer for every mathematical proposition in the domain of the theory), or inconsistent. This result brought an end to the logicist program as applied to mathematics and the other sciences. We cannot view the truths of mathematics as the consequences of a particular formal theory, and the same holds for the other sciences (see van Heijenoort 1967).
Regardless, logic thrives. Proof theory and model theory are rich mathematical traditions, their techniques have been applied to many different domains of reasoning, and connections with linguistics and computer science have strengthened the discipline and brought it new applications.
Logical techniques are tools that may be used whenever it is important to understand the structure of the claims we make and the ways they bear upon each other. These tools have been applied in clarifying arguments and analyzing reasoning, and they feature centrally in the development of allied tools, such as statistical reasoning.
One contemporary debate over our understanding of logic also bears on the social sciences. We grant that using languages is a social phenomenon. How does the socially mediated fact of language-use relate to the structure of the information we are able to present with that use of language? Should we understand language as primarily representational, with inference valid when what is represented by the premises includes the representation of the conclusion, or should we see the social role of assertion in terms of its inferential relations? We may think of assertion as a social practice in which the logical relations of compatibility and reason-giving are fundamental. Once we can speak with each other, my assertions have a bearing on yours, and so logic finds its home in the social practice of expressing thought in word (Brandom 2000).
SEE ALSO Aristotle; Empiricism; Logic; Models and Modeling; Philosophy; Social Science; Statistics in the Social Sciences
Brandom, Robert. 2000. Articulating Reasons: An Introduction to Inferentialism. Cambridge, MA: Harvard University Press.
Carnap, Rudolf. 1967. The Logical Structure of the World, and Pseudoproblems in Philosophy. Trans. Rolf A. George. New York: Routledge and Kegan Paul.
Coffa, J. Alberto. 1991. The Semantic Tradition from Kant to Carnap: To the Vienna Station, ed. Linda Wessels. Cambridge, U.K.: Cambridge University Press.
Ewald, William, ed. 1996. From Kant to Hilbert: A Source Book in the Foundations of Mathematics. Oxford: Oxford University Press.
Tarski, Alfred. 1956. Logic, Semantics, Metamathematics: Papers from 1923 to 1938. Trans. J. H. Woodger. Oxford: Clarendon.
van Heijenoort, Jean, ed. 1967. From Frege to Gödel: A Source Book in Mathematical Logic, 1879–1931. Cambridge, MA: Harvard University Press.
Symbolic logic is the branch of mathematics that makes use of symbols to express logical ideas. This method makes it possible to manipulate ideas mathematically in much the same way that numbers are manipulated.
Most people are already familiar with the use of letters and other symbols to represent both numbers and concepts. For example, many solutions to algebraic problems begin with the statement, "Let x represent.…" That is, the letter x can be used to represent the number of boxes of nails, the number of sheep in a flock, or the number of hours traveled by a car. Similarly, the letter P is often used in geometry to represent a point. P can then be used to describe line segments, intersections, and other geometric concepts.
In symbolic logic, a letter such as p can be used to represent a complete statement. It may, for example, represent the statement: "A triangle has three sides."
Mathematical operations in symbolic logic
Consider the two possible statements:
"I will be home tonight" and "I will be home tomorrow."
Let p represent the first statement and q represent the second statement. Then it is possible to investigate various combinations of these two statements by mathematical means. The simplest mathematical possibilities are to ask what happens when both statements are true (an AND operation) or when only one statement is true (an OR operation).
One method for performing this kind of analysis is with a truth table. A truth table is an organized way of considering all possible relationships between two logical statements, in this case, between p and q. An example of the truth table for the two statements given above is shown below. Notice in the table that the symbol ∧ is used to represent an AND operation and the symbol ∨ to represent an OR operation:
p q p∧q p∨q
T T T T
T F F T
F T F T
F F F F
Notice what the table tells you. First, if "I will be home tonight" (p) and "I will be home tomorrow" (q) are both true, then the statement "I will be home tonight and I will be home tomorrow"—p∧q—also must be true. In contrast, look at line 3 of the chart. According to this line, the statement "I will be home tonight" (p) is false, but the statement "I will be home tomorrow" (q) is true. What does this tell you about p∧q and p∨q?
First, p∧q means that "I will be home tonight" (p) and "I will be home tomorrow" (q). But line 3 says that the first of these statements (p) is false. Therefore, the statement "I will be home tonight and I will be home tomorrow" must be false. On the other hand, p∨q means that "I will be home tonight or I will be home tomorrow." This statement is true, since the second statement—"I will be home tomorrow"—is true.
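The whole table can be generated mechanically, which is exactly what makes truth-table analysis attractive; a minimal Python sketch:

```python
# Each row is (p, q, p AND q, p OR q), in the same order as the table above.
rows = [(p, q, p and q, p or q)
        for p in (True, False)
        for q in (True, False)]

for row in rows:
    print(row)

# Line 3 of the table: p false and q true makes p∧q false but p∨q true.
assert rows[2] == (False, True, False, True)
```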
The mathematics of symbolic logic is far more complex than can be shown in this book. Its most important applications have been in the field of computer design. When an engineer lays out the electrical circuits that make up a computer, or when a programmer writes a program for using the computer, many kinds of AND and OR decisions (along with other kinds of decisions) have to be made. Symbolic logic provides a precise method for making those decisions.