Modern Logic: The Boolean Period: Peirce

The logical work of Charles Sanders Peirce (1839–1914) was an unusual blend of the traditional and the modern. His early paper "Memoranda concerning the Aristotelian Syllogism," read and distributed in 1866, adapted Kant's description of first-figure reasoning (the subsumption of a case under a rule) to the second and third syllogistic figures, and in later papers he exhibited analogy and induction as probabilistic weakenings of the second and third figures thus conceived. In 1867, independently of Jevons, Peirce improved Boole's logical algebra by identifying logical addition with the inclusive rather than the exclusive sense of "either-or." In 1870, inspired by De Morgan's pioneering work on the logic of relations, he extended Boole's method of algebraic analogy to this discipline, noticed that there are three-termed as well as two-termed relations, and introduced the sign "–<" for class inclusion, considered an analogue of the arithmetical "≤".

In 1880, Peirce began to use the symbol "–<" indifferently for class inclusion, implication, and the "therefore" of inference. It became one of his persistent themes that the distinction between terms, propositions, and inferences is of little logical importance. For him all propositions are, in the end, implications (this thesis is bound up with his pragmatic theory of meaning) and as such are simply inferences deprived of an element of assertiveness; terms, at least general terms, are propositions deprived of a subject. General terms are "rhemes," or, as we would now say, "open sentences," sentences with gaps where names might go. Such sentences with gaps are in a broad sense relative terms, the number of gaps indicating what Peirce called the "adicity" of the relation. Thus, "loves" represents a "dyadic" relation, "gives … to" a "triadic" one, and so on. Extending this conception downward, Peirce described an ordinary predicative term, such as "is a man," as representing a "monadic" relation and a complete sentence, with no gaps at all, as representing a "medadic" one.

As Frege did with his "concepts," Peirce compared his "rhemes" to unsaturated chemical radicals having various valencies. Unlike Frege, however, he did not subsume rhemes under functions, like "The square of," as the special case in which the value of the function for a given argument is a truth-value. Frege's procedure underlined the resemblance between a completed proposition and a name; for Peirce a completed proposition was rather a special case of a predicate. Nevertheless, Peirce pioneered (in 1885) the use of truth-value calculations in establishing logical laws and also foreshadowed many-valued logic by suggesting that there might be an infinity of degrees of falsehood, with truth as the zero.

A gap in a rheme may be filled, in the simplest case, by what Peirce called an "index." He divided signs into indices, which operate through some physical connection with what they signify; icons, which operate through some resemblance to what they signify; and symbols, which acquire their meaning by convention. An ordinary proper name is an "icon of an index"; it is (when uttered) a noise that resembles the noise that was made when we were introduced to the person named. A simple index would be, for example, a demonstrative pronoun accompanied by a pointing gesture. Peirce regarded the phrase "demonstrative pronoun" as an inaccurate description; it would be more appropriate to call a noun a "pro-demonstrative." A common noun, for Peirce, is only an inseparable element in a rheme (for example, "man" in "is a man").

Instead of directly filling a gap in a rheme with an index, we may say either "I can find you an object such that it" ("is a man," "loves Susan," etc.) or "Take anything you like and it" ("is mortal if human," etc.). These are the particular and universal quantifiers, which Peirce introduced into his logic in 1883, independently of Frege but with some debt to his own student O. H. Mitchell. He represented them with the mathematical symbols "Σ" and "Π" for continued sums and products. If we write "a = 0" for "a is false" and "a ≠ 0" for "a is true," Σi ai, or "For some individual i, ai," will have for its value the sum of the values of the possible ai's and therefore will be ≠ 0 (that is, true) if and only if at least one of the ai's is ≠ 0, whereas Πi ai, or "For any individual i, ai," will have for its value the product of the values of the possible ai's and therefore will be ≠ 0 if and only if all of the ai's are ≠ 0. Peirce was aware of the possibility of putting any quantified expression into what is now called prenex normal form, with all the quantifiers at the beginning. He also, in what he called second-intentional logic, quantified over variables other than those standing for indices.
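Peirce's arithmetical reading of the quantifiers can be illustrated with a short calculation. The following sketch is a modern paraphrase rather than Peirce's own notation; the domain and the two predicates are invented for illustration. It takes 0 for falsehood and any nonzero value for truth, and evaluates "Σ" as a continued sum and "Π" as a continued product over a small finite domain.

```python
# A modern paraphrase of Peirce's sum-and-product reading of the quantifiers:
# 0 stands for "false", any nonzero value for "true".
# The domain and the two predicates are invented for illustration.

domain = ["i1", "i2", "i3"]
is_mortal = {"i1": 1, "i2": 1, "i3": 1}
is_greek  = {"i1": 0, "i2": 1, "i3": 0}

def some(pred):
    """Sigma_i pred(i): nonzero iff at least one summand is nonzero."""
    return sum(pred[i] for i in domain)

def every(pred):
    """Pi_i pred(i): nonzero iff every factor is nonzero."""
    total = 1
    for i in domain:
        total *= pred[i]
    return total

print(bool(some(is_greek)))    # True:  the sum has a nonzero term
print(bool(every(is_greek)))   # False: the product contains a zero factor
print(bool(every(is_mortal)))  # True:  all factors are nonzero
```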

Every implication, Peirce came to believe, has an implicit or explicit initial quantifier; that is, it is of the form Πi(ai –< bi), "For any i, if ai, then bi." The i's may be either ordinary individuals of which our a and b may be true, or instants at which they may be true, or possible states of affairs in which they may be true; for example, "If it rains it pours" may mean "For any instant i, if it rains at i, it pours at i" or "For any possible state of affairs i, if it rains in i, it pours in i." But in the latter case we may consider wider or narrower ranges of possibility, and if we limit ourselves to the actual state of affairs, the quantifier may be dropped.

Peirce made several attempts to define negation in terms of implication, and in 1885 he produced a set of axioms for the propositional calculus with implication accepted as an undefined operator and negation defined as the implication of a proposition from which anything at all would follow. This was the second set of axioms sufficient for the propositional calculus to be produced in the history of the subject (the first being Frege's of 1879) and the first set to use the curious law ((a –< b) –< a) –< a, now called Peirce's law. But Peirce experimented with other types of systems also, and in 1880 he anticipated H. M. Sheffer in showing that all truth-functions can be defined in terms of "neither … nor" and "not both … and." The "not" within a proposition (as opposed to "It is not the case that," governing the whole), which forms the "negative propositions" of traditional logic, he regarded as expressing the relation of otherness, and he worked out what properties of this relation are reflected in traditional logical laws. For example, the law of contraposition, "'Every A is a B' entails that whatever is not a B is not an A," follows from the mere fact that otherness is a relation, for whatever relative term R may be, if every A is a B, then whatever is an R (for instance, an other) of every B is an R of every A.
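Both claims in this paragraph lend themselves to the kind of truth-value calculation Peirce pioneered. The sketch below, in modern Boolean notation with function names of our own choosing, checks Peirce's law over all assignments and defines the usual connectives from "neither … nor" alone.

```python
# (i) Peirce's law verified by running through all truth-value assignments;
# (ii) negation, disjunction, and conjunction defined from "neither ... nor".
# Modern Boolean paraphrase; the helper names are ours, not Peirce's.

def implies(a, b):
    return (not a) or b

# Peirce's law: ((a -> b) -> a) -> a holds for every assignment.
assert all(implies(implies(implies(a, b), a), a)
           for a in (True, False) for b in (True, False))

def nor(a, b):            # "neither a nor b"
    return not (a or b)

def not_(a):              # negation from nor
    return nor(a, a)

def or_(a, b):            # disjunction from nor
    return not_(nor(a, b))

def and_(a, b):           # conjunction from nor, via De Morgan
    return nor(not_(a), not_(b))

assert all(and_(a, b) == (a and b) and or_(a, b) == (a or b)
           for a in (True, False) for b in (True, False))
```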

Peirce thought it desirable that logical formulas should reflect the structure of the facts or thoughts which they express and so be, in his sense, "icons," that is, signs operating by resemblance to what they signify, and he sought constantly to develop symbolisms that were genuinely "iconic." In his later years he came to regard this as best achieved by a system of diagrams which he called "existential graphs." Characteristically, he attempted to treat his graph for "If A then B" as basic, but in fact his diagrams are most easily understood as starting from the representation of "and" by juxtaposition and of "not" by enclosure in a bracket or circle or square. (A (B )), which is his graph for "If A then B," reads off naturally as "Not both A and not B." Rules of inference are represented as permissions to alter the graphs by insertions and erasures; for example:

(R1) We may insert or remove double enclosures at will, provided that there is no symbol caught between the two enclosures; for instance, we may pass from A to ((A )), i.e., to "Not not A," and back, but not from (A (B )) to AB.

(R2) Any symbol may be removed from an evenly enclosed graph (including a completely unenclosed one) or added to an oddly enclosed one; for instance, we may pass from AB, i.e., "A and B," to A, or from (A (BC )) to (A (B )), i.e., from "If A then both B and C " to "If A then B," or from (A ) to (AB ), i.e., from "Not A " to "Not both A and B."

(R3) We may repeat a symbol across an enclosure immediately interior to the symbol's own, and if a symbol is already thus repeated, we may remove it from the inner enclosure; for instance, we may pass from (A (B )) to (A (AB )), i.e., from "If A then B " to "If A then both A and B," or from A (AB ) to A (B ), i.e., from "A and not both A and B " to "A and not B."

If a graph is such that these permissions will enable us to transform it into any graph at all, that graph is "absurd" and its negation a logical truth. For example, A (A ), "Both A and not A," leads by R2 to A ((B )A ), where B is any graph you please, and this leads by R3 to A ((B )), this by R2 to ((B )), and this by R1 to B. Hence, (A (A )), "If A then A," is a logical truth. For clarity Peirce suggested drawing rectangular enclosures, with evenly enclosed symbols written on the left and oddly enclosed ones on the right. For example, Figure 3 is a representation of (A (B (C ))), "If A then (B but not C )." This arrangement makes it clear that Peirce was, in effect, setting up what are nowadays called "semantic tableaux," in the manner of E. W. Beth.
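The reading of these graphs described above, with juxtaposition as conjunction and enclosure as negation, is easy to mechanize. The sketch below uses an encoding of our own devising: a string stands for an atomic symbol, and a Python tuple for an enclosure whose contents are juxtaposed. A fuller treatment would also carry out the insertions and erasures licensed by R1–R3; only the reading step is shown here.

```python
def read(*items):
    """Translate juxtaposed graph items into a propositional formula:
    juxtaposition is conjunction, enclosure (a tuple) is negation."""
    parts = [it if isinstance(it, str) else "not(" + read(*it) + ")"
             for it in items]
    return " and ".join(parts) if len(parts) > 1 else parts[0]

print(read(("A", ("B",))))   # not(A and not(B))   -- the graph for "If A then B"
print(read("A", ("A",)))     # A and not(A)        -- the absurd graph above
print(read((("A",),)))       # not(not(A))         -- a double enclosure, cf. R1
```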

Peirce also thought of logical truth as represented by the blank sheet on which his graphs were drawn and absurdity by an enclosure with nothing but the blank graph sheet inside it. Since by R2 we may inscribe anything whatever in such an otherwise blank enclosure, this enclosure would in fact represent an absurdity in the previous sense of a graph that can be transformed into any graph whatsoever. "If A then absurdity," Peirce's favorite definition of "Not A," would then be strictly "(A (( )))" ("If A then B " is "(A (B ))," and here we put "( )" for B ), but this assumes that in representing the absurd as "( )" we already understand simple enclosure as negation, and in attempting to modify his symbolism in ways which would avoid this assumption Peirce was led into occasional unnecessary trouble.

Although Peirce was one of the inventors of bound variables, in his graphs for quantified formulas he explicitly dispensed with them in favor of what he called "lines of identity," a device recently put to the same purpose, though informally, by W. V. Quine and Peter Geach. A monadic rheme may be written as "–A" or "A–," the single valency line being close enough to be thought of as part of the symbol, and on its own this symbol is read as "Something is A." If "–B" is added to this, the whole, "–A –B," of course, means "Something is A and something is B." But if the valency lines are joined by a "line of identity," to give us "A––B," this means "Something is A and that same thing is B," or "Something is at once A and B." In the common systems this identification of the subjects of which A and B are predicated is effected by attaching these predicates to the same bound variable, thus: "For some x, x is A and x is B." Again, "If anything is A then that same thing is B" is distinguished in the common systems from the more indefinite "If anything is A then something is B" by writing the former with a common bound variable, thus: "For any x, if x is A then x is B." In Peirce's graphs this is done by tightening "(–A (–B ))" into "(A–(–B ))," the two valency lines being joined into a single line of identity running across the inner enclosure. To give some examples with dyadic rhemes, "Every A is an R of some B" comes out as "(A–(–R–B ))"; "Some B is R'd by every A" as "(A–(–R– ))–B"; and "Every A is an R of itself" as a graph in which both valency lines of "R" are joined to the single line of identity that runs back to "A."
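For readers more at home with bound variables, the two dyadic examples just given can be checked in the "common systems" idiom. In the sketch below the domain, the classes A and B, and the relation R are invented purely for illustration; the point is the placement of the quantifiers, which corresponds to how deeply the outermost part of each line of identity is enclosed.

```python
# Invented toy domain, classes, and relation for illustration only.
domain = {1, 2, 3}
A = {1, 2}
B = {2, 3}
R = {(1, 2), (2, 3), (3, 3)}   # (x, y) means "x is an R of y"

# "Every A is an R of some B": for any x, if x is A, then for some y, y is B and x R y.
every_A_R_some_B = all(any(y in B and (x, y) in R for y in domain)
                       for x in A)

# "Some B is R'd by every A": for some y, y is B and, for any x, if x is A then x R y.
some_B_Rd_by_every_A = any(y in B and all((x, y) in R for x in A)
                           for y in domain)

print(every_A_R_some_B)      # True for this toy relation
print(some_B_Rd_by_every_A)  # False: no single B is R'd by both members of A
```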

This "Beta part" of Peirce's graphs, of course, contains special rules for the transformation of lines of identity.

For example, the additions and erasures of terms permitted by R2 may be extended to terms attached to others by lines of identity; thus, we may pass from "A––B," "Something is at once A and B," to the plain "–B," "Something is B." Peirce said that the blank sheet, which is left here when "A–" with its line of identity is removed and which represents accepted truth when considered as a medad, represents an accepted existent when considered as a monad.

Since lines of identity may themselves be treated as dyadic rhemes and subjected to enclosure, the graphs cover identity theory and, therefore, the arithmetic of specific integers, as well as the theory of first-order quantification. For example, "There are at least two A's" will be "A–(–)–A," that is, "Something is an A, and something that is not that thing is also an A." But the graphs do not readily lend themselves to the representation of higher-order quantifications, such as "Some qualities belong to everything and others to some things only," although Peirce made some rather clumsy efforts in this direction. More successful, but only adumbrated in outline, was his extension of his method to modal logic by using separate sheets for different possible worlds. This procedure is very like that now adopted by S. A. Kripke and also echoes medieval theories of "ampliation."

There is probably no logical writer who has been richer in original suggestions than Peirce, and his papers are a mine that has still to be fully worked. He was, at the same time, more aware than any of his contemporaries of the contributions made by their ancient and medieval predecessors. He held and persuasively supported the theory that Aristotle had anticipated, in a now-missing chapter of the Prior Analytics, later derivations of simple conversion from the laws of identity and syllogism, and he saw the significance of the Megarian controversy over the nature of implication and of the distinctions drawn by the Schoolmen in their theory of consequentiae.

Peirce's immediate circle in America included two logicians of some distinction: O. H. Mitchell, from whom Peirce derived the germ of his device of quantification, and Christine Ladd-Franklin (1847–1930), who used eight "copulae" to construct De Morgan's eight categorical forms and exhibited syllogisms in different figures as derivable from "inconsistent triads," or "antilogisms." An antilogism states that a certain three propositions (for example, "Every Y is a Z," "Every X is a Y," and "Not every X is a Z") cannot all be true: hence (syllogism 1), the first and second jointly imply the denial of the third; also (syllogism 2), the first and third jointly imply the denial of the second; also (syllogism 3), the third and second jointly imply the denial of the first.
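Ladd-Franklin's point can be verified mechanically for small classes. The sketch below is our own finite check over an invented two-element universe: it confirms that the three propositions of the sample triad are never jointly true there, which is what licenses the three syllogisms obtained by denying each member in turn.

```python
from itertools import product

# All subsets of a small invented universe {0, 1}.
subsets = [set(), {0}, {1}, {0, 1}]

def triad_holds(X, Y, Z):
    every_Y_is_Z     = Y <= Z            # "Every Y is a Z"
    every_X_is_Y     = X <= Y            # "Every X is a Y"
    not_every_X_is_Z = not (X <= Z)      # "Not every X is a Z"
    return every_Y_is_Z and every_X_is_Y and not_every_X_is_Z

# The triad is never jointly satisfied, so any two of its members
# jointly imply the denial of the third.
assert not any(triad_holds(X, Y, Z)
               for X, Y, Z in product(subsets, repeat=3))
print("antilogism confirmed over this universe")
```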

See also Boole, George; De Morgan, Augustus; Existence; Frege, Gottlob; Jevons, William Stanley; Kant, Immanuel; Modal Logic; Peirce, Charles Sanders; Quine, Willard Van Orman.

A. N. Prior (1967)
