Skinner, B. F. (1904-1990)
The American psychologist B. F. Skinner was renowned for his pioneering work in behaviorism. Born on March 20, 1904, in Susquehanna, Pennsylvania, Burrhus Frederic Skinner was the older son of Grace Madge Burrhus Skinner and William Arthur Skinner, an attorney with some political aspirations. Skinner's younger brother died suddenly of a cerebral aneurysm at the age of sixteen. Skinner did his undergraduate work at Hamilton College in Clinton, New York, where he majored in English. During the summer before his senior year, he studied at the Bread Loaf School of English at Middlebury, Vermont. There he met Robert Frost, who asked Skinner to send him some of his work. Frost's comments encouraged Skinner to try writing, at first in his parents' home and later in New York City's Greenwich Village. He discovered that "I had nothing important to say" (Skinner, 1970, p. 7). He then turned to psychology and graduate work at Harvard University.
Several factors drew Skinner to psychology. First, his biology teacher directed him to Jacques Loeb's Physiology of the Brain and Comparative Psychology (1900) and Pavlov's Conditioned Reflexes (1927). Then the writings of Bertrand Russell, in The Dial, a literary magazine, and in the book Philosophy (1927), which he read while writing in Greenwich Village, led him to J. B. Watson's Behaviorism (1924). Harvard's department of psychology did not strengthen his interest in behaviorism, but Fred S. Keller, then a graduate student in the department, did. Skinner described Keller as "a sophisticated behaviorist in every sense of the word" (1970, p. 9) and his own thesis as having "only the vaguest of Harvard connections" (1970, p. 10). The thesis included his study of eating rate in the rat (the precursor of the response-rate measure of his later work), two brief papers on the reflex and drive, and his paper on the concept of the reflex in psychology. That concept rested on an operational analysis in which he insisted on defining the reflex as an observed correlation of stimulus and response. He used the equation R = f(S, A), where R stood for reflex strength, S for stimulus, and A for any condition affecting reflex strength, such as drive, which was specified in terms of the deprivation operation (Skinner, 1977).
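The relation R = f(S, A) can be made concrete with a small sketch. Skinner's equation named a functional relation, not a specific formula, so the particular function and parameter values below are hypothetical, chosen only to illustrate the idea that reflex strength varies jointly with the stimulus and with a third condition A, such as drive defined operationally by hours of deprivation:

```python
def reflex_strength(stimulus_intensity: float, hours_deprived: float) -> float:
    """Illustrative R = f(S, A): strength grows with stimulus intensity,
    scaled by a drive term specified as hours of food deprivation.
    The saturating form of the drive term is a hypothetical choice."""
    drive = hours_deprived / (hours_deprived + 24.0)
    return stimulus_intensity * drive

# The same stimulus evokes a stronger response under greater deprivation,
# and a stronger stimulus evokes a stronger response at the same drive level.
assert reflex_strength(1.0, 24.0) > reflex_strength(1.0, 1.0)
assert reflex_strength(2.0, 24.0) > reflex_strength(1.0, 24.0)
```

The point of the operational analysis survives the arbitrary arithmetic: A is not an inferred inner state but a measurable operation (deprivation time) that enters the correlation of stimulus and response.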
After receiving his Ph.D., Skinner served as a junior fellow in the Harvard Society of Fellows for three years; then he moved to the University of Minnesota where, during World War II, he embarked on a project to train pigeons to guide missiles. While at the University of Minnesota, he married Yvonne (Eve) Blue, with whom he had two children, Julie and Deborah. In 1945 he moved to Indiana University, where he remained until 1947, when he returned to Harvard University. During that same year, he delivered his William James Lectures on Verbal Behavior, which evolved into his book on that subject in 1957.
As he himself implied, Skinner held on to the concept of "reflex" beyond its usefulness when he wrote his book The Behavior of Organisms in 1938. Not long after that, he gave up the concept because operant behavior is not elicited but emitted; he thus ceased to be a stimulus-response psychologist. This means that Skinner did not conceive of human beings, or any organisms, as automatons waiting to have some behavior elicited. Rather, he viewed them as emitting behavior upon which the environment acts by selecting some of it through the provision of consequences. Also important in this context is the concept of classes of behavior and classes of stimuli—Skinner referred to this as the generic nature of stimulus and response (1935). Even though behavior analysis, a term now used to describe Skinner's concepts of learning, refers to classes, not some hyperspecified atomistic stimulus and response, uninformed people still characterize Skinner's approach incorrectly as atomistic.
Because operant behavior cannot be elicited, a special procedure had to be invented to produce "new" behavior—hence the concept of "shaping." Skinner's approach to learning emphasized the three-part reinforcement contingency. Behavior occurring on particular occasions and followed by certain consequences (reinforcers) will be strengthened by those consequences; that is, other members of the same response class will have a higher probability of occurring on similar occasions. There are positive and negative reinforcers: the former strengthen the behavior that produces them, and the latter strengthen the behavior that avoids or eliminates them. Reinforcers are also divided into unconditioned (primary) and conditioned (secondary). The former act as reinforcers without any learning history, whereas the latter act as reinforcers because of their association with unconditioned reinforcers. Skinner distinguished reinforcers from punishing stimuli, which weaken the behavior that produces them.
Skinner's concept of operant behavior has generated many experiments, including those on schedules of reinforcement, in which different intermittent patterns of reinforcement give rise to characteristic patterns of response rates (Ferster and Skinner, 1957). The concept of intermittent reinforcement was significant in a variety of ways, notably in its resemblance to the conditions of the natural environment, which brought basic learning research closer to the "real" world. The number of different kinds of intermittent schedules that can be generated is limited only by the experimenter's imagination, but they generally fall into two broad classes: ratio schedules, in which reinforcement depends on the number of responses emitted, and interval schedules, in which it depends on the occurrence of a response after the passage of a fixed or varying interval.
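The two broad schedule classes can be sketched in a few lines of code. This is a minimal illustration, not a model of Skinner's apparatus: each schedule is a function that, given a response, decides whether a reinforcer is delivered—by counting responses (fixed ratio) or by checking the clock (fixed interval):

```python
def fixed_ratio(n):
    """Ratio class: reinforce every nth response, regardless of timing."""
    count = 0
    def respond(now=None):
        nonlocal count
        count += 1
        if count >= n:
            count = 0
            return True   # reinforcer delivered
        return False
    return respond

def fixed_interval(seconds):
    """Interval class: reinforce the first response after `seconds` elapse."""
    last_reinforcement = 0.0
    def respond(now):
        nonlocal last_reinforcement
        if now - last_reinforcement >= seconds:
            last_reinforcement = now
            return True   # reinforcer delivered; interval restarts
        return False
    return respond

fr5 = fixed_ratio(5)
assert [fr5() for _ in range(5)] == [False, False, False, False, True]

fi10 = fixed_interval(10.0)
assert fi10(3.0) is False    # too early: responding alone does not pay
assert fi10(12.0) is True    # first response after the interval is reinforced
assert fi10(15.0) is False   # the interval restarts after reinforcement
```

Variable-ratio and variable-interval schedules replace the fixed values with values drawn around a mean; it is these intermittent contingencies that produce the characteristic response-rate patterns Ferster and Skinner catalogued.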
Intermittent schedules of reinforcement produced behavior that is particularly resistant to extinction and thus gave rise to the study of the maintenance of behavior, to which other learning approaches gave scant attention. Maintenance of behavior is like memory, a concept Skinner avoided. Instead of viewing recall as "searching a storehouse of memory," he considered the conditions, both external and response-produced, that increase "the probability of responses" (Skinner, 1974, pp. 109-110). Interestingly, Skinner did not limit his work to basic research. With respect to memory, he wrote a charming and informative book (Skinner and Vaughan, 1983) outlining a program of self-management in old age.
Skinner's book on verbal behavior (1957) appeared in the same year as his work on intermittent reinforcement (Ferster and Skinner, 1957). He considered the former to be his most important contribution to psychology and viewed verbal behavior as he did other behavior, not as standing for something else (Skinner, 1945) but as constituting the subject matter of interest. In contrast with methodological behaviorists, who must restrict their studies to currently measurable phenomena, Skinner the radical behaviorist was able to extend his analysis to private events that cannot yet be measured. In his book on verbal behavior and later in his Contingencies of Reinforcement (Skinner, 1969), Skinner explicitly recognized that not all behavior is produced through conditioning; rule-governed behavior is produced not through exposure to the actual contingencies of reinforcement but through exposure to a verbal description of those contingencies. In one of his last papers, Skinner (1990) suggested that such rule-governed behavior might, as "knowledge by description," postpone the destruction of the earth.
Skinner applied his principles of behavior to many areas of functioning. In education, he invented programmed instruction, a form of learning in which students always make the "correct" response, thus having their correct responses immediately reinforced (Skinner, 1954a; 1968). He used the methods of shaping and stimulus fading to make that possible. In abnormal psychology, he first talked about behavior modification by applying reinforcement to psychotic patients' behavior (Skinner, 1954b). He applied behavior analysis to the study of drugs (Skinner and Heron, 1937), thereby initiating an area of research that remains active and useful; and, as already mentioned, he applied it to old age.
Skinner's first excursion into the study of culture and how to improve it took the form of a novel, Walden Two (Skinner, 1948). He returned to that theme in Science and Human Behavior (Skinner, 1953) and in Beyond Freedom and Dignity (Skinner, 1971). He always remained close to the principles of behavior analysis that he had discovered in his basic research. Skinner was undoubtedly one of the most influential psychologists of the twentieth century. His systematization of behavior was never limited to learning as such. Rather, he and others applied his approach to all areas of psychology. A reconsideration of his basic papers, complete with comments by his supporters and critics—along with his response to those comments—appeared in Catania and Harnad (1984).
B. F. Skinner died in 1990.
See also: BEHAVIORISM; OPERANT BEHAVIOR; PAVLOV, IVAN; WATSON, JOHN B.
Catania, A. C., and Harnad, S., eds. (1984). Canonical papers of B. F. Skinner. The Behavioral and Brain Sciences 7, 473-724.
Ferster, C. B., and Skinner, B. F. (1957). Schedules of reinforcement. New York: Appleton-Century-Crofts.
Loeb, J. (1900). Physiology of the brain and comparative psychology. New York: Putnam.
Pavlov, I. (1927). Conditioned reflexes. London: Oxford University Press.
Russell, B. (1927). Philosophy. New York: W. W. Norton.
Skinner, B. F. (1935). The generic nature of the concepts of stimulus and response. Journal of General Psychology 12, 40-65.
—— (1938). The behavior of organisms. New York: Appleton-Century-Crofts.
—— (1945). The operational analysis of psychological terms. Psychological Review 52, 270-277.
—— (1948). Walden two. New York: Macmillan.
—— (1953). Science and human behavior. New York: Macmillan.
—— (1954a). The science of learning and the art of teaching. Harvard Educational Review 24, 86-97.
—— (1954b). A new method for the experimental analysis of the behavior of psychotic patients. Journal of Nervous and Mental Diseases 120, 403-406.
—— (1957). Verbal behavior. New York: Appleton-Century-Crofts.
—— (1968). The technology of teaching. New York: Appleton-Century-Crofts.
—— (1969). Contingencies of reinforcement. New York: Appleton-Century-Crofts.
—— (1970). B. F. Skinner: An autobiography. In P. B. Dews, ed., Festschrift for B. F. Skinner. New York: Appleton-Century-Crofts.
—— (1971). Beyond freedom and dignity. New York: Alfred A. Knopf.
—— (1974). About behaviorism. New York: Alfred A. Knopf.
—— (1976). Particulars of my life. New York: Alfred A. Knopf.
—— (1977). The experimental analysis of operant behavior. In R. W. Rieber and K. Salzinger, eds., The roots of American psychology: Historical influences and implications for the future. Annals of the New York Academy of Sciences 291, 374-385.
—— (1979). The shaping of a behaviorist. New York: Alfred A. Knopf.
—— (1983). A matter of consequences. New York: Alfred A. Knopf.
—— (1990). To know the future. The Behavior Analyst 13, 103-106.
Skinner, B. F., and Heron, W. T. (1937). Effects of caffeine and benzedrine upon conditioning and extinction. Psychological Record 1, 340-346.
Skinner, B. F., and Vaughan, M. E. (1983). Enjoy old age. New York: W. W. Norton.
Watson, J. B. (1924). Behaviorism. New York: W. W. Norton.