Skinner, Burrhus Frederic

1904–1990

AMERICAN PSYCHOLOGIST, WRITER

HARVARD UNIVERSITY, Ph.D., 1931

BRIEF OVERVIEW

Burrhus Frederic (B.F.) Skinner (1904–1990) is widely considered one of the pivotal psychologists of the twentieth century. Followers and detractors alike agree that his tireless work in behaviorism significantly changed the landscape of psychology and the way both scientists and laypeople understand behavior. His theories, though modified in various ways over the years, continue to be widely applied in all walks of contemporary life.

Skinner was an American psychologist best known for the theory he developed over many years, which he called operant conditioning. Operant conditioning built on, but departed from, Ivan Pavlov's earlier concept of classical conditioning. It holds that learning occurs as a result of the rewards and punishments a subject receives in response to a particular behavior. If the result of a behavior is a reward, the same behavior is likely to be repeated; if the result is a punishment, the behavior is less likely to be repeated.

Skinner initially wanted to become a writer and received a bachelor's degree in English from Hamilton College in New York. After some time out of school writing newspaper articles, Skinner enrolled in the experimental psychology program at Harvard University and earned his master's and doctoral degrees in 1930 and 1931, respectively.

It was while Skinner was at Harvard that he was heavily influenced by the work of John B. Watson, commonly referred to as the "father of behaviorism" and the person who first popularized behavioral principles in the wider culture. Stemming from this and other influences, Skinner dedicated his life's work to studying the relationship between reinforcement and observable behavior. Throughout his career, he insisted that psychology was a scientific, empirically driven discipline.

In 1936, Skinner joined the faculty of the University of Minnesota and in 1945 took up a position as chairman of the psychology department at Indiana University. In 1948, however, Harvard offered him a faculty position, which he accepted, and he remained there for the rest of his life.

Skinner is perhaps best known for several of his books. One of these, Walden Two (1948), describes a utopian community whose members live by the principles of operant conditioning and reinforcement. It received great praise from those receptive to his radical ideas and harsh criticism from those opposed to the mechanistic application of his theory to life. A prolific but slow writer, Skinner penned nearly 200 articles and books over his long and influential career. His other important works include The Behavior of Organisms (1938) and Beyond Freedom and Dignity (1971). In Beyond Freedom and Dignity, Skinner advocated mass conditioning as a means of social control, which created a great stir of controversy when it was published.

Skinner is also known for his invention of "the Skinner box," used in behavioral training and experimentation with animals to test and record the results of operant conditioning. For years it was rumored that Skinner kept his own daughter in one of the experimental boxes for an extended period of time, but historical records show this to be false.

Although Skinner's research was predominantly conducted with laboratory rats, he believed that his results could also be extrapolated to the behavior of human beings. As a behaviorist, he viewed human behavior as largely a response to environmental stimuli.

At the time of his death in 1990 from leukemia, Skinner had become one of the most notable figures in the field of psychology. The principles of operant conditioning and reinforcement that he outlined were built upon by clinical psychologists and applied to the treatment of disorders such as phobias, panic disorders, and child conduct problems.

BIOGRAPHY

The early years

B.F. Skinner was born on March 20, 1904, in Susquehanna, a small railroad town in northeastern Pennsylvania. Skinner wrote three volumes of autobiography during his later years, and much of what we know of his earliest years comes from his own recollection.

Skinner was the older of two children and was brought up in a home with "rigid standards" enforced by his mother, Grace Burrhus. Like most children in the early twentieth century, Skinner and his younger brother Edward (called "Ebbie") grew up in an atmosphere where a strict code of conduct was followed. Grace clearly attempted to pass this strong social code to Skinner (called Fred) by expressing disapproval when he wavered from the expected norm. Skinner seemed to be especially receptive to praise from his parents, though it was apparently not given in great quantity. It is an interesting parallel that his later theory of operant conditioning would emphasize the crucial effect of "positive" reinforcement on behavior.

Skinner's father, William, was an only child and lived most of his life in Susquehanna. After finishing high school, William worked for a short period as a draftsman in the Erie Railroad Engineering Department. Because he showed little mechanical aptitude, he decided in 1895 to enroll in law school in New York. After passing the bar examination in 1896, he opened a law practice and was interested in making his mark amid the opportunities that were present in the ever-changing cultural landscape of the early twentieth century. He was successful as an attorney, political orator, and town booster, but was also notoriously boastful about his accomplishments to peers and underlings.

Skinner recalled his father as a gentle parent who never physically punished him, relying instead on expressions of disappointment or good-natured ridicule as discipline. His father never missed an opportunity, however, to inform him of the punishments that awaited him if he turned out to have a criminal mind. William once took his eldest son through the county jail to show him what life would be like inside a prison.

Despite William's verbosity in the community, at home he seemed to live under the control of his wife's domineering personality. She acted in a condescending way toward her husband, and according to Skinner's account, the two were never very close emotionally.

Grace, Skinner's mother, was the oldest of four children and three years younger than her husband. She apparently was quite attractive and had a gifted singing voice, which she regularly used in performances at the Susquehanna Hogan Opera House. She attended Susquehanna High School and had ambitions to become a secretary, an ambition she realized when she was hired by the Erie Railroad in 1901. It was during this time that she met William and was impressed by his rising reputation as a lawyer and political speaker. They were married in 1902, and their future looked much more promising now that the economic depression and widespread labor unrest of the 1890s had abated. American women of that era were expected to sacrifice their careers when they married, and Grace was no exception. Even though she still cared a great deal about her standing in the community, her status would now be tied to William's professional position.

PRINCIPAL PUBLICATIONS

  • About Behaviorism. 1974.
  • The Analysis of Behavior: A Program for Self-Instruction. 1961.
  • The Behavior of Organisms: An Experimental Analysis. 1938.
  • Beyond Freedom and Dignity. 1971.
  • The Contingencies of Reinforcement: A Theoretical Analysis. 1969.
  • Cumulative Record: Definitive Edition. 1959, 1961, and 1972.
  • Enjoy Old Age: A Program of Self-Management. 1983.
  • A Matter of Consequences. 1983.
  • Notebooks. 1980.
  • Particulars of My Life: Part One of an Autobiography. 1976.
  • Recent Issues in the Analysis of Behavior. 1989.
  • Reflections on Behaviorism and Society. 1978.
  • Schedules of Reinforcement. 1957.
  • Science and Human Behavior. 1953.
  • The Shaping of a Behaviorist: Part Two of an Autobiography. 1979.
  • Skinner for the Classroom. 1982.
  • The Technology of Teaching. 1968.
  • Upon Further Reflection. 1987.
  • Verbal Behavior. 1957.
  • Walden Two. 1948.

Skinner's brother Ebbie was two and a half years younger than Fred and appeared to be the favored child of his parents. Ebbie was an affable child who raised pigeons and played the clarinet. He was more outgoing than Fred and seemed to have a social grace that Fred lacked. Yet Fred was apparently not jealous of his brother and even appeared to like him. As Ebbie grew older, he proved to be much better at sports and more socially popular than his older brother. Ebbie often teased Fred about his literary and artistic interests. Tragically, Ebbie died at age 16 of a massive brain aneurysm. The loss of Ebbie was devastating to the Skinners, especially William, who seemed thereafter to lose a part of himself that he was never quite able to recover. Perhaps this was related to William's secret favoritism of his younger son over Fred. Years later, while reflecting on this tumultuous period, Skinner admitted that he was "not much moved" by his brother's death and subsequently felt guilty for his lack of emotion.

FURTHER ANALYSIS

The operant chamber

An operant-conditioning chamber, commonly known as a "Skinner box," is an experimental apparatus invented by B. F. Skinner in 1935 that became the experimental basis of operant conditioning theory. Operant theory suggests that humans and animals "operate" on their environment and, in doing so, encounter reinforcing stimuli that shape behavior. In operant conditioning, a behavior is followed by a consequence, which reinforces the behavior and makes it more likely to be repeated.

Skinner used the operant chamber to study the learning process of small animals. The chamber was a soundproof, light-resistant box or cage used in laboratories to isolate an animal for experiments in operant conditioning. It usually contained only a bar or lever that the animal could press to gain a reward, such as food, or to avoid a painful stimulus, such as a shock. The chamber was large enough to easily accommodate the animal while allowing easy viewing of the subject.

Most subjects Skinner used in the operant chamber were small animals such as rats and pigeons, though many other researchers have since used the chamber with monkeys, raccoons, a variety of birds, and a host of other animals. Skinner began his research, like most researchers at the time, using rats, but he soon found pigeons to be superior subjects because they could be conditioned more quickly using operant techniques. After this discovery, he used pigeons exclusively in his experiments.

The typical operant chamber includes a bar-press lever attached to a wall, adjacent to a food cup or dish. While exploring the box, the subject may come across the lever and activate it, which triggers the release of a food pellet into the food cup or opens a door in the chamber that reveals food. Depending on the type of animal used, the chamber incorporates different types of feeders and an operandum. An operandum is a device that automatically detects the occurrence of a behavioral response or action in the subject. The operandum is typically hooked up to a computer or other monitoring device to record the subject's responses.

A modern operant chamber is more complex than those used by Skinner. Typically it contains one or more levers that an animal can press; more than one stimulus, such as light or sound; and, depending on the experiment, several means of reinforcing the behavior with food. The animal's interaction with the levers can be detected and recorded automatically. It is also possible to deliver other reinforcers, such as water, or a form of punishment, like electric shock through the floor of the chamber. With this configuration of multiple operanda and reinforcers, it is possible to investigate countless psychological phenomena.

In principle, the goal of the operant chamber is to measure an animal's ability to learn the association between the behavior (pressing a bar) and receiving a reward or reinforcer of the behavior (food). In operant terms, if the organism is learning this association, then the reinforcer (food) is likely to cause the behavior to repeat.

The use of the operant chamber was one of Skinner's most important developments and formed the basis for much of his theory of operant conditioning, which he has generalized to human behavior as well. Behavioral psychologists around the world still use variations of the operant chamber in ongoing research with subjects.
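
The learning process the chamber measures can be summarized in a few lines of code. The following Python sketch is purely illustrative (the learning rate and trial counts are invented, not taken from Skinner's data): a simulated subject starts with a near-zero rate of lever pressing, and each reinforced press strengthens the response, producing the rising response curve an operant chamber records.

    import random

    # Illustrative simulation of learning in an operant chamber.
    # All numbers are invented; this is not a model Skinner published.
    LEARNING_RATE = 0.2        # how strongly one reinforcer shifts behavior
    press_probability = 0.05   # low "operant level" before conditioning

    for trial in range(1, 51):
        pressed = random.random() < press_probability
        if pressed:
            # Food follows the press; reinforcement strengthens the response.
            press_probability += LEARNING_RATE * (1.0 - press_probability)
        if trial % 10 == 0:
            print(f"trial {trial:2d}: press probability = {press_probability:.2f}")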

Skinner spent much of his childhood building things. Whatever control his parents exercised, they still allowed Fred substantial freedom to explore, observe, and invent. A sampling of his inventions included musical instruments, roller-skate scooters, merry-go-rounds, model airplanes, and even a glider, which he tried to fly. As an early business venture, Skinner invented a flotation system that separated ripe elderberries from green ones so that he and a friend could sell the ripe berries door-to-door. In retrospect, these early inventions were an indication of Skinner's immense curiosity about how things worked. He would later shift this same curiosity from the mechanics and interrelationship of objects to the mechanics and reinforcement of behavior.

As part of life in a small town, Skinner attended the same school during his first 12 years of education. There were only eight students in his graduating class. His keen mind and literary interests allowed him to excel academically. One teacher in particular, Mary Graves, would prove to be an important figure in his life. Her father was the village atheist and an amateur botanist who believed in evolution. Ironically, Graves also taught Skinner and a handful of other boys most of the Old Testament in a Presbyterian Sunday school class she led for years. Despite her efforts, Skinner would, years later, announce to Graves that he did not believe in the existence of God.

Graves was a dedicated person with cultural interests that far exceeded those of the average person in town. She organized what was known as the "Monday Club," a literary society to which Skinner's mother belonged. Graves also introduced Skinner to classic literature, from Shakespeare to Conrad's Lord Jim. Graves taught Skinner many subjects during his years in that schoolhouse: drawing in the lower grades, and later English, both reading and composition. Skinner attributed his interest in literature, and later his choice of English as his major study in college, to Graves' influence.

College

Skinner attended Hamilton College on the recommendation of a friend of the family, majoring in English and minoring in Romance languages. Hamilton was proud of its reputation for public speaking and required all of its students to be trained in oratory throughout their stay; Skinner reluctantly complied with the four compulsory years of public speaking. Though a good student, Skinner never felt that he fit into student life at Hamilton. He joined a fraternity without knowing what it entailed. He was admittedly not good at sports and complained that the college was "pushing him around" with unnecessary requirements, such as attending daily chapel. He observed that most students showed almost no intellectual interest in the subjects taught, and by his senior year he was in open revolt against the school's system.

Skinner claimed that the most important thing that happened to him while at Hamilton was getting to know the Saunders family. Percy Saunders was dean of Hamilton College at the time, and through a series of conversations Skinner was chosen as a mathematics tutor for the Saunders' youngest son. The Saunders family lived in a large frame house alongside the campus, and they exposed Skinner to a world of art and culture he had not previously known. The Saunders' home was full of books, pictures, sculpture, musical instruments, and huge bouquets of peonies in season. His visits exposed him to writers, musicians, and artists; it was commonplace to hear music by Schubert or Beethoven playing in the background, or to hear poetry recited. According to Skinner, "Percy and Louise Saunders made an art of living, something I had not known was possible."

Literary interests

As a child, Skinner had an inclination to become a writer. He used an old typewriter to compose poems and stories and even started a novel or two. In high school he worked for the local newspaper, the Transcript. In the morning before school he would crib national and international news from the Binghamton, New York, papers that came in on the morning train. The summer before his senior year he attended the Middlebury School of English at Bread Loaf, Vermont. He took a course with Sidney Cox, who one day invited him to have lunch with the poet Robert Frost. During lunch, Frost asked Skinner to send him some of his work, which he did—three short stories. Frost responded with encouragement to continue writing, and it was at this point that Skinner made the definite decision to become a writer.

Unfortunately, Skinner's decision to become a writer was not supported by his father. William, from the time his son was born, had hoped his eldest son would follow in his footsteps and join him in the practice of law. Skinner cited his birth notice as an indication of his father's long-held eagerness for his son to join the profession. It read: "The town has a new law firm: William A. Skinner & Son." Skinner's father thought that Fred should first prepare himself to earn a living as a lawyer and then try his hand at writing once he was established. But William eventually conceded and agreed to let young Skinner live at home—which at the time was in Scranton, Pennsylvania—and write for a year or two to set his career in motion. Skinner spent a great deal of time building a small study in the attic, with bookshelves, a desk, and other furniture. Though he had comfortable surroundings in which to write, he never seemed to make time to do it. He used his time poorly: he read aimlessly, built model ships, played the piano, listened to the newly invented radio, and contributed to the humorous column of a local paper, but he wrote almost nothing and even thought about seeing a psychiatrist. He later referred to this period as the "dark year."

Before the year was out, Skinner ended up taking a job with the government. The job required him to read and write abstracts for thousands of legal decisions handed down by the courts that pertained to grievances over highly publicized coal strikes in previous years. His work was compiled and used as a reference book on the subject. After finishing the book, Skinner went to New York for six months of bohemian living in Greenwich Village, then to Europe for the summer, and on to Harvard in the fall to begin the study of psychology.

During the "dark year" Skinner developed a growing curiosity about writers who embraced a behavioristic philosophy of science. Foremost among these was John B. Watson, the founder of behaviorism. Skinner probably first read about Watson in the summer of 1926, when he was 22 years old, but this exposure only whetted his appetite and did not exert profound influence on him until several years later. Perhaps it was his own depression or lack of understanding about his failure as a writer, but the science of psychology was becoming increasingly intriguing to him.

Growing interest in psychology

Human behavior had always interested Skinner, but college did little to further his interest in psychology. The only formal instruction in psychology he recalled receiving in college "lasted 10 minutes." After college, Skinner's literary interests did more to carry him in the direction of psychology than his formal studies did. Yet he did owe a debt to one of his college instructors for exposing him to the material that would start him down the path he would follow for the rest of his career.

A biology teacher at Hamilton called Skinner's attention to Jacques Loeb's Physiology of the Brain and Comparative Psychology, and later showed him a copy of Pavlov's Conditioned Reflexes. Skinner bought Pavlov's book and read it while living in Greenwich Village. Skinner also read the literary magazine The Dial, which at the time was publishing articles by the philosopher Bertrand Russell. Russell's book Philosophy, which Skinner read shortly thereafter, devoted a good deal of space to John B. Watson's theory of behaviorism. After reading these books, Skinner was able to begin putting the pieces of his fragmented thoughts into place and to envision a direction for the kind of work he believed might explain human behavior. Skinner was not interested in traditional psychological theories reminiscent of the Freudian emphasis on the inner self, however. He was much more captivated by the outward manifestation of behavior.

At the age of 24 Skinner enrolled in the psychology department of Harvard University. Still rebellious and impatient with what he considered unintelligent ideas, Skinner found an equally caustic and hard-driving mentor. William Crozier was the chair of a new department of physiology. Crozier fervently adhered to a program of studying the behavior of "the animal as a whole" without appealing, as the psychologists did, to processes going on inside. That exactly matched Skinner's goal of relating behavior to experimental conditions. Students were encouraged to experiment. Given Skinner's enthusiasm and talent for building new equipment, he constructed various gadgets to use in his lab work with rats. After creating a dozen pieces of apparatus and stumbling onto some lucky accidents, Skinner discovered something new. He found that the rats' behavior was not just dependent on a preceding stimulus (as Watson and Pavlov insisted), but was more influenced by what happened after the rat pressed the bar. In other words, the type of reinforcement the rat received after the behavior was perhaps more important than the stimulus that occurred before. Skinner named this new process operant conditioning.

After completing his doctoral degree in 1931, Skinner was awarded a series of fellowships that lasted five years at Harvard. These enabled him to continue his experiments in the laboratory without the burden of teaching responsibilities.

Minnesota

In 1936, then 32 years old, Skinner married Yvonne Blue, and the couple moved to Minnesota, where Skinner had his first teaching job. Busy with teaching and his new family, which in 1938 included a daughter, Julie, he did little during these years to advance the science he had started. That changed with the advent of war.

By 1944, World War II was in full swing. Airplanes and bombs were plentiful, but no missile guidance systems were yet available. Anxious to help, Skinner sought funding for a top-secret project to train pigeons to guide bombs to their targets. He knew from working with animals in the lab that pigeons could be quickly trained to perform a desired task. Working intently, he trained pigeons to repeatedly peck a point of contact inside the missile that would, in effect, hold the missile on its intended trajectory toward the target. The pigeons pecked reliably, even when falling rapidly and working with warlike noise all around them. But Project Pigeon, as it was called, was eventually discontinued because a newer invention, radar, proved far more useful. Though Skinner was disappointed by the project's cancellation, the experience strengthened his determination to use pigeons in future experiments because they responded more quickly to reinforcement than rats did. He never worked with rats again.

The baby box

In 1944, near the end of the Second World War, Skinner and Yvonne decided to have a second child. Knowing that Yvonne had found the first two years of caring for a child arduous, Skinner suggested they "simplify the care of the baby." This suggestion evolved into an invention that would later become known as the "baby box," or baby tender, as Skinner called it. The baby box was intended as a superior alternative to the traditional baby crib. It consisted of a thermostatically controlled, enclosed crib with safety glass on the front and a stretched-canvas floor. It provided restraint and protection for the infant while also allowing great freedom of movement. This baby box would be the sleeping space for their second daughter, Deborah, for the next two and a half years. Skinner reported his invention and its use with his daughter in an article he submitted to the Ladies' Home Journal during that period. As a result of this exposure, hundreds of other babies would eventually be raised in similar devices, which came to be known as Aircribs for the increased airflow that resulted from the design. To the end of his life, however, Skinner was plagued by rumors that he had used his second daughter as an experimental subject by putting her in the baby box, causing her harm that ranged from mild to severe. These rumors proved to be untrue. Skinner was in fact an affectionate father and never experimented on either of his children.

Walden Two

In the spring of 1945, at a dinner party in Minneapolis, Skinner sat next to a friend who had a son and a son-in-law in the South Pacific. They discussed the difficulties facing returning soldiers as they attempted to transition back into civilian life. This started Skinner thinking about an experimental attitude toward life, which led him to write a fictional account of one. The community he envisioned would live by the principles of operant conditioning and reinforcement that he was forging in his experiments. He called the book Walden Two (1948), a loose extension of Henry David Thoreau's much earlier Walden, an account of Thoreau's experiment in outdoor living. Skinner's book began simply as a description of a feasible design for community living and evolved into something that his characters seemed to dictate once he began writing.

Skinner was known as a slow writer and typically wrote longhand. In general, he claimed that it took him about three or four hours of writing each day to produce about 100 publishable words. Walden Two was an entirely different experience: he wrote it on the typewriter in seven weeks. In fact, he stated afterward that writing Walden Two was a "venture in self-therapy," and some of it was written with great emotion. After the publication of Walden Two, Skinner received many letters from individuals wanting to know whether the community he described actually existed; some even wanted to join. For a period of time he seriously entertained the idea of such an experiment, but he abandoned it, citing his age as the biggest hurdle to seeing such an involved experiment through to completion.

Indiana

In the fall of 1945, Skinner left Minnesota and took the position of chairman of the Department of Psychology at Indiana University. It was an administrative position that exempted him from teaching duties but still allowed time for a number of experiments, all of which used pigeons. Although Skinner was well connected with other faculty members at Indiana University, his wife felt isolated and unhappy; it was not uncommon for her to pass her days reading novels. When it became apparent that they would be leaving Indiana University, both of them were enthusiastic about moving back east.

Back to Harvard

While giving the William James Lectures at Harvard in 1947, Skinner was asked to become a permanent member of the department. So in 1948, Skinner and his family moved to Cambridge, where he would spend the rest of his career at Harvard. Before agreeing to come, Skinner had negotiated with Harvard that his presence as a faculty member would entail more than teaching. He was given sufficient funds to purchase and maintain a laboratory where he could conduct experiments and actively promote operant science.

In the early 1950s Dr. Harry Solomon, then chairman of the Department of Psychiatry at the Harvard Medical School, helped Skinner set up a laboratory for the study of the operant behavior of psychotics at the Metropolitan State Hospital in Waltham, Massachusetts. By this time, a number of others had extended operant principles to the management of psychotic patients in hospital wards, and there was increasing interest in its applications to personal therapy.

Teaching machines and programmed instruction

By 1953, Skinner's children were growing up; his youngest child was now in the fourth grade. Skinner attended his daughter's class one November day at the school's invitation for fathers to observe their children. He had no idea that this visit would alter the direction of his career.

As Skinner sat at the back of this typical fourth-grade math class, what he saw struck him with the force of sudden inspiration. As he put it, "through no fault of her own the teacher was violating almost everything we knew about the learning process." In other words, Skinner's concepts of operant conditioning were being violated right before his eyes in the classroom. The students were not being positively reinforced when they came up with the correct answer, yet according to operant theory, shaping a desired behavior requires immediate reinforcement. The other problem he became aware of was the teacher's dilemma of having to shape the mathematical behavior of 20 or 30 children simultaneously. Clearly, teachers needed help to facilitate learning for so many students. That afternoon, Skinner constructed his first teaching machine.

Skinner's teaching machine simply presented mathematics problems in random order for students to perform, with feedback after each one. This machine did not teach new behavior; it provided practice on skills already learned. Within three years, however, Skinner developed a program of instruction in which, through careful sequencing, students responded to material broken into small steps. The steps were similar to what a skilled tutor would present when working one-on-one with a student. The first responses of each sequence were prompted, but as performance improved, less help was given. By the end of the material, students were doing work beyond what they could have accomplished at the beginning. For about the next 10 years, Skinner was caught up in the teaching-machine movement, answering every one of thousands of letters from parents, schools, and business and industry.
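
The two principles just described, small steps and immediate feedback, can be illustrated with a short program loop. The Python sketch below is purely illustrative: the arithmetic frames are invented examples, and Skinner's actual machines were mechanical devices, not software.

    # Illustrative sketch of a teaching-machine loop: material broken into
    # small steps, each response followed by immediate feedback. The frames
    # below are invented; Skinner's machines were mechanical.

    FRAMES = [
        ("2 + 3 = ?", "5"),
        ("2 + 3 + 4 = ?", "9"),         # each frame builds on the previous one
        ("2 + 3 + 4 + 5 = ?", "14"),
    ]

    def run_program(frames):
        for question, answer in frames:
            while True:
                response = input(question + " ").strip()
                if response == answer:
                    print("Correct.")   # immediate confirmation reinforces the
                    break               # response before moving to the next step
                print(f"Not quite; the answer is {answer}. Try it once more.")

    if __name__ == "__main__":
        run_program(FRAMES)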

After securing a grant, Skinner hired James G. Holland, who, under Skinner's supervision, created the book The Analysis of Behavior for use in Skinner's classes at Harvard; it was designed to be used with a teaching machine. The field of education embraced this newest teaching method, but many of the materials were poorly written, and companies were reluctant to spend much money designing materials for a teaching machine that might go out of production. By around 1968, education publishers had stopped printing programmed instruction for the machine. That same year Skinner published The Technology of Teaching, a collection of his writings on education. Some of the better programs from the 1960s are still used. With the advent of the computer and the Internet, the sophisticated machine that Skinner lacked is now available. Increasingly, instructional designers are realizing that, as Skinner insisted, tutorials must do more than present blocks of content with quizzes at the end. Effective instruction requires learners to respond to what each screen of information presents and to get feedback on their performance before advancing to the next segment of instruction.

Skinner's analysis of how to design sequences of steps for teaching came to him as he was finishing a book on which he had worked, on and off, for 20 years. He eventually named the book Verbal Behavior. Published in 1957, it was an analysis of why people speak, write, and think the way they do. It took another 20 years before researchers used Skinner's categories and found that the different controlling variables he postulated were, indeed, independent. His work in this area has contributed significantly to establishing methods of teaching children, especially those with autism, to communicate effectively.

Later life

An interest in the implications of behavioral science for society at large turned Skinner to philosophical and moral issues. In 1969 he published Contingencies of Reinforcement and, two years later, perhaps his best-known book, Beyond Freedom and Dignity, which prompted a series of television appearances. Still, the misunderstanding and misrepresentation of his work prompted him to write another book, About Behaviorism, in 1974. Toward the end of his life he remained professionally active. In addition to professional articles, he wrote three autobiographical volumes: Particulars of My Life, The Shaping of a Behaviorist, and A Matter of Consequences.

After finishing Beyond Freedom and Dignity at age 67, he was exhausted. He had previously felt symptoms of angina and was told by his physician that he might not survive another five years if he did not change his lifestyle. His daughters put him on a strict diet to lower his cholesterol. By the mid-1970s, although his general health remained good, he had lost much of his hearing. In addition to wearing a hearing aid, he devised an amplification system in his basement that allowed him to continue listening to music. More health concerns followed: the discovery of a cancerous lesion in his head in 1981, a fall that required two surgeries in 1987, and other difficulties. But despite these setbacks, Skinner continued working.

In 1989 he was diagnosed with leukemia, which would eventually take his life. He kept as active as his increasing weakness allowed. At the American Psychological Association convention, 10 days before he died, he spoke before a crowded auditorium. He finished the article on which the speech was based on August 18, 1990, the day he died.

Skinner was the uncontested champion of behavioral psychology from the 1950s to the 1980s. During this period, American psychology was shaped more by his work than by the ideas of any other psychologist. In 1958, the APA bestowed on Skinner the Distinguished Scientific Contribution Award, noting that "few American psychologists have had so profound an impact on the development of psychology and on promising younger psychologists." In 1968, Skinner received the National Medal of Science, the highest accolade bestowed by the U.S. government for contributions to science. The American Psychological Foundation presented Skinner with its Gold Medal Award, and he appeared on the cover of Time magazine. In 1990, Skinner was awarded the APA's Presidential Citation for Lifetime Contribution to Psychology.

THEORIES

B. F. Skinner's entire theoretical system is based on what he called operant conditioning. Operant conditioning is one of the most basic forms of learning and affects virtually all forms of human behavior. It holds that learning occurs as a result of voluntary responses that operate on the environment. These behavioral responses are either strengthened (made more likely to recur) or weakened (made less likely to recur) depending on whether the consequences of the response are favorable or unfavorable. Unlike classical conditioning, which depends on biological responses to some stimulus such as food (Ivan Pavlov's dogs salivating at the sight of meat powder), operant conditioning applies to voluntary responses, which an organism deliberately performs in order to achieve a desired outcome.

One way to understand operant behavior is to see that it operates on the environment in ways that produce consequences. If a person is playing the piano, that person is operating on the environment (the keys of the piano) in such a way as to produce music. The quality of the music and the comments from listeners are the consequences that condition the person's operant performance at the piano. Well-played music elicits social approval that reinforces the skills needed for playing well. In contrast, poor playing is likely to be criticized, and the criticism acts as punishment that weakens those responses.

Operant conditioning is sometimes called instrumental conditioning because it is instrumental in changing the environment and producing consequences. Working late at the office may be instrumental in getting a particular project finished by the deadline.

Skinner's research was concerned with describing behavior rather than explaining it, and it dealt only with observable behavior. He was unconcerned with speculations about what might be occurring inside the organism. His program of operant conditioning included none of the assumptions about drives or physiological processes that characterized other theories. Whatever might happen between stimulus and response was not the sort of objective data with which Skinner was concerned.

Skinner's behaviorism assumes that humans are controlled and influenced by forces in the environment and the external world, not by forces from within. He did not go as far as denying the existence of internal physiological or even mental conditions, but he did deny their usefulness for the scientific study of behavior.

It is also worth noting that Skinner did not use large numbers of subjects to make statistical comparisons between the average responses of subject groups. His method was the comprehensive investigation of a single subject.

Main points

Reinforcement

Reinforcers are the prime movers of operant conditioning. Reinforcers that follow an operant behavior increase the likelihood that a similar response will occur in the future. A reinforcer is also called a reinforcing stimulus. The speed with which a person learns an operant behavior depends on the complexity of the behavior, the person's level of skill, the reinforcer involved, and many other variables.

There are two kinds of reinforcement: positive and negative. To reinforce means to strengthen; both positive and negative reinforcement strengthen behavior. Both increase the likelihood that a subject will repeat the behavior in the future. The critical difference between the two is that positive reinforcement occurs with the addition of a reinforcing stimulus. Negative reinforcement consists of removing an aversive stimulus.

Explanation In their simplest form, reinforcers can be thought of in terms of rewards: both a reinforcer and a reward increase the probability that a preceding response will occur again. The term "reward," however, is limited to positive support. Reinforcement, as Skinner used the term, can be either positive or negative.

A positive reinforcer is a stimulus added to the environment that brings about an increase in a preceding response. For instance, if food, water, money, praise, or any number of other stimuli follow a particular response, it is very likely that this response will occur again in the future. Positive reinforcement can be given in natural or artificial ways. Unnatural praise and artificial rewards are not very effective in reinforcing behavior; highly contrived rewards can even decrease the frequency of an operant behavior if used in a manipulative manner. However, natural and sincere positive feedback is usually both pleasurable to receive and effective in reinforcing behavior.

In contrast, a negative reinforcer refers to an unpleasant stimulus whose removal from the environment leads to an increase in the probability that a preceding response will occur again in the future. The two main classes of behavior produced by negative reinforcement are escape and avoidance. Escape responses are those operants that allow a person to get away from aversive stimuli after the stimuli are present. Avoidance responses are those operants that allow a person to prevent the occurrence of aversive stimuli before the aversive stimuli appear. In other words, escape involves reacting after an aversive event is present. Avoidance involves proacting, or taking preventative steps before an aversive event arises. People react to getting a splinter in their finger by pulling it out; they proact by putting on gloves before handling rough wood. Escape behaviors are usually learned before avoidance behaviors.

Examples An example of positive reinforcement is a student who diligently plans and follows a disciplined schedule of study in order to get good grades. The positive reinforcer is the achievement of good grades. In other words, the good grades reinforce the disciplined study habits of the student so he or she is likely to continue the study regimen with hopes that good grades will continue.

Positive reinforcement can also come from nonsocial sources by virtue of an operant principle termed "selective perception." Selective perception describes a person's ability to pay attention to only a fraction of all the stimuli in the environment, neglecting the others. Suppose, for instance, a person walking down a sidewalk notices a dollar bill lying by the curb. He sees no one around to whom it may belong and puts it in his pocket. Moments later he notices a change in his own behavior: he is no longer looking at the trees or houses but is scanning the ground as if looking for another lucky find. The behavior of looking down was followed by the positive reinforcement of finding the money.

Negative reinforcement could be illustrated by a child who begrudgingly does his chores simply to escape the nagging of his parents. In this example, the nagging is the negative reinforcer. So, as the child performs the assigned chores, he finds it eliminates the nagging, which in turn reinforces the likelihood that he will continue doing the chores.

An example of escape could involve a married couple who repeatedly find themselves in verbal arguments. They react by trying to escape the aversive situation through marital counseling. Other couples who see their friends having marital troubles may proact by working on improving their communication and resolving differences before problems arise, thereby avoiding some arguments and possible long-term damage to their marriage.
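
Both kinds of reinforcement can be expressed as the same update to a response's future probability. The short Python sketch below is only illustrative (the rates and starting values are invented): one operant is followed by an added reward, the other by the removal of an aversive stimulus, and both are strengthened by the identical rule.

    # Illustrative sketch: positive and negative reinforcement both
    # strengthen the preceding response; they differ only in whether a
    # stimulus is added or removed. All numbers here are invented.

    def reinforce(probability, rate=0.15):
        """Move the response probability toward 1 after reinforcement."""
        return probability + rate * (1.0 - probability)

    study_habit = 0.30   # operant followed by good grades (stimulus ADDED)
    chore_doing = 0.30   # operant that ends parental nagging (stimulus REMOVED)

    for week in range(10):
        study_habit = reinforce(study_habit)   # positive reinforcement
        chore_doing = reinforce(chore_doing)   # negative reinforcement

    print(f"study habit after 10 weeks: {study_habit:.2f}")
    print(f"chore doing after 10 weeks: {chore_doing:.2f}")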

Punishment

When an operant behavior is followed by a consequence that reduces the frequency of similar responses in the future, that consequence is called punishment.

Positive punishment: When the subject—a person or animal—engages in a behavior and something negative is applied as a result, the behavior is less likely to be repeated.

Negative punishment: When the subject—a person or animal—engages in a behavior and something positive is taken away, that behavior is less likely to be repeated.

(Courtesy Thomson Gale.)

If a person receives a significant fine after driving through a red light, the punishment is likely to reduce the tendency to run red lights in the future. Both punishment and extinction reduce the frequency of a behavior, but punishment usually does so more rapidly and more completely than extinction does. Punishment produces the fastest reduction of the behavior when it is strong, immediate, and not opposed by reinforcement.

There are two types of punishment: positive punishment and negative punishment, just as there are both positive and negative reinforcement. In both cases, the term "positive" refers to something that is added, whereas "negative" implies something that is removed.

Explanation The terms "positive" and "negative" indicate whether punishment occurs with the onset or termination of the stimulus that follows the operant. "Positive" indicates onset, and "negative" indicates termination. Positive punishment occurs when the onset of an aversive stimulus suppresses behavior. For instance, if you spill hot coffee on your hand while carrying a cup to a nearby table, the onset of an aversive stimulus (hot coffee) punishes the clumsy act. This is considered a positive form of punishment. Negative punishment occurs when the termination of a rewarding stimulus suppresses behavior. If a haphazard action results in your dropping and losing an important document, the loss serves as punishment for the act. Therefore the loss of a positive reinforcer is a negative punishment. It is important to distinguish between negative reinforcement and punishment. The two are not the same in operant conditioning. Punishment refers to a stimulus that decreases the probability that a prior behavior will occur again. This differs from negative reinforcement, which increases the likelihood of a recurrence in the behavior.

Punishment does not cause behavior to be unlearned or forgotten. It merely suppresses the frequency of responding. Often the effects of punishment are only temporary. When the punishment no longer occurs, the rate of responding usually increases. This phenomenon is called recovery. Recovery is fastest and most complete when the original punishment was mild or infrequent and there is reinforcement for reinstating the behavior. The milder the original punishment, the sooner a behavior is likely to recover after the end of punishment.

Positive punishment weakens a response or makes it less likely to recur through the application of an unpleasant stimulus. On the same track, but coming from the opposite direction, is negative punishment. This consists of removing something that is pleasant in order to weaken the response or make it less likely to be repeated.

Although Skinner recognized the role of punishment in response to behavior, he was against using it because he did not believe it had a permanent effect on altering behavior except in extreme cases. Although it may initially stop the particular behavior in question, Skinner believed that the prior response was likely to reappear over time. In addition, punishment may actually cause a resulting fear or anxiety to emerge that wasn't present before the application of the punishment.

A very effective non-punitive method of decreasing the frequency of a behavior is the use of differential reinforcement of other behavior. This means that reinforcement is provided for behaviors other than the problematic one, in the hope that the reinforced behaviors will be repeated and the problematic behavior will decrease or cease. Differential reinforcement works best when the desired behavior is incompatible with the undesired behavior.

Examples Positive punishment can be illustrated by a young child who disobeys a parent and receives a spanking in response. Here the parent is adding an unpleasant stimulus (the spanking) in the hope that it will weaken the child's future response and make it less likely to recur.

A good example of negative punishment is when a teenager is told she is "grounded" and will no longer be able to use the family car because of her poor grades. The negative punishment entails the removal of what is pleasant or desirable (using the car). The hope behind the use of negative punishment in this case is that removal of the privilege will make poor grades less likely to recur.

If a child is being overly aggressive with his playmates, his parent can use differential reinforcement by providing rewards or reinforcement for nonaggressive behavior such as helping, consideration, or concern for others. This not only draws the child away from aggression but gives him a new style of interpersonal relations that makes aggression much less likely to occur.
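
The temporary character of punishment described above can also be sketched in code. In the following illustrative Python fragment (all rates are invented), responding is suppressed while every response is punished, then recovers once punishment stops and reinforcement is available again.

    # Illustrative sketch of punishment suppressing a response and of
    # "recovery" once punishment ends. All rates are invented numbers.

    def punish(p, rate=0.25):
        return p * (1.0 - rate)        # punishment suppresses responding

    def reinforce(p, rate=0.10):
        return p + rate * (1.0 - p)    # reinforcement strengthens it again

    behavior = 0.80
    for _ in range(10):                # phase 1: every response is punished
        behavior = punish(behavior)
    print(f"after punishment phase: {behavior:.2f}")

    for _ in range(20):                # phase 2: punishment withdrawn and
        behavior = reinforce(behavior) # reinforcement reinstated
    print(f"after recovery phase:   {behavior:.2f}")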

Extinction

Once an operant has been reinforced and become common, there is no guarantee that the frequency of the response will remain the same in the future. Either extinction or punishment will cause a response to become less frequent. Regardless of which one is in effect, both work in the opposite direction from reinforcement. Extinction consists of the discontinuation of reinforcement, whether positive or negative, that once maintained a given behavior. This withholding of the reinforcement will, in theory, cause the behavior to cease.

Explanation Extinction can take place because there is no reinforcement associated with a certain behavior, or there is less reinforcement associated with that behavior because there is some superior alternative. The idea behind extinction is that without the reinforcement, either positive or negative, the behavior will cease because the reward is no longer present. The rate at which a response ceases depends on the individual's prior history of reinforcement. When extinction begins, people usually give up a pattern of behavior much faster if the behavior had been rewarded all the time (continuous reinforcement) in the past instead of rewarded only part of the time (intermittent reinforcement).

The question becomes: once a conditioned response has been extinguished, can it return? Pavlov discovered during his experiments with dogs that a conditioned behavior that had stopped being reinforced and had ceased could be engaged again when the conditioned stimulus was presented anew. This effect is known as spontaneous recovery, the reemergence of an extinguished conditioned response after a period of rest. An extinguished response can also be reconditioned more rapidly than it was originally learned, a phenomenon called savings, which implies that some of the learning is retained from previous conditioning.

Behavior modification practitioners often advocate extinction as an alternative to punishment. The danger in using extinction, however, is that it may produce frustration in the respondent and, as a result, temporarily increase the behavior that is supposed to stop. The potential benefit of using extinction is that once the old behavior is no longer reinforced, the person looks for new behaviors to try to restore the reward. These behaviors are more likely to be in line with the desired behavior, and if reinforced, they should repeat.

Examples Two businessmen, Tom and Joe, strike up a friendly rapport through their frequent business-related phone calls. One day Joe calls Tom to discuss a new product he has just received, but gets voice mail. Joe leaves a message asking for a return call. When no return call comes, he makes several more calls to Tom over the course of the next few weeks. None of these are returned. Because the reinforcement of a return phone call from Tom no longer exists, Joe eventually stops calling Tom, no longer expecting to hear from him.

If a person's car has always started on the first try (continuous reinforcement), that person is more likely to give up and call a garage if one morning the car doesn't start. In another case, the person used to trying a dozen times or more to start the car is less likely to give up quickly on a given day when he or she is unable to start the car. This latter person has been rewarded intermittently, which makes extinction of a particular behavior more difficult.
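
The difference between the two drivers above is the partial-reinforcement extinction effect, which can be sketched as follows. In this illustrative Python fragment (the decay rates are invented), a behavior with a continuous-reinforcement history loses strength quickly once reinforcement stops, while an intermittently reinforced behavior persists far longer.

    # Illustrative sketch of extinction after continuous versus
    # intermittent reinforcement histories. Decay rates are invented:
    # a continuous history makes the missing reward obvious, so the
    # response decays quickly; an intermittent history does not.

    def trials_to_extinction(strength, decay, threshold=0.05):
        trials = 0
        while strength > threshold:
            strength *= (1.0 - decay)   # no reinforcer follows the response
            trials += 1
        return trials

    print("continuous history:  ", trials_to_extinction(0.9, decay=0.30), "trials")
    print("intermittent history:", trials_to_extinction(0.9, decay=0.05), "trials")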

Shaping

Shaping is a technique that is used in behaviorism to train an organism to perform a behavior that is completely new. Shaping teaches a complex behavior by rewarding or reinforcing each step of the learning process rather than the final outcome.

Explanation Shaping works on the principle that a little can eventually go a long way. The final goal, or target response, is initially beyond the realistic reach of the organism because the behavior is not yet in its behavioral repertoire; it is completely new. Shaping breaks the learning process down into smaller pieces. Skinner used incremental stages to reinforce the desired behavior. At first, actions even remotely resembling the target behavior, which he termed successive approximations, are followed by a reward. Gradually, closer and closer approximations of the final target behavior are required before the reward is given. Shaping, then, helps the organism acquire or construct new and more complex forms of behavior from simpler behavior. By the time shaping is complete, the reinforcement need only be given at the completion of the desired behavior in order for the behavior to recur.

Examples Textbooks for students are often written using the concept of shaping. Typically, information is presented so that new material builds on previously learned concepts or skills. If this were not the progression, most students would become confused and perhaps abandon the attempt to learn the concepts under study.

Teachers are continually in a position to shape the behavior of their students. An art student begins a series of drawings while the teacher assesses the student's various skills. The teacher gives positive feedback for the areas in which the student performs well and looks for ways to reinforce small steps (successive approximations) toward the desired outcome. During the shaping process, praise (the reinforcer) is given for the skills the student can already perform. The teacher may compliment the student on his shading technique and at the same time suggest that he extend his shading to another portion of the drawing, perhaps to work on perspective. This allows the shaping to occur in increments while being positively reinforced throughout the learning process.
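
A shaping procedure can be sketched as a loop that reinforces any response meeting a criterion and then raises the criterion. The Python fragment below is illustrative only: the response scale, criterion steps, and learning rate are all invented, not drawn from any published shaping study.

    import random

    # Illustrative sketch of shaping by successive approximations.
    # Responses that meet the current criterion are reinforced; each
    # reinforcement nudges typical behavior toward the final target
    # and raises the criterion. All numbers are invented.

    target = 100.0            # the final target response
    criterion = 10.0          # an easily reached first approximation
    typical_response = 5.0    # where the subject's behavior starts

    for _ in range(500):
        response = random.gauss(typical_response, 5.0)
        if response >= criterion:                     # close enough: reinforce
            typical_response += 0.3 * (response - typical_response)
            criterion = min(target, criterion + 2.0)  # demand a closer match

    print(f"typical response after shaping: {typical_response:.1f} (target {target})")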

Chaining

Chaining refers to a type of conditioning that is similar to shaping but requires a more complex sequence of behaviors. This process is referred to as chaining because each response is like a link in the chain. The reward is presented after the entire sequence of behaviors is completed, thus reinforcing the sequence and not the individual behavior.

Explanation Chains can be trained in the forward direction, that is, by practicing the first response in the chain and then successively adding the next elements. A chain can also be learned backward, beginning with the last element and working toward the front. Sometimes the entire chain is learned simultaneously. Training that starts at one end or the other tends to place greater emphasis, for the overall mastery of the chain, on the skills or knowledge at that end. Training that attempts to learn the entire chain simultaneously leads to more total errors but affords more practice on all links of the chain.

Examples Backward chaining is often used with pilot trainees in flight simulators. They practice landings first, followed by landing approaches, and then other flight-specific behaviors such as midair maneuvers. The rationale for backward chaining is that landings are the most difficult behavior in the chain to master; by starting there, that behavior receives the most practice as the other behavioral links are put in place. Forward chaining might be used by physical therapists to teach disabled individuals to transfer themselves from a wheelchair to another chair or a bed. A forward chain is often preferable when the skills learned in the first link are needed to build successively from that point.
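
The practice asymmetry that makes backward chaining attractive is easy to demonstrate. In the illustrative Python sketch below (the flight-step names and run counts are invented), each training stage rehearses every link from the current starting point to the end, so the final, hardest link accumulates the most practice.

    # Illustrative sketch of backward chaining: train the last link first,
    # then add each earlier link in front of it. Names and counts invented.

    chain = ["approach", "flare", "touchdown"]      # links in performance order
    practice_counts = {step: 0 for step in chain}

    for start in range(len(chain) - 1, -1, -1):     # last link trained first
        for _ in range(5):                          # five runs per stage
            for step in chain[start:]:
                practice_counts[step] += 1          # each stage rehearses every
                                                    # link from 'start' onward
    for step in chain:
        print(f"{step:10s} practiced {practice_counts[step]} times")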

Discrimination and generalization

People and animals learn to pay attention to cues in the environment that reliably signal certain consequences for their actions. Learning to distinguish one stimulus from another is called stimulus-control training. For instance, it does not take a child very long to learn that a red light at an intersection means stop and a green light means go. In stimulus-control training, a behavior is reinforced in the presence of a specific stimulus but not in its absence.

Stimulus generalization happens when an organism learns a response to one stimulus and applies it to another similar stimulus. Even though the stimuli may be different, the familiarity that accompanies the initial learning can be applied to other stimuli as well.

Explanation A discriminative stimulus signals the likelihood that reinforcement will follow a particular response. Some discriminations are relatively easy, while others are extremely complex. For instance, it is easy to distinguish between the facial features of two people who resemble one another if the observer looks carefully. It is far more difficult, however, for that same observer to discriminate when a facial expression is communicating friendliness versus love; certain cues must be present or absent for the observer to draw a convincing conclusion. Perception plays a large part in a person's ability to discriminate one stimulus from another, and the ability to discriminate effectively plays a considerable role in human behavior.

Stimulus generalization enables organisms to take previous learning and apply it to new, but similar, situations. The ability to draw on previous learning keeps the organism from having to start the learning process over. Generalization is less effective, however, when the new stimulus contains elements unrelated to the familiar one. For example, when a person learns to drive a car, this training can be generalized to driving most other cars. If the initial training was on an automatic transmission, though, and the driver must now operate a manual transmission, generalization of the prior skills is limited. The driver must then use stimulus discrimination to distinguish the familiar from the new and make the adjustments needed to master the new learning.

Examples Children, for example, may learn that when their father whistles, he is in a good mood and therefore he is more likely to respond favorably if asked for money or permission to do something fun with friends. The children learn to discriminate the good mood from the bad mood by the presence of the cue (whistling). Over time the children learn to make requests only in the presence of the signal for a good mood.

If a person has learned that being polite in certain situations produces the reinforcement of getting what he or she wants, that person is likely to generalize the polite response to other situations.
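Discrimination training itself has a simple computational shape: the same response is reinforced under one stimulus (often written S+) and never under another (S-). The sketch below is an illustrative simulation, not a model from Skinner's work; the learning rate, trial count, and update rule are assumptions chosen to make the effect visible.

    import random

    def discrimination_training(trials=2000, learning_rate=0.05, seed=1):
        """Reinforce responding only in the presence of S+.

        Each reinforced response under S+ nudges the probability of
        responding toward 1.0; each unreinforced response under S-
        nudges it toward 0.0 (extinction).
        """
        rng = random.Random(seed)
        p_respond = {"S+": 0.5, "S-": 0.5}   # start indifferent to the cue
        for _ in range(trials):
            stimulus = rng.choice(["S+", "S-"])
            if rng.random() < p_respond[stimulus]:         # the organism responds
                outcome = 1.0 if stimulus == "S+" else 0.0  # reward only under S+
                p_respond[stimulus] += learning_rate * (outcome - p_respond[stimulus])
        return p_respond

    print(discrimination_training())
    # responding becomes likely under S+ and rare under S-

One property of the sketch mirrors real extinction: as responding under S- becomes rare, opportunities to weaken it further also become rare, so the last traces of the response are extinguished only slowly.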

Reinforcement schedules

In his early research, Skinner discovered that reinforcement need not be given for each response but could instead be given after some number of responses, according to various schedules of reinforcement. A schedule of reinforcement refers to the specific relationship between the number, timing, or frequency of responses and the delivery of the reward. In other words, once a behavior has been shaped, it can be maintained by various patterns of reinforcement. Depending on the particular schedule, the reward may follow the response immediately or after varying degrees of delay.

Schedules are among the most powerful determinants of behavior. All reinforcers and punishers are embedded in one schedule or another and each schedule has its own characteristic effects on behavior.

Reinforcement can be given for each occurrence of the response or only for some of the responses. The two broad categories of schedules are continuous and partial (also called intermittent) reinforcements. With continuous reinforcement, each response of a particular type is reinforced. In a partial reinforcement schedule, only a portion of the responses are reinforced.

Explanation When attempting to instill a particular behavior, a continuous schedule of reward generally produces more rapid conditioning than a partial-reinforcement schedule. Though a continuous schedule conditions more rapidly, partial schedules are often more powerful in sustaining the behavior, depending on the pattern of reward. Extinction also tends to occur more quickly when a behavior that has received continuous reinforcement is no longer reinforced.

There are again two broad types of partial-reinforcement schedules: interval schedules, which are based on the passage of time; and ratio schedules, which are based on the number of responses. On an interval schedule, the first response made after an interval of time has passed is reinforced. Responses made before that interval of time are not reinforced. There are two types of interval schedules: fixed and variable. In a ratio schedule, time is not a factor. Instead, reinforcement is given only after a certain number of responses. Ratio schedules also have two types: fixed and variable.
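The four partial schedules differ only in the rule that decides when a response earns reinforcement, and that rule is compact enough to sketch in code. The following Python sketch is illustrative only; the schedule parameters and the assumption of one response per time unit are made up for the example, not drawn from Skinner's experiments.

    import random

    def fixed_ratio(n):
        """Reinforce every n-th response, regardless of timing."""
        count = 0
        def check(t):
            nonlocal count
            count += 1
            if count == n:
                count = 0
                return True
            return False
        return check

    def variable_ratio(mean_n, rng):
        """Reinforce after a number of responses that varies around mean_n."""
        count, required = 0, rng.randint(1, 2 * mean_n - 1)
        def check(t):
            nonlocal count, required
            count += 1
            if count >= required:
                count = 0
                required = rng.randint(1, 2 * mean_n - 1)
                return True
            return False
        return check

    def fixed_interval(period):
        """Reinforce the first response made after `period` time units elapse."""
        next_time = period
        def check(t):
            nonlocal next_time
            if t >= next_time:
                next_time = t + period
                return True
            return False
        return check

    def variable_interval(mean_period, rng):
        """Reinforce the first response after an interval varying around mean_period."""
        next_time = rng.uniform(0, 2 * mean_period)
        def check(t):
            nonlocal next_time
            if t >= next_time:
                next_time = t + rng.uniform(0, 2 * mean_period)
                return True
            return False
        return check

    rng = random.Random(0)
    schedules = {
        "fixed-ratio 5": fixed_ratio(5),
        "variable-ratio 5": variable_ratio(5, rng),
        "fixed-interval 10": fixed_interval(10.0),
        "variable-interval 10": variable_interval(10.0, rng),
    }
    for name, check in schedules.items():
        rewards = sum(check(float(t)) for t in range(1, 61))  # one response per time unit
        print(f"{name}: {rewards} reinforcements for 60 responses")

Comparing the printed counts makes the ratio/interval distinction concrete: on ratio schedules the rate of reinforcement rises with the rate of responding, which is why they sustain such high response rates, while on interval schedules responding faster earns nothing extra.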

Examples A fixed-interval schedule applies the reinforcer after a specific amount of time. An example might be an employee who gets a raise at the end of each year but no increase in pay during the course of the year. The reinforcer (the increase in pay) comes only at a predetermined time, regardless of the employee's work performance during the year. Fixed-interval schedules have a built-in problem that manifests in certain situations: because the person knows when the reinforcement is to come, performance tends to drop immediately after the reinforcement and rise just before the next one is due. In this example, the employee might improve his performance near the end of the year to "look good" when it comes time for the review that determines the amount of the raise.

Reinforcement is also controlled mainly by the passage of time in a variable-interval schedule. In contrast to the fixed-interval schedule, in which the person knows when the reinforcement will be given, in a variable-interval schedule the person does not know when reinforcement will appear. An example of this schedule might be a supervisor who checks an employee's work at irregular intervals. Because the employees never know when such checks will occur, they must perform consistently in order to obtain positive outcomes, such as praise, or avoid negative ones, such as criticism or the loss of a job. The advantage of variable-interval schedules is that they often eliminate the inconsistencies of performance associated with fixed intervals. For this reason, variable schedules are usually considered more powerful and produce more consistent behavior than fixed-interval schedules.

Another modern example of variable-interval scheduling is the use of random drug testing. Athletes are routinely tested, as are people whose impaired performance could endanger the lives of others, such as airline pilots, security personnel, and healthcare workers. Because participants cannot predict the day the next test will be given, they are more likely to refrain from using drugs.

Reinforcement is determined in a very different manner on a fixed-ratio schedule. Here, reinforcement occurs only after a fixed number of responses. For example, some individuals are paid on the basis of how many pieces of goods they produce. A factory worker who drills a series of holes for a particular product is paid a certain price for every unit completed. Or consider the person who collects recyclable aluminum cans or scrap metal and is paid by the pound. Generally, a fixed-ratio schedule yields a high rate of response, though there is a tendency for brief pauses immediately after reinforcement. In the examples above, the reinforcement is the fixed amount of pay obtained per unit of goods produced.

On a variable-ratio schedule, reinforcement occurs after completion of a variable number of responses. Since people responding on a variable-ratio schedule cannot predict how many responses are required before reinforcement will occur, they usually respond at high and steady rates. Perhaps one of the best examples of the variable-ratio schedule is found in casinos across the country. The person who repeatedly plays a slot machine knows that at some point the machine will pay off but cannot be sure when that will occur. The anticipation that it could happen on the next pull compels many to keep playing beyond the point of good reason.

Variable-ratio schedules also result in behaviors that are highly resistant to extinction. This means that even in the absence of reinforcement, the behavior might persist. In fact, resistance to extinction is much higher after exposure to a variable-ratio schedule than to a continuous-reinforcement schedule. This would help explain why gambling can be so addictive for certain individuals.

HISTORICAL CONTEXT

Skinner's theory of operant conditioning did not spring from his mind alone. Several theorists were profoundly influential in laying a foundation for the work Skinner was to build on.

All behavioral theories owe some debt of gratitude to Ivan Pavlov for developing the principles of classical conditioning. Pavlov, who won the Nobel Prize in 1904 for his work on digestion, was best known for his experiments on basic learning processes. While studying the secretion of stomach acids and salivation in dogs in response to eating various amounts of food, he discovered that the mere sight of the person who normally fed the dogs could elicit anticipation of food in the animals. In other words, the dogs were not merely responding to the biological need to eat; learning was taking place in the feeding process. A neutral stimulus such as the experimenter's footsteps, when paired with food, could bring about much the same response as the food alone. This type of learning Pavlov called classical conditioning.

The basic process of classical conditioning can be described in several steps. It first requires a neutral stimulus, one that elicits no specific response in the participant prior to the experiment. In Pavlov's classic experiment, the neutral stimulus was the sound of a bell. Ringing the bell before the experiment did not elicit salivation in a dog. The second component is the unconditioned stimulus, which in this experiment was meat. At the mere sight of meat the dog would salivate. It is called the unconditioned stimulus because the dog salivates instinctively and needs no training for this response; the salivation itself is therefore an unconditioned response. During the conditioning process, the bell is routinely rung just before the presentation of the meat. Over time, the ringing of the bell alone will bring about salivation. Conditioning is complete when the previously neutral stimulus of the bell (now the conditioned stimulus) is able by itself to elicit salivation (now the conditioned response).

Although the initial conditioning experiments performed by Pavlov and others were conducted on animals, classical conditioning principles were soon being used in various ways to explain everyday human behavior. Pavlov's conditioning techniques provided psychology with behavioral ways in which complex behavior could be better understood and built upon by other theorists.

At approximately the same time that Pavlov was experimenting with animals and developing his classical conditioning theory, Edward Thorndike was conducting groundbreaking experiments of his own. Thorndike was one of the most influential theorists of the early twentieth century and a foundational researcher in animal learning. Thorndike believed that psychology must study behavior, not mental elements or conscious experiences, and he thus reinforced the trend toward greater objectivity within the emerging field of psychology.

One of Thorndike's major contributions to the study of psychology was his work with animals. For this long, extensive research he constructed devices called "puzzle boxes." These were essentially wooden crates that required the manipulation of various combinations of latches, levers, and strings to open. A cat would be put in one of these puzzle boxes and would eventually manage to escape from it by trial and error. On successive attempts, the amount of time it took the cat to escape decreased. Thorndike compared the results of several cats and found a similar pattern: if he rewarded the behavior of the cat, the behavior was repeated; if he did not, it ceased. He surmised that certain stimuli and responses become connected to or dissociated from each other in the process of learning. This learning principle he termed the law of effect.

This evaluation led Thorndike to conclude that animals learn by trial and error, or reward and punishment. Thorndike used the cat's behavior in the puzzle box to generalize about what happens when any being learns anything. All learning involves the formation of connections, and connections are strengthened according to the law of effect. Intelligence is the ability to form connections, and humans are the most evolved animal because they form more connections than any other being. He continued his study of learning in his famous book Animal Intelligence, in which he argued that we study animal behavior, not animal consciousness, for the ultimate purpose of controlling behavior.

A subtle but important distinction should be made between trial and error learning (instrumental learning) and classical conditioning. In classical conditioning, a neutral stimulus becomes associated with part of a reflex, either the unconditioned stimulus or the unconditioned response. In trial and error learning, no reflex is involved. A reinforcing or punishing event, which is also a type of stimulus, alters the strength of the association between a neutral stimulus and an arbitrary response.

Thorndike's early research served as the foundation for Skinner's work that was beginning in the latter years of Thorndike's career. Whereas Thorndike's goal was to get his cats to learn to obtain food by leaving the box, animals in Skinner's box learned to obtain food by operating on their environment within the box. Skinner became interested in specifying how behavior varied as a result of alterations in the environment.

One of the biggest influences on Skinner's ideas came from the work of John B. Watson, often referred to as the "father of behaviorism." Watson carried the torch of the behaviorist position, claiming that human behavior could be explained entirely in terms of reflexes, stimulus-response associations, and the effects of reinforcers. His 1914 book entitled Behavior: An Introduction to Comparative Psychology became the official statement of his theory and was widely read at the time.

Watson's lab work with rats enabled him to discover that he could train rats to open a puzzle box like Thorndike's for a small food reward. He also studied maze learning but simplified the task dramatically. One type of maze he used was a long straight alley with food at the end. Watson found that once the animal was well trained at running this maze, it did so almost automatically. Once started by the stimulus of the maze, its behavior became a series of associations between movements, rather than responses to stimuli in the outside world. The development of other well-controlled behavioral techniques also allowed Watson to explore animal sensory abilities.

Watson's theoretical position was even more extreme than Thorndike's. He had no place for mentalistic concepts like pleasure or distress in his explanations of behavior. He essentially rejected the law of effect proposed by Thorndike, denying that pleasure or discomfort caused stimulus-response associations to be learned. For Watson, all that mattered was the frequency of occurrence of stimulus-response pairings. Reinforcers might cause some responses to occur more often in the presence of particular stimuli, but they did not act directly to cause their learning.

After Watson published his second book, Psychology from the Standpoint of a Behaviorist, in 1919, he was established as the founder of the American school of behaviorism. In this book he addressed a number of practical human problems, such as education, the development of emotional reactions, and the effects of factors such as alcohol or drugs on human performance. Watson believed that mental illness was the result of "habit distortion," which might be caused by the fortuitous learning of inappropriate associations. These associations then go on to influence a person's behavior so that it becomes ever more abnormal.

Watson became a very controversial figure in psychology for several reasons. He was credited with wedding behavioral techniques to celebrity endorsements of products and services in order to manipulate motives and emotions, a strategy now widely used in marketing and advertising but not well received by many people during the 1920s. In a larger sense, Watson was a pivotal figure in shaping public perception away from the then-dominant view of psychoanalysis and the internal processes of behavior. His call was for a society based on scientifically shaped and controlled behavior, and his ideas offered hope to those disenchanted with old ideas.

Skinner probably first read some of Watson's work in the summer of 1926, when he was 22 years old, but it was not until the spring of 1928 that Skinner took Watson's writings more seriously. Years later, when Skinner had established himself as an independent thinker and writer on radical behaviorism, he said that Watson had brought the "promise of a behavioral science," which was not the same thing as delivering the science itself. But Skinner agreed with Watson in denying that behavior is determined by processes within the physiology of the organism.

By the 1920s, the field of psychology had already captured the public's attention. Given Watson's charisma, personal charm, persuasiveness, and message of hope, Americans were enthralled by what one writer called an "outbreak" of psychology. Much of the public was convinced that psychology provided a path to health, happiness, and prosperity. Psychological advice columns sprouted up in the pages of the daily newspapers. Watson's behaviorism was the first stage in the evolution of the behavioral school of thought. The second stage, sometimes referred to as neobehaviorism, can be dated from about 1930 to about 1960 and includes the work of Edward Tolman, Clark Hull, and B. F. Skinner.

Edward Tolman was one of the early converts to behaviorism and like Watson, rejected the notion of introspection and inner processes for determining behavior. He was firmly committed to working only with those behaviors that were objective and accessible to observation. Tolman is recognized as a forerunner of contemporary cognitive psychology, and his work had a great impact, especially his research on problems of learning. Some of his core principles were later used by Skinner and other behaviorists.

Clark Hull and his followers dominated American psychology from the 1940s until the 1960s. Hull had a proficient command of mathematics and formal logic and applied this knowledge to psychological theory in a way that no one had before. Hull's form of behaviorism was more sophisticated and complex than Watson's. Hull described his behaviorism and his image of human nature in mechanistic terms and regarded human behavior as automatic. He thought behaviorists should regard their subjects as machines and believed machines would one day replicate many human cognitive functions. As might be guessed, Hull drew much criticism for this hard-line mechanistic view of human processes, but his influence on psychology at the time was substantial.

Beginning in the 1950s, Skinner became the major figure in American behavioral psychology. He attracted a large, loyal, and enthusiastic group of followers, and his influence extended far beyond the professional community of psychologists at work in laboratories. His two most widely read books, Walden Two and Beyond Freedom and Dignity, thrust him into popular culture, but it was the modern medium of television, newly arrived in the early 1950s, that made him a household name. He regularly appeared on television talk shows to advance his views on operant conditioning and how it applied to everyday life. In a short period of time he became a celebrity and arguably the best-known psychologist of his era.

Skinner's system of psychology reflects his early life experiences. According to his view, life is a product of past reinforcements, and he claimed that his own life was just as predetermined and orderly as his system dictated all human lives should be. He believed his experiences could be traced solely and directly to stimuli in his environment. Raised by a mother who was rigid in her discipline and a father who tended to be verbally critical, Skinner received little praise in his home life. Perhaps there is a connection between this history and the centrality of reinforcement in his theory: operant conditioning holds that for a behavior to be repeated, it must be reinforced. Some scholars have suggested that this central theme was a response to his own desire for more praise and encouragement from his parents.

In 1938 Skinner published what was arguably the most influential work on animal behavior of the century, entitled The Behavior of Organisms. Skinner resurrected the law of effect in more starkly behavioral terms and provided a technology that allowed sequences of behavior produced over a long time to be studied objectively. His invention of the Skinner box was a great improvement on the individual learning trials of both Watson and Thorndike. Skinner's theory would eventually become known as operant conditioning and would become one of the most enduring theories of the twentieth century.

CRITICAL RESPONSE

Skinner has aroused more than his share of controversy. Those who are familiar with Skinner's ideas tend to have a strong positive or negative reaction depending upon their own presuppositions about human nature. The most common critical responses follow.

Free will and personal responsibility

Skinner's operant conditioning opposes the concepts of free choice and personal responsibility. He maintained that it is the environment that determines what a person was, is, and will be. He accounted for genetic inheritance by referring to the environments that existed during evolutionary history. In short, he claimed that environmental factors determine behavior in such a way that free will and individual choice play no causal role.

According to Skinner, each person is unique, but not because of choices the individual makes. Rather, personality arises from genetic makeup and the different experiences each person is exposed to during their lives. In addition, individuals remain under the influence of their environment throughout the lifespan, regardless of the degree of learning that has preceded.

There is agreement among many of Skinner's critics that environmental factors are important. The extreme position that Skinner takes (that environment alone shapes behavior) causes much controversy. His view of environmental reinforcement as the basis of behavior violates what most people believe regarding the presupposition of "freedom of choice" and personal accountability for one's actions. Though many critics would agree that the environment is a shaping entity for some human behavior, only a minority are willing to agree that the totality of human behavior can be explained in operant terms.

A question naturally arises in regard to control: If humans are controlled by their environments and have no free choice, how must people go about the process of "deciding" to follow the principles of operant conditioning? It appears a contradiction on the one hand to say we have no free-will choice, and on the other hand to imply that choices must be made to reinforce certain behaviors.

Generalizing findings to human behavior

Skinner conducted nearly all of his experiments with laboratory animals, most of which were rats and pigeons. Although there have been a number of successful applications of Skinner's concepts with humans, criticism has been leveled over how much of the results of Skinner's experiments can actually be generalized to human beings. The criticism basically states that humans are far more complex and advanced than the animals used in the operant experiments, so how can Skinner so confidently generalize the outcomes to humans? In using animals as substitutes for humans in the exploration of human behavior, Skinner was making the huge assumption that general laws relating to the behavior of animals can be applied to describe the complex relations in the human world.

Could it be possible, critics add, that even though the behavior of both rats and humans tends to increase in frequency when certain consequences occur, humans still have higher and perhaps different cognitive processes? Perhaps humans can assess what is going on through rational thought and then decide which behavior to perform in order to be reinforced. Rats, in contrast, may process the information more mechanically, with no conscious or rational self-determination. The final answer to these questions is not yet available. Psychologists who hold this view believe more experiments with human participants must be done to test the validity of the theory.

Operant conditioning is overly simplistic

Skinner's concepts of operant conditioning have often been interpreted as simplistic because he either ignores or negates the richness of life. The assumption is often made that Skinner does not deal with human emotions and thoughts and has virtually nothing to say about the complex behaviors displayed in creative activities. In general, Skinner seems to ignore the realm deemed "creative," including the imagination, because it is not easily open to direct observation and presents difficulties on the experimental level.

Skinner saw the creation of a poem, for instance, as analogous to having a baby or to a hen laying an egg. He believed that no creative act is autonomous. The person who writes the poem has a particular background and is living under certain conditions that reinforce one's view of the world; the creation of the poem is therefore merely a function of how the environment has treated that person, rather than some uncaused event that sprang from nowhere. The criticisms of Skinner on this point have more to do with his mechanistic view of human nature than with his conclusions about creativity. It follows logically that if a human being is nothing more than a machine of sorts, then there is no need for an inner life of which imagination and creativity are parts. These aspects of life bring a multidimensional enjoyment that many people cannot reconcile with operant-conditioning principles.

Development of human language

Although Skinner's ideas on operant conditioning can explain phobias and neuroses, a number of critics find their applicability to the more complex human behaviors of language and memory sadly lacking. The argument centers on the idea that some portion of language acquisition in young children must be inherited. Infants do not learn language on a word-by-word basis; instead, over time they learn the grammatical rules necessary to produce sentences.

Skinner's inability to explain the language phenomenon in a satisfactory way has caused a number of critics to dismiss the theory altogether. While observable, objective stimuli for verbal responses are relatively clear-cut, private stimuli or concepts such as "I'm hungry" are harder to explain. According to Skinner, the acquisition of verbal responses for private stimuli can be explained in four ways. First, he claims that private stimuli and the verbal community do not need a direct connection: as long as there are some public stimuli that can be associated with the private stimuli, a child can learn. Second, the public can deduce private stimuli through nonverbal signs, such as groaning and facial expressions; critics reply that nonverbal signs associating public and private events can easily be misinterpreted. Third, certain public and private stimuli are identical, so no interpretation is needed. Finally, he says that private stimuli can be generalized from public stimuli with coinciding characteristics. Although Skinner attempted to respond to ongoing criticism of these claims during his lifetime, his arguments were considered by many to be weak and largely unproven.

Misuse of reinforcement

A number of criticisms have arisen relating to operant conditioning and the use or misuse of reinforcement. One objection states that the use of reinforcement, as outlined by Skinner's theory, is manipulative. Granting and withholding reward is a form of control. The concept of control is central to Skinner's thinking, however, and appears repeatedly in his writing. When he uses the term, he claims that individuals are controlled by environmental forces, which include the actions and behaviors each person displays to others. He would simply say that these forms of control are necessary ways of interaction or operating in the world. For a culture of freedom-loving, self-directed people, however, the concept of being controlled by forces beyond voluntary choice is not popular.

Another criticism of reinforcement argues that certain behaviors should be performed by individuals in society regardless of the rewards or reinforcements that are associated. Appropriate behavior, such as responsible parenting, civil duties, altruistic help, and many others, should be expected as the norm for community behavior and not depend on bribery or the enticement of a reward. Skinner would respond by saying that reinforcements or rewards are always being used in daily living, whether individuals are consciously aware of them or not. Even if explicit rewards are not given, internal reinforcement may be present. Self-praise, or feelings of self-esteem from doing well at a chosen task, could provide a form of reinforcement.

A third objection is that reinforcement undermines intrinsic motivation, the internally generated desire to perform a given behavior for its own sake. With intrinsic motivation, the incentive to perform comes from the activity itself; with extrinsic motivation, the drive to perform stems from the rewards attached to the task. In recent years, a growing number of researchers have argued that external rewards can be counterproductive in fostering intrinsic motivation, especially in children.

Finally, some critics challenge the effectiveness of reinforcement, saying that reinforcement often produces short-term changes, which disappear when the reinforcement ceases or becomes infrequent. True learning, according to many learning specialists, is supposed to produce relatively permanent changes in behavior.

Antitheoretical contradiction

Skinner made confident assertions about economic, social, political, and religious issues that derived from his system. In 1986, he wrote an article with the all-embracing title "What Is Wrong with Life in the Western World?" He stated that "human behavior in the West has grown weak, but it can be strengthened through the application of principles derived from an experimental analysis of behavior." This willingness to draw broad conclusions from the data, particularly as it pertained to solutions to complex human problems, is inconsistent with Skinner's antitheoretical stance. In other words, Skinner went beyond the central premise of his theory, which was that only observable behavior was important, and presented a theoretical blueprint for the redesign of society.

Although Skinner suggests that his behavioral principles can be applied at the societal level, he appears to sidestep the issue of who will put these principles into effect. Who will exert the power to set up certain reinforcement contingencies, and who will decide which behaviors are to be reinforced? He has addressed some of these issues, but more in a philosophical manner and not in practical or concrete terms.

Limitations of applications

Despite the fact that Skinner's principles have been used quite effectively in therapeutic, educational, and business settings, there are still shortcomings, even in environments that are carefully controlled. Behavioral management has not always been as effective as some claim. When neurotic behaviors improve but not to the point that the person can function normally, or when a child learns more using behavioral principles but still cannot master certain concepts, is the fault in the individual's limitations, or is it simply that not all behavior is subject to control through reinforcement? Skinner would say that all behavior can be shaped given the appropriate reinforcements, but this claim is seriously questioned in real-life situations in which some variables seem to lie outside the realm of what can be controlled.

Punishment

Skinner's position on punishment is another point that has been commonly criticized. He asserted that punishment has detrimental effects and that it does not permanently eliminate unwanted behaviors. Although these views might be interpreted as being sensitive to the organism's aversion to harsh treatment, the conclusions are questionable from a scientific perspective. Studies have shown that under certain conditions punishment does seem to be effective in controlling behavior and does not seem to have long-lasting negative effects. Punishments sometimes curtail undesirable behaviors so that alternative, desirable behaviors can be shaped with positive reinforcers. Of course, unless such alternatives are available, applying punishment is not likely to produce the desired outcomes. The point here is not that punishment is more desirable than positive reinforcement as a general technique of control, but rather that Skinner perhaps neglected to give punishment a viable place in shaping behavior.

Instinctual vs. learned behavior

Skinner's view was that all behavior was learned through the process of reinforcement, whether positive or negative. Yet research completed by Keller and Marian Breland in the early 1960s found that pigs, chickens, hamsters, porpoises, whales, cows, and other animals all demonstrated a tendency toward "instinctive drift." The animals were quickly conditioned to perform a number of tasks, but unwanted instinctive behaviors then began to intrude on the learned routines: the animals tended to substitute instinctive behaviors for the behaviors that had been reinforced, even when the instinctive behaviors interfered with obtaining food. The conclusion is that the animals were reverting to innate behaviors that took precedence over the learned ones, even though this delayed the food that was supposedly reinforcing the conditioned behavior. Clearly, in these cases, reinforcement was not as powerful an incentive as Skinner claimed.

Limitations of behavioral therapy

Behavioral therapy is a natural extension and application of many of Skinner's views focusing on observable behavior. The first criticism pertains to the lack of attention that behavior therapy gives to emotion. Behavioral practitioners hold that empirical evidence has not shown that feelings must be changed first in order to achieve measurable progress. In general, behavioral practitioners do not encourage their clients to experience their emotion, although some will work with aspects of emotion. Critics argue that emotions play a significant part in behavioral responses and should not be ignored. The strict emphasis on overt behavior to the exclusion of an inner life was a core concept that Skinner held throughout his career.

Hence, if there is no inner life, or at least none worth attending to, it would follow that insight into one's motives or the origins of one's behavioral responses would be of little value. This criticism states that behavior therapy ignores the importance of the self, or self-consciousness, in favor of overt behavioral responses. Skinner rejected the idea that internal agents such as an ego or self allow us to make independent and free choices, or that any true benefit comes from examining internal processes. This viewpoint, however, does not adequately take into account the reflective nature and imagination of the individual. A person cannot, critics suggest, simply turn off his or her ability to reflect on past events or on what propels them toward, or causes them to back away from, various choices.

Another criticism of behavior therapy is that it treats symptoms rather than causes. The psychoanalytic assumption is that early life events are the source of present difficulties. Behavior therapists may acknowledge the existence of past life events but do not place particular importance on those events in the maintenance of current problems. Instead, the behavioral practitioner emphasizes changing environmental circumstances and the ways those environmental forces reinforce particular behaviors. Critics respond that it is natural for humans to conceptualize cause-and-effect relationships in behavior, and that tracing present problems to their origins is itself an important part of the process of change.

A final therapeutic criticism of behavior therapy involves the use of control and manipulation by the therapist toward the client. The therapist assumes a position of power with the client where he or she, through the process of reinforcement, can potentially manipulate the client's behavior responses. This criticism is largely a misunderstanding of contemporary behavior therapy. If applied in a strictly Skinnerian model, the potential for manipulation would be greater. However, all therapeutic approaches give some degree of control to the therapist, who hopes to facilitate change in the person seeking help. Most modern behavior therapists are not attempting to control their clients or manipulate them. In fact, many use techniques aimed at increased self-direction and self-control.

Ambiguity about human aspects of his theory

Skinner admitted that the science of human behavior is not complete and needs further development. He also admitted that rats and pigeons are not perfect models of humans. It would seem, then, that what is needed are psychological principles to help bridge the gap from animal data to human functioning. Skinner believed, though, that his theories had already set forth the foundation of a science of human behavior adequate to that task.

THEORIES IN ACTION

Operant conditioning has become a very influential area of psychology, because it has successfully provided practical solutions to many problems in human behavior. Operant principles discovered in the laboratory are now being employed in a vast number of areas that include healthcare, education, mental health, prisons, and animal training, among many others.

Research

Applied behavior analysis Skinner was primarily concerned with understanding behavior and the process of learning. Although his experiments were largely with animals, he did generalize his findings to humans. Notwithstanding his deterministic principles, he acknowledged that people could determine the causes of most behavior by identifying the environmental conditions that support it, and could then manipulate those conditions to influence the behavior in desired directions. Skinner's views led to a distinct branch of psychology called applied behavior analysis. Research in this area is directed primarily toward solving problems of everyday life.

Applied behavior analysis is a research method that uses a four-step process:

  • define
  • observe
  • intervene
  • test

Since operant conditioning focuses on observable behavioral outcomes, the first step is to define the target behaviors that need to be changed. Doing so allows researchers to develop procedures to observe how often the behaviors occur under existing conditions. Once a stable baseline measure of the behavior is obtained, researchers intervene to change the target behavior in the desired direction. For example, they may begin to reward behaviors they wish to increase, or withhold rewards following inappropriate behaviors they wish to decrease. Finally, they test the impact of the intervention by continuing to observe and record the target behavior during the intervention and beyond. Testing allows researchers to see the effect of the intervention over time.
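Reduced to its skeleton, the observe/intervene/test sequence is a before-and-after frequency comparison. The short Python sketch below illustrates that skeleton with hypothetical daily counts; the behavior label and the numbers are invented for the example.

    from statistics import mean

    def test_intervention(baseline, intervention, label="target behavior"):
        """Compare how often a defined target behavior occurs per day
        before (baseline) and during an intervention."""
        before, during = mean(baseline), mean(intervention)
        change = (during - before) / before * 100
        print(f"{label}: baseline {before:.1f}/day, "
              f"intervention {during:.1f}/day ({change:+.0f}%)")

    # hypothetical daily counts of a behavior targeted for reduction
    test_intervention(baseline=[14, 12, 15, 13, 16],
                      intervention=[9, 7, 6, 5, 4],
                      label="classroom interruptions")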

Computer-assisted technology Operant conditioning has also been applied to the field of education. One of the most impressive operant-based techniques involves the use of computers in the classroom, often referred to as computer-assisted instruction, or CAI. In CAI, students interact with sophisticated computer programs that provide immediate reinforcement of correct responses. Within certain limits, these programs are paced according to each student's progress. CAI has since been extended to include lecture-based distance learning through the Internet, so that simultaneous learning can occur in virtually any geographic location over high-speed communication technology.

Biofeedback Another area where operant conditioning is being studied and applied is in the realm of biofeedback. This is a technique that enables people to monitor and alter bodily responses such as skin temperature, muscle tension, blood pressure, and electrical activity of the brain. For example, a rise in blood pressure or muscle tension is indicated by a signal such as a loud tone, which acts as the feedback stimulus. As one lowers the blood pressure or relieves the muscle tension, the tone becomes softer. Reinforcement can play several roles, from reward to incentive. In biofeedback, the information given by the changing tones helps the subject know how much the behavior has changed.

Biofeedback research has influenced basic and theoretical ideas about learning. Responses of the autonomic nervous system were once thought to be outside the realm of operant conditioning. Research has demonstrated, though, that instrumental training of autonomic responses is possible with this technique.
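The biofeedback loop itself (read a physiological signal, translate it into a feedback tone, let the softening tone mark progress) is simple enough to sketch. The Python sketch below is a toy illustration of that mapping; the tension scale, rest level, and session readings are invented for the example, not clinical values.

    def feedback_volume(tension, rest_level=2.0, max_level=10.0):
        """Map a physiological reading to tone volume (0.0-1.0):
        the higher the tension, the louder the tone. A softening
        tone tells the subject the response is moving the right way."""
        span = max_level - rest_level
        return max(0.0, min(1.0, (tension - rest_level) / span))

    # hypothetical session in which the subject gradually relaxes
    for minute, tension in enumerate([9.0, 7.5, 6.0, 4.5, 3.0], start=1):
        print(f"minute {minute}: tension {tension:.1f} -> "
              f"tone volume {feedback_volume(tension):.2f}")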

Behavior modification The most frequently cited examples of reinforcement can be found in the field of behavior modification. Behavior modification, also known as behavior mod, seeks to apply the principles of operant learning to changing behaviors in a variety of settings.

An application of behavior modification through secondary reinforcement has been used in institutions across the country and is known as a token economy. For example, the staff of a psychiatric hospital is faced with the problem of motivating residents to perform a number of daily living behaviors: dressing and basic grooming, among other simple tasks. The patients are given tokens for each desired behavior or set of behaviors that they complete. The tokens have no inherent worth but can be exchanged for candy, movie tickets, outdoor activities, or other privileges. In this way, the tokens are secondary reinforcers of the behavior. The token economy is one of the few behavior modification techniques that work well with large numbers of subjects at one time. It is based on the concept of positive reinforcement.
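Because a token economy is essentially a ledger of earned and spent secondary reinforcers, it can be sketched in a few lines. The earn rates and prices below are hypothetical, not drawn from any actual hospital program.

    class TokenEconomy:
        """Minimal token-economy ledger: tokens earned for target behaviors
        are secondary reinforcers, exchangeable for backup rewards."""

        def __init__(self, earn_rates, prices):
            self.earn_rates = earn_rates   # behavior -> tokens earned
            self.prices = prices           # reward -> tokens required
            self.balance = 0

        def record(self, behavior):
            """Credit tokens for a completed target behavior."""
            self.balance += self.earn_rates.get(behavior, 0)

        def exchange(self, reward):
            """Spend tokens on a backup reward if the balance allows."""
            cost = self.prices[reward]
            if self.balance >= cost:
                self.balance -= cost
                return True
            return False

    ward = TokenEconomy(earn_rates={"grooming": 2, "making bed": 1},
                        prices={"movie ticket": 5})
    for behavior in ["grooming", "making bed", "grooming"]:
        ward.record(behavior)
    print("movie ticket earned:", ward.exchange("movie ticket"))  # True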

Another prevalent use of behavior modification is in the school system. Teachers frequently structure reward-giving situations to help them accomplish their learning objectives for students. When teachers want to increase a behavior that occurs at a low rate, they often make a preferred, high-rate behavior contingent on it. For instance, a teacher might observe a child for some time to determine which behaviors occur at a high frequency and which occur at a low frequency. If the child engages in art at a high frequency but in math at a low frequency, the teacher will make art contingent upon completing the math work.

Learned helplessness Research on learned helplessness suggests that its onset stems partly from one's perception of control. When people believe they have no control over their environment, they stop trying to improve their situation. For example, children growing up in urban slums may perceive that they have little control over their environment and even less hope of escaping it. As a result, they may simply resign themselves to a lifetime of exclusion and hopelessness. Operant principles of reinforcement apply especially to individuals dealing with learned helplessness. Studies have shown that behavior is influenced not only by the level of rewards a person receives but also by the person's evaluation of those rewards. For a reward to be effective, it must match the perceptions of the individual and shape the behavior in a positive manner. For instance, if a teen has learned helplessness about his ability to develop a marketable skill and get a job, the reinforcement cannot be indiscriminate. It must specifically address the need for competence in a potential area of strength in order to reshape the learned helplessness effectively.

Case studies

A study that illustrates applied behavior analysis involved the task of trying to reduce the amount of graffiti on the walls of three public restrooms on a particular university campus. The spread of graffiti had forced the school to repaint these rooms repeatedly. The researchers began by objectively defining what constituted graffiti and what did not. They then made daily counts of the graffiti to determine the number of occurrences. Next the researchers introduced an intervention they felt might reduce the amount of graffiti: a sign taped to the bathroom wall that read, "A local licensed doctor has agreed to donate a set amount of money to the local chapter of the United Way for each day this wall remains free of any writing, drawings, or other markings. Your assistance is greatly appreciated in helping to support your United Way." The intervention was successful and kept the walls free of graffiti for the next three months.

Teaching machine In one significant way, Skinner was well ahead of his time in the development of his teaching machine. The idea began with a simple observation Skinner made while visiting his daughter's math class. He noticed that the teacher was not reinforcing the answers students provided, partly because she was unaware of the importance of such reinforcement and partly because she could not adequately reinforce so many students' responses simultaneously. This gave him the idea for an invention Skinner was to call the teaching machine.

Skinner's teaching machine presented problems to students in random order, with feedback (reinforcement) after each one. While it did not attempt to teach new behavior (and thus replace the human instructor), it was seen as an excellent tool for reviewing previous material and building on previously learned concepts. Within three years of his initial idea, Skinner had developed a complete program of instruction. For the next ten years Skinner worked intensively on the teaching machine, attempting to perfect it. Unfortunately, the technology he needed to realize his vision did not arrive until the latter years of his life. The advent of the personal computer would eventually allow many of Skinner's ideas about learning to be applied in the way he had envisioned them.

Behavior modification The techniques of operant conditioning were primarily used on animals in the early experiments. In 1953, though, Skinner and a colleague began experimenting with some of the principles of operant conditioning, now called behavior modification, at the Metropolitan State Hospital in Waltham, Massachusetts. The purpose of the studies was to determine how applicable operant conditioning techniques were in the experimental analysis of psychotic patients. Fifteen male patients were conditioned to pull levers for candy and cigarettes. Skinner and his colleague were able to demonstrate highly stable individual differences in overall rate of lever-pulling per hour in the subjects. For the first time, medically useful and objective measures of the psychoses were available.

The most intriguing aspect of the studies was the suggestion that psychotic behavior is controlled by reinforcing properties within the immediate environment. From this basic premise came a whole new understanding of the origin of deviant behavior. Previously, deviance had been understood in dynamic terms, as an internal state of mental illness. The behavior modification study suggested instead a learning model, in which symptoms might have been learned at some point in the person's past through accidental reinforcement.

Dolphin-human therapy Dolphin-human therapy was created and developed by David Nathanson, Ph.D., a psychologist with almost 30 years' experience working with disabled children. The therapy grew out of a series of carefully controlled language experiments using dolphins as teachers for children with Down syndrome. The key to learning for all people, but especially for children with mental disabilities, is to increase sensory attention (sight, sound, touch, taste, smell) so that more learning can occur. Most mentally handicapped children have difficulty paying attention to a stimulus, and as a result learning is impaired. The theory and research behind dolphin-assisted therapy hold that children or adults will increase attention as a result of the desire to interact with dolphins. By interacting with the dolphins and using a behavior modification procedure that rewards the child for correct cognitive, physical, or affective responses, the therapy incrementally teaches children skills they may not be able to learn in more conventional ways.

Organizational behavior management Behavior modification has also been used successfully in organizations to facilitate greater efficiency. One study sought to use behavior modification techniques with an existing hotel-cleaning staff to determine how to improve and maintain a standard of cleanliness with minimum turnover and cost. The study was initiated because of the very high turnover rate of the cleaning staff and the substandard work being carried out; to make matters worse for management, the hiring, training, outfitting, and maintenance of a housekeeping staff is one of the largest budget line items for most hotel and motel operations. During the assessment it was discovered that a standard of cleanliness had not been established, feedback to the cleaning staff was essentially nonexistent, and aversive managerial practices were common, leading to low morale. The application of behavior modification began with the establishment of clear, objectively measurable standards of cleanliness. Management then chose to reinforce the role of the cleaning staff positively, rewarding performance with merit-pay increases when the cleanliness standards were met. Regular feedback kept the staff informed of training and expectations. The program succeeded in improving worker morale, raising the level of cleanliness, and saving a substantial amount of money over the long term.

Relevance to modern readers

Skinner's work is considered by some to be the most important contribution to date on learning and the process of behavioral change. Though his work was largely with animals, his concepts have been among the most researched of all psychological theories. Because the emphasis is on observable behavior, new studies are continually being devised that build on the concepts Skinner established, expanding and adapting the original principles to various types of human behavior. In fact, operant-conditioning principles are apparent in virtually every sphere of modern life, most noticeably in variations of behavior modification. The modern reader may not immediately recognize the relationship between operant principles and everyday behavior, but once these patterns are identified, it is hard to overestimate the influence of Skinner's ideas on contemporary life. Following are some of the most obvious realms where operant-conditioning techniques are at work.

CHRONOLOGY

1904: B.F. Skinner born March 20.

1930: Initiates research in reflexes.

1936: Marries Yvonne Blue.

1938: The Behavior of Organisms is published.

1942: Awarded the Warren Medal by the Society of Experimental Psychologists.

1945: Becomes chairman of the psychology department at Indiana University.

1948: Walden Two is published.

1956: Fixed interval schedule of reinforcement described.

1966: Introduces the concept of critical period in reinforcing an event.

1968: Identifies the critical characteristics of programmed instruction.

1971: Publishes Beyond Freedom and Dignity.

1972: Receives the Humanist of the Year Award from the American Humanist Association.

1983: Publishes Enjoying Old Age.

1990: Dies on August 18.

Behavioral shaping Behavioral shaping is commonly used to change a behavior through rewards known as positive reinforcements. This technique is often used by parents attempting to modify a child's behavior. For instance, if a child is learning a new skill, such as riding a bicycle, the parent can reinforce the progress incrementally through praise and supportive encouragement until the skill is mastered. Aversive consequences can also shape behavior. For instance, the child who disobeys his or her parent may receive a "time-out." The mandatory time spent away from a preferred activity acts as a mild punishment for the disobedience; to avoid that consequence again, the child changes his or her behavior so that it complies with the parent's wishes, and avoiding the time-out then negatively reinforces the compliant behavior. Both rewarding and aversive consequences can be used effectively to shape behavior in children, teens, and even adults. One example of how this technique is commonly used with adults is weight loss. Positive reinforcements, or rewards, play a significant role in one's ability to continue the disciplined task of losing weight. One example of behavior modification at work would be the following: each time a person abstains from eating out at lunch hour, they deposit the amount of money they would have spent into a "rewards jar." The money that accumulates will be used for a non-food reward when they reach their target weight. Thus, each deposit of money is a positive reinforcer for continuing the behavior.

FURTHER ANALYSIS:

The baby box: Myth and reality

B. F. Skinner, the dominant behavioral psychologist of the twentieth century, contributed many insights into the understanding of animal and human behavior over his career. But one "experiment" attributed to him is among the most controversial of all his work. It involved his second daughter, Deborah, whom he was accused of using in one of his psychological experiments. Throughout his life Skinner was confronted with accusations regarding this incident and made many attempts to set the record straight. Here is the real story.

Skinner began his career in the 1930s and is best known for the operant chamber, more commonly referred to as the "Skinner box." It was a small laboratory apparatus used to conduct and record the results of operant-conditioning experiments with animals. These experiments typically required an animal to manipulate an object such as a lever in order to obtain a reward.

When Skinner's second daughter, Deborah, was born in 1944, Skinner (who then lived in Minnesota) constructed an alternative type of crib for her that was something like a large version of a hospital incubator. It was a tall box with a door at its base and a glass window in front. This "baby tender," as Skinner called it, provided Deborah with a place to sleep and remain comfortably warm throughout the severe Minnesota winters without having to be wrapped in numerous layers of clothing and blankets. Deborah slept in her novel crib until she was two and a half years old, and by all accounts grew up a happy, healthy, thriving child.

Skinner invented the baby tender not as a lab experiment but as a labor-saving device. Because it was equipped with filtered and humidified air, it reduced Deborah's risk of airborne infection. The soundproof walls provided for sounder sleep, and the warm air that continually circulated through the crib allowed the child to wear only a diaper to bed. There was also a shade that could be drawn to keep light out of the crib while the baby was sleeping.

Skinner claimed that his invention was used in the same way a traditional crib would be used. Deborah was taken out of the crib for periods throughout the day so that she could eat and interact with her older sister, Julie, and her parents. Friends and neighborhood children who visited the house could view the young child through the window of the enclosed crib, which kept her in a germ-free environment.

The trouble began in October 1945, when Skinner submitted an article on the baby tender to the popular magazine Ladies' Home Journal. The article featured a picture of Deborah in a portable (and therefore smaller) version of the box, her hands pressed against the glass, under the headline "Baby in a Box." People who did not read the article carefully, who merely glanced at the picture, or who heard about the article from someone else tended to confuse the baby tender with a Skinner box, even though the article clearly explained that the baby tender was something quite different.

Nonetheless, many people jumped to the conclusion that Skinner was raising his daughter in a cramped box equipped with bells and food trays, and viewed it as just another of his psychological experiments in reward and punishment. Outraged readers of the magazine wrote letters of protest and set off a landslide of rumors that Skinner was never quite able to put to rest during his lifetime.

Over the years, the details of Skinner's baby tender, which was unsuccessfully marketed under the name "Aircrib," faded somewhat. But by the mid-1960s, about the time Deborah turned 21, the rumor re-emerged, this time claiming that Deborah had become psychotic and was suing her father. Some reports even stated that she had committed suicide.

In fact, Deborah Skinner (now Deborah Skinner Buzan) grew up to lead a normal life and remained close to her father until his death. She has lived and worked in London as an artist since the mid-1970s and is not psychologically scarred by her time in the baby tender. She maintains that most criticisms of the box come from people who do not understand what it was.


Systematic desensitization Conditioning techniques are also at work in helping people with significant fears and anxieties learn to live more effectively. A process called systematic desensitization is used to overcome the fear or anxiety associated with a particular stimulus. The premise behind systematic desensitization is that if a fear is learned, or conditioned, it can be unlearned through extinction, that is, by repeatedly encountering the stimulus without reinforcement of the fear. The person undergoing treatment is asked either to imagine the anxiety-producing situation or to confront the real-life situation in graded increments, while positive reinforcement helps establish a sense of control over the stimulus. Relaxation training often accompanies systematic desensitization, increasing the likelihood of a relaxed response to the feared stimulus. This behavior-modification treatment has been very successful at extinguishing the fear or anxiety response that the stimulus triggers.
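The graded logic of a desensitization hierarchy can likewise be sketched in Python. The hierarchy items, anxiety scores, calm threshold, and geometric decay per safe exposure below are all hypothetical; in practice the pace is set by the client's reported anxiety, not by a fixed formula.

HIERARCHY = [                                  # mildest item first
    ("look at a photo of a spider", 30),
    ("stand across the room from a spider", 55),
    ("stand beside the open terrarium", 75),
    ("hold the terrarium", 95),
]
CALM_THRESHOLD = 20  # move to the next item once anxiety falls below this
RETENTION = 0.6      # fraction of anxiety left after each safe exposure;
                     # relaxation training would shrink this fraction further

for item, anxiety in HIERARCHY:
    exposures = 0
    while anxiety >= CALM_THRESHOLD:  # exposure occurs, feared outcome does not,
        anxiety *= RETENTION          # so the conditioned fear extinguishes
        exposures += 1
    print(f"{item}: calm after {exposures} exposure(s)")

Each item is repeated until the anxiety it evokes falls below the threshold, mirroring the clinical rule of never advancing up the hierarchy faster than the fear extinguishes.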

Other applications Behavior-modification techniques are also used to help people with a wide variety of everyday behavior problems, including addictive behaviors, aggression, attention deficit disorder, teen delinquency, and learning disabilities, among others. These methods have been applied successfully in school systems, prisons, mental-health institutions, the workplace, and many other settings. Behavior modification has become popular both because it has proved highly effective across situations and because it empowers the individual using the techniques to change unwanted behavior. Though Skinner attributed behavior change to environmental reinforcements over which a person has only limited control, modern adaptations of behavior modification instill in the person attempting the change a perception of control.

BIBLIOGRAPHY

Sources

Baldwin, John D., and Janice I. Baldwin. Behavior Principles in Everyday Life. Englewood Cliffs, NJ: Prentice Hall, 1986.

Bjork, Daniel. B. F. Skinner: A Life. New York: Basic Books, 1993.

Carpenter, Finley. The Skinner Primer. New York: The Free Press, 1974.

Evans, Richard. B. F. Skinner: The Man and His Ideas. New York: E. P. Dutton and Co., 1968.

Epstein, Robert, ed. Skinner for the Classroom: Selected Papers. Champaign, IL: Research Press, 1982.

Geiser, Robert L. Behavior Modification and the Managed Society. Boston: Beacon Press, 1976.

Green, Christopher. Classics in the History of Psychology: The Misbehavior of Organisms. York University. http://psychclassics.yorku.ca/Breland/misbehavior.htm.

Nye, Robert D. Three Views of Man. Monterey, CA: Brooks/Cole Publishing Company, 1975.

Nye, Robert D. What Is B. F. Skinner Really Saying? Englewood Cliffs, NJ: Prentice Hall, 1979.

Radford University. What Is Wrong with Daily Life in the Western World? http://www.radford.edu/~jmontuor/Skinner_Article.htm.

Sagal, Paul T. Skinner's Philosophy. University Press of America, 1981.

Schultz, Duane P., and Sydney Ellen Schultz. A History of Modern Psychology. Belmont, CA: Wadsworth, 2004.

Skinner, B. F. About Behaviorism. New York: Alfred A. Knopf, 1974.

Skinner, B. F. A Matter of Consequences. New York: Alfred A. Knopf, 1983.

Skinner, B. F. Particulars of My Life. New York: Alfred A. Knopf, 1976.

Skinner, B. F. Reflections on Behaviorism and Society. Englewood Cliffs, NJ: Prentice Hall, 1978.

Skinner, B. F. The Shaping of a Behaviorist. New York: Alfred A. Knopf, 1979.

Slater, Lauren. Opening Skinner's Box. New York: W. W. Norton and Company, 2004.

