Learning and Memory, Contemporary Views
People have long wondered how best to characterize learning and memory. Most people typically conceive of memory as a place in which information is stored. As Henry Roediger has noted, this spatial metaphor has dominated the study of memory: "We speak of storing memories, of searching for and locating them. We organize our thoughts; we look for memories that have been lost, and if we are fortunate, we find them" (1980, p. 232). Aristotle compared memory to a wax tablet, Plato invoked the image of an aviary, and St. Augustine of Hippo suggested similarities to a cave. Increasingly, the metaphors invoked the most recent or most impressive technological accomplishment: René Descartes drew an analogy to the water-powered automata in the French royal gardens, and later writers compared memory to the gramophone (an updating of the wax-tablet metaphor) and then to a telephone exchange.
The so-called "cognitive revolution" that began in the mid-1950s ushered in the information-processing approach to the study of learning and memory. The main metaphor became the computer; people were seen as general-purpose processors of information capable of manipulating internal representations. As with computers, there was an emphasis on the interaction between hardware and software, usually relabeled structure and process, respectively. By the end of the 1960s, the standard view of memory was the modal model (after the statistical term mode) with its set of stores, registers, and buffers. Structuralist views continue to use the spatial metaphor and emphasize a memory system in which information is stored. According to these accounts, the store in which a memory resides determines the mnemonic properties.
An alternative to the structural approach is the proceduralist approach, which views memory not so much as a thing that is stored but rather as the result of a process. As David Wechsler put it, "Memories are not like filed letters stored in cabinets.… Rather, they are like melodies realized by striking the keys on a piano" (1963, p. 151). The melodies are a by-product of pressing keys just as memory is a by-product of processing information. According to this account, the processing performed at encoding and retrieval determines the mnemonic properties.
Consider the essential difference between these two accounts. Within the modal model of the 1960s and early 1970s there were three memory structures: a sensory register, short-term memory (STM), and long-term memory (LTM). Information is first briefly registered in the appropriate sensory buffer, visual information in iconic memory, auditory information in echoic memory, and so on. These buffers hold raw physical information in an unanalyzed form for a very short period of time, perhaps 250 milliseconds. Additional processing is required to convert the information from its sensory form to a more durable representation that is held by short-term memory. Short-term memory serves mainly as a buffer in which information can be temporarily stored and has a limited capacity, usually given as 7 plus or minus 2 items. Items in short-term memory decay over time unless rehearsed; rehearsal is also the way in which information gets copied into long-term memory. Long-term memory is the main memory system and stores all knowledge. It has no known capacity limitations. Whereas information loss from short-term memory is thought to be permanent, it is unclear whether there is permanent loss from LTM.
Table 1 summarizes the essential points from the structuralist point of view. When an item is in STM, it will be stored using an acoustic code (how the word sounds), the capacity will be limited to 7 plus or minus 2 items, and the items will decay within about twenty seconds. In contrast, when an item is in LTM, it will be stored using a semantic code (what the word means), the capacity will be (almost) unlimited, and the item may very well never be forgotten.
A proceduralist would agree with almost everything the structuralist has proposed, except that the proceduralist would ask the following question: What does the top row in the table add to our understanding? In other words, we could delete the top row and still have a complete and accurate description of memory. When words are processed based on how they sound, there will be a very small capacity and the information will not be available for very long. When words are processed based on what they mean, however, there will be an enormous capacity, and the information will be available for a long time.
| TABLE 1 | STM | LTM |
| --- | --- | --- |
| Code | Acoustic | Semantic |
| Capacity | 7 ± 2 | Infinite (?) |
| Duration | 20 s | Infinite (?) |

The divergence between the structuralist and proceduralist approaches has grown since the 1970s. One current debate concerns whether memory is best characterized as a set of independent systems (a continuation of the spatial metaphor) or as a set of interrelated processes (see, for example, Foster and Jelicic). This more modern structuralist approach has greatly changed the modal model and generally names five different memory systems. The proceduralist approach has refined the idea of memory as a by-product of processing to focus on the relationship between the processing performed at encoding and the processing performed at retrieval.
The Structuralist Approach
Typically, structuralists are concerned not only with delineating the number of memory systems but also with locating them within the brain. Advocates of the systems approach identify five major memory systems: working memory, semantic memory, episodic memory, the perceptual representation system, and procedural memory.
Working memory is a system for the temporary maintenance and storage of internal information and can be thought of as the place where cognitive "work" is performed. It has multiple subsystems, most notably one for the processing of verbal or speech-like stimuli and another for the processing of visual-spatial information. Each features a store (the phonological store and the visual-spatial sketchpad, respectively) and a maintenance process that offsets decay (the articulatory control process and the visual scribe). Unlike in short-term memory, the capacity limit is set not by a fixed number of items but by a trade-off between the rehearsal process and decay.
Semantic memory refers to memory for general knowledge and includes facts, concepts, and vocabulary. The term is somewhat misleading, however, as it implies storage of only semantic information, which is not the case. Rather, the distinguishing feature of this system is the lack of conscious awareness of the learning episode. In contrast, episodic memory is the system that does provide this awareness of the learning episode and thus enables the individual to "travel back" mentally into his or her personal past, to use Endel Tulving's term. As an example of the difference, consider the fact, picked in the hope that it is obscure, that Samuel Beckett is the only first-class cricket player to win a Nobel Prize (he played for Dublin University and won the 1969 Nobel Prize for Literature). Imagine that two weeks from now you happen to participate in a conversation about Nobel Prizes, and you recall the above information about Samuel Beckett. If you remember reading the information in this book, where you were, whether it was hot or cold, or anything else about the episode other than the mere facts, then the information is said to be in episodic memory. If you remember only the fact itself, then the information is said to be in semantic memory.
All three of these systems are part of declarative memory, which is contrasted with procedural memory, the fourth proposed major memory system. Declarative memory is concerned with knowing "that" rather than with knowing "how." For example, if you know that two plus two equals four, the information is said to be in declarative memory. By contrast, if you know how to ride a bicycle, that information is said to be in procedural memory. One distinction is that you can usefully communicate declarative information but not procedural information: simply telling someone how to ride a bicycle (explaining balance, pedaling, and steering) does not work.
Finally, the perceptual representation system is a collection of non-declarative domain-specific modules that operate on perceptual information about the form and structure of words and objects. Although currently the dominant approach, the structuralist conception of memory is not without its problems.
First, there is little agreement on what the criteria should be for a memory system, let alone agreement on the number of systems. In particular, there are no criteria that result in producing just the five systems named above. The most well-specified criteria for determining whether two systems are separate were proposed by David Sherry and Daniel Schacter in 1987: (1) functional dissociations: an independent variable that has one effect on a task thought to tap system A either has no effect or a different effect on a task thought to tap system B; (2) different neural substrates: system A must depend on different brain regions than system B; (3) stochastic independence: performance on a task that taps system A should be uncorrelated with performance on a task that taps system B; and (4) functional incompatibility: a function carried out by system A cannot be performed by system B.
Henry Roediger, Randy Buckner, and Kathleen McDermott have shown how recall and recognition meet the first three of these criteria: the variable word frequency produces functional dissociations (high-frequency words are recalled better than low, but low-frequency words are recognized more accurately than high); neuropsychological dissociations between recall and recognition are observable in amnesiac subjects; and numerous studies have found essentially no correlation between performance on the two tests. A thought experiment suggests that recall and recognition might also demonstrate functional incompatibility: memory for odors is a function that is well supported by recognition (consider the case of Marcel Proust) but not very well supported by recall.
Second, the proposed systems have not yet been anatomically isolated (although it remains an empirical question whether they will be), and even when isolation is found, the results are often equivocal. For example, many studies show that retrieval in episodic tasks involves right prefrontal activation, whereas retrieval of semantic information involves left prefrontal activation. This neuropsychological distinction, however, reverses when the two tasks are equated for difficulty: retrieval of semantic information now produces larger amounts of right frontal activation than the episodic task.
Third, this view does not account for the general pattern of decline seen as normally healthy people grow older. One would expect a systematic relationship between the tasks that show a decrement in performance and the underlying systems; for example, one might expect episodic memory to deteriorate faster than semantic memory. In fact, one finds no such relationship. Performance on some episodic tasks is unaffected, whereas performance on some semantic tasks is quite impaired. Rather, the pattern of performance loss is better described in terms of the processes involved, especially those at retrieval.
The Proceduralist Approach
The modern incarnation of the proceduralist approach began with the levels of processing framework of Fergus Craik and Robert Lockhart. Rather than viewing memory as something that is stored in a system, they viewed memory as the byproduct of performing processing. Because the emphasis is on processing rather than on structure, Craik and Lockhart argued that the most informative research will occur when the experimenter has control over the processing and that therefore researchers should use an incidental learning procedure. The experimenter might ask the subject to judge whether two words rhyme or to judge which of two words has the most pleasant associations. The learning is "incidental" in that the subject is unaware that the material being processed will be tested later on and so is not intentionally trying to remember the information. The type of processing performed at retrieval can also be manipulated.
The basic premise of the transfer-appropriate processing account of memory is that memory will be best when the processing performed at encoding is appropriate given the processing demands required at test. For example, if you processed how an item sounds at encoding but the test emphasizes the meaning of the item, performance will be worse than if your processing focused on how the item sounds at both encoding and retrieval or than if your processing focused on the meaning of the item at both encoding and retrieval. This view offers an explanation for why intent to learn is not an important factor in subsequent tests of learning and memory: if a person uses an inappropriate process, performance will be poor regardless of the intent to learn.
This idea applies very generally. For example, context-dependent memory refers to the observation that memory is often worse when tested in a new or different context than when the context remains the same at both encoding and retrieval. Context can refer to mood (such as happiness at both encoding and retrieval, sadness at both encoding and retrieval, or a mismatch) or to physical location (for example, underwater at both encoding and retrieval, on land at both encoding and retrieval, or a mismatch). One practical implication for divers is that if they memorize decompression tables on land, they may not remember them very well when they are needed underwater. The idea is that if you are happy, you are likely to have quite different processing than if you are sad. Neither mood is inherently better (from a memory perspective); the point is that the results of processing while in a sad mood are different from the results of processing while in a happy mood.
One problem with the transfer-appropriate processing view is determining exactly which processes are being used. Larry Jacoby introduced the process dissociation technique, in which a subject is asked to perform two tasks. The experiment is set up so that on one task, two processes can contribute to memory performance, but on a second task, the processes work in opposition. Using this procedure, one can estimate the relative contribution of each process. It remains an empirical question whether this will result in a sufficiently detailed description of the various processes that can be used.
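Jacoby's logic can be made concrete with the estimation equations commonly used in process dissociation studies. The sketch below is illustrative rather than drawn from the text: it assumes the standard two-task design in which a controlled process, recollection (R), and an automatic process, familiarity (A), both support performance on an "inclusion" task, P(inclusion) = R + A(1 − R), while only the automatic process produces errors on an "exclusion" task, P(exclusion) = A(1 − R). The function name is my own.

```python
def process_dissociation(p_inclusion, p_exclusion):
    """Estimate recollection (R) and automaticity (A) from inclusion
    and exclusion task probabilities, using the standard equations:
        P(inclusion) = R + A * (1 - R)
        P(exclusion) = A * (1 - R)
    Subtracting the second from the first isolates R; substituting
    R back into the exclusion equation isolates A.
    """
    r = p_inclusion - p_exclusion       # recollection estimate
    if r >= 1.0:
        raise ValueError("R estimate of 1 leaves A undefined")
    a = p_exclusion / (1.0 - r)         # automatic-process estimate
    return r, a

# Example: 80% correct on inclusion, 30% intrusions on exclusion
r, a = process_dissociation(0.80, 0.30)
print(round(r, 3), round(a, 3))  # approximately R = 0.5, A = 0.6
```

The key design choice is that the same two processes act in concert on one task and in opposition on the other, which is what lets simple algebra pull their contributions apart.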
In addition to the divergence between the structuralist and the proceduralist views, another major difference between memory research now and memory research in the 1960s and 1970s is the emphasis on the dynamic, reconstructive nature of memory. Long gone is the library metaphor in which memory was like a book in a library that just sat on a shelf. Instead, memories are seen as being dynamically constructed and changed according to the current available information. It is now well documented that people can even remember things that never happened.
Memory is now viewed as inherently reconstructive. This term emphasizes that at the first retrieval attempt, one constructs a memory from all available information. At the second retrieval attempt, the memory is constructed again. Consider what happens when an event is retrieved and then retrieved a second time. At the first retrieval attempt, you will have two sources of information: memory of the event itself and general knowledge. At the second retrieval attempt, you have three sources of information: memory of the event itself, general knowledge, and memory of your first retrieval.
This process is nicely illustrated by experiments that intentionally try to implant a memory of an event that did not happen. For example, Ira Hyman and F. James Billings asked their subjects, college students, to recall as much information as they could about events from their early childhood. Each subject was asked about two to four events that actually occurred (according to the subject's parents) and one event that did not occur. This event supposedly happened when the subject was five years old and concerned going to a wedding and knocking over the punch bowl, spilling some punch on the bride's parents. In the first session only 3 percent of subjects recalled the wedding episode, which is unsurprising, as it had never occurred. Two days later the subjects were again asked to recall as much as they could, and this time 27 percent indicated remembering the event. Subjects were also asked to rate their confidence in the accuracy of their memory. There was no difference in the confidence ratings for real and made-up events.
This is not an isolated finding. People who read a story and were then told that the main character of the story was Helen Keller were likely to remember reading the sentence, "She was deaf, dumb, and blind," even though that sentence was not presented. A student in one of my memory classes had a memory of a favorite pet golden retriever. After hearing my lecture on this topic, she checked with her parents: the dog had died two years before she was born. Apparently she had heard stories about the dog and had seen slides featuring the dog and was mistaking memories of these for memories of real events.
Studies from the eyewitness memory literature show that eyewitnesses readily incorporate information from questions into their recollections of events. Unless objective evidence is available, there is no way of assessing the accuracy of an eyewitness's recollection: the witness may be very accurate, very inaccurate, or somewhere in between. Among the many factors that do not predict subsequent accuracy are the duration of the event; the emotional intensity of the event; the unusualness of the event; the number of details that can be recalled; the confidence expressed about the memory; and the delay between the event and the subsequent questioning.
Memory, then, is fundamentally active, dynamic, and reconstructive. Like other cognitive processes, memory processes recruit information from a variety of sources, including memory of the event itself, generic knowledge, memory of past recollections of the event, and even inferences, with the goal of constructing a sensible, coherent memory.
See also Computer Science; Language and Linguistics; Psychology and Psychiatry.
Craik, Fergus I. M., and Robert S. Lockhart. "Levels of Processing: A Framework for Memory Research." Journal of Verbal Learning and Verbal Behavior 11 (1972): 671–684.
Hyman, Ira E., and F. James Billings. "Individual Differences and the Creation of False Childhood Memories." Memory 6 (1998): 1–20.
Jacoby, Larry. "A Process Dissociation Framework: Separating Automatic from Intentional Uses of Memory." Journal of Memory and Language 30 (1991): 513–541.
Neath, Ian, and Aimée M. Surprenant. Human Memory: An Introduction to Research, Data, and Theory. 2nd ed. Belmont, Calif.: Thomson/Wadsworth, 2003.
Parkin, Alan J. "Component Processes versus Systems: Is There Really an Important Difference?" In Memory: Systems, Process, or Function? edited by Jonathan K. Foster and Marko Jelicic, 273–287. Oxford: Oxford University Press, 1999.
Roediger, Henry L. "Memory Metaphors in Cognitive Psychology." Memory and Cognition 8 (1980): 231–246.
Roediger, Henry L., Randy L. Buckner, and Kathleen B. McDermott. "Components of Processing." In Memory: Systems, Process, or Function? edited by Jonathan K. Foster and Marko Jelicic, 32–65. New York: Oxford University Press, 1999.
Schacter, Daniel L., Anthony D. Wagner, and Randy L. Buckner. "Memory Systems of 1999." In The Oxford Handbook of Memory, edited by Endel Tulving and Fergus I. M. Craik, 627–643. New York: Oxford University Press, 2000.
Sherry, David F., and Daniel L. Schacter. "The Evolution of Multiple Memory Systems." Psychological Review 94 (1987): 439–454.
Tulving, Endel. Elements of Episodic Memory. New York: Oxford University Press, 1983.
Wechsler, David B. "Engrams, Memory Storage, and Mnemonic Coding." American Psychologist 18 (1963): 149–153.