Social cognition is the branch of social psychology that studies how people think about themselves and other people. It focuses on the steps people take and the conclusions they reach as they strive to make sense of their social environment. The field tends to view people as information processors, something like a computer, who take in information from the outside world, sort that information out and interpret it, calculate a judgment, and then choose a behavior in response. In doing so, the field examines what information people pay attention to, how they analyze it, how they reach judgments based on that information, how those judgments guide their behaviors, and then which parts of the information they remember.
Work in social cognition is wide-ranging and explores a breathtaking diversity of topics. For example, some social cognitive researchers investigate how people develop opinions and attitudes about social issues. Others examine whether people’s judgments of others are distorted by stereotypes. Others look at whether people reach decisions that are wise and rational versus faulty and costly. Others study how people reach impressions of themselves that lead to high versus low self-esteem.
Social cognition principles carry a wide variety of implications for real-world pursuits. They explain, for example, why people make right versus wrong decisions about their health. They suggest the best ways to teach students to remember school material. They explain the pitfalls that prevent negotiators from reaching harmonious settlements. They explain how people can discriminate against members of other ethnic or social groups without even knowing it. And they describe why and when people make poor decisions about their money.
One can claim that social cognition has always been a featured part of social psychology, even when the rest of psychology has neglected, or even denied, the importance of people’s internal thought processes. In particular, in the early to mid-twentieth century, the bulk of psychology was dominated by the behaviorist tradition, led by B. F. Skinner (1904-1990) and others, which emphasized how organisms reacted to rewards and punishments while studiously avoiding any talk of that organism’s internal psychological world. During this era, many social psychologists squarely examined that internal life, exploring how people developed their attitudes toward social issues, as well as how they formed stereotypes about social groups, or made attributions about the causes of other people’s behavior.
However, in the 1960s, with the advent of the cognitive revolution, things changed dramatically. The mainstream of psychology became fascinated with the organism’s internal life—how that organism perceived, thought about, and remembered the world around it. Cognitive psychologists, in particular, generated many sophisticated and powerful theories describing thought and memory. Social psychologists quickly adopted these theories and methods to explore in finer detail how people strive to comprehend events in their social world. Today, work on social cognition remains a vigorous and prominent branch of social psychology.
Although work in social cognition is too diverse to be captured in a simple catalogue, one can point to dominant ideas and themes that social cognitive research has repeatedly demonstrated.
One prominent theme focuses on the building blocks of people’s thoughts. People carry with them information about individuals, social groups, objects, and events arranged in schemata (singular: schema). A schema is a knowledge structure containing the features and examples associated with a person, group, object, or event. For example, people’s schema of bird usually contains such characteristics as wings, feathers, a beak, and flight, as well as some common examples of birds, such as robin and duck. Usually schemata are described as associative networks, that is, as a web of linked associations. Thus, when the concept of bird comes to mind, these associative links activate the relevant features and examples (e.g., wings, duck) connected to the concept, thus also bringing those notions to mind.
Schemata are tremendously helpful for social life. If a friend tells you, for example, that he or she went to a restaurant last night, you can easily surmise, because you possess a special type of schema for an event called a script, that the person looked at a menu, ordered food, ate it, paid the bill, and left a tip. Your friend does not need to specify these details; you already know.
That said, schemata can also be misleading or harmful, especially when people try to recall the past. For example, if someone asks George to remember the words drowsy, bed, pillow, snoring, and nighttime, he will probably remember most of these words. But he will also probably mistakenly recall the word sleep because all those terms listed above are linked to sleep through the associative network (the schema) of this concept. Memory errors prompted by schemata can be quite profound. For example, people witnessing a crime may misremember that the culprit had a gun, a disguise, or unkempt hair if it fits their schema of the event they witnessed. Schemata also explain how stereotypes can distort memory. If, for example, a friend describes a professor as distant, smart, and assertive, one might also mistakenly recall that the friend said the professor was arrogant—if that attribute fits one’s schema of a professor.
The impact of schemata on social judgment, interpretation, and memory has been shown to be profound in a wide array of studies, and there has been a good deal of discussion about the specific form that schemata take. According to the prototype view, schemata consist of the features associated with an object or event (such as wings and feathers to a bird). According to an exemplar view, schemata consist of typical examples of a concept (such as a robin being a typical example of a bird). Research ultimately suggests, however, that schemata tend to be a blend of both features and examples.
One additional prominent theme in social cognitive research scrutinizes cognitive habits that lead people to make errors and biases in their judgments. For example, one such consequential habit is confirmation bias. Research on this bias shows that when people ask a question (e.g., “Is Jerry outgoing?”), they tend to look for information that would confirm the question in the positive (e.g., “He does go to parties”) and not for information that would disconfirm it (e.g., “He said last week he hated talking in front of large groups”). When the opposite question is asked (e.g., “Is Jerry shy?”), people instead search for information that would confirm that reverse hypothesis, leading to very different conclusions.
Confirmation bias can lead to several problems in judgment. For example, it can lead people to be overconfident in their predictions about themselves and others. When people consider a question soliciting a prediction (e.g., “Will I get a good grade in this class?”), they tend to consider information that suggests that the answer is “yes.” This can lead them to overconfidence about the chance that their prediction will prove to be accurate. That is, they may say that they are 90 percent sure their prediction will be correct even when the real chance is closer to 70 percent. Indeed, when people say they are 100 percent certain of their prediction, they tend to be wrong roughly one time out of five. Some researchers have suggested that a valuable habit for avoiding overconfidence is to also ask how an event might go in the opposite direction from that posed in the question (e.g., ask the reasons why one might get a poor grade). This consider-the-opposite strategy has been shown to reduce overconfidence in people’s predictions.
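The calibration gap described above can be sketched numerically. The records below are invented solely to mirror the pattern in the text (complete confidence, yet accuracy of only four in five); they are not data from any actual calibration study.

```python
# Toy calibration check: compare stated confidence with the actual hit rate.
# These hypothetical records mirror the pattern described in the text.
predictions = [
    # (stated_confidence, prediction_was_correct)
    (1.00, True), (1.00, True), (1.00, True), (1.00, True), (1.00, False),
]

stated = sum(c for c, _ in predictions) / len(predictions)      # average stated confidence
hit_rate = sum(ok for _, ok in predictions) / len(predictions)  # actual accuracy

# "100 percent certain," yet wrong roughly one time out of five.
overconfidence = stated - hit_rate
assert overconfidence > 0
```

Consider-the-opposite works, on this description, by pulling `stated` down toward `hit_rate` before the prediction is made.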
People also suffer from illusory correlations, seeing relationships between variables even when they do not exist. Some illusory correlations are inspired by schemata and stereotypes. For example, in an experiment described in a 1967 article, Loren J. Chapman and Jean P. Chapman showed participants a series of drawings, some of which were purportedly drawn by people suffering from paranoia. Participants in the study tended to conclude that the drawings of paranoid individuals more often than not included people with larger eyes—even when there was no relationship between eye size and mental illness in the drawings they looked over.
Other illusory correlations are inspired by what people find easier to remember. For example, people tend to remember unusual behaviors (e.g., riding a unicycle) performed by rare groups (e.g., Alaskans), as reported by David L. Hamilton and Robert K. Gifford in a 1976 article. Thus, when asked if there is any relation between a rare behavior and a rare group (e.g., do Alaskans participate in odd sports?), people report that such a relationship exists, even if the evidence they have reviewed fails to support this conclusion. Because rare-rare combinations are memorable, they lead to illusory correlations.
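The structure of that demonstration can be sketched with a small frequency table. The counts below follow the 2:1 ratios commonly described for this kind of design, but they serve here only as an illustration, not as the study's actual stimuli.

```python
# Sketch of the Hamilton-Gifford setup: a common (majority) group and a rare
# (minority) group show the SAME proportion of rare behavior, so the evidence
# supports no group-behavior link; yet the rare-rare cell is the most memorable.
common_group = {"common_behavior": 18, "rare_behavior": 8}   # majority group
rare_group = {"common_behavior": 9, "rare_behavior": 4}      # minority group

ratio_common = common_group["rare_behavior"] / sum(common_group.values())
ratio_rare = rare_group["rare_behavior"] / sum(rare_group.values())

# Identical proportions: there is no real correlation to detect.
assert abs(ratio_common - ratio_rare) < 1e-9
```

The illusory correlation arises when observers overweight the four rare-rare cases rather than computing these proportions.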
People also fall prey to the fundamental attribution error (also known as the correspondence bias), which means that they give too much weight to a person’s personality in explaining, evaluating, and predicting social behavior and too little weight to situational forces. That is, people look primarily to a person’s internal character to explain his or her actions, and not to factors outside the person that could have produced the behavior. This bias most commonly arises when people make attributions for another person’s behavior; that is, they try to identify the causes for why the behavior occurred. For example, if you say that “John stumbled while learning the dance,” people tend to leap to the conclusion that John is clumsy (i.e., something about his internal personality) rather than that the dance was difficult (i.e., something about the outside situation).
Several studies have provided powerful demonstrations of the fundamental attribution error. Consider the classic obedience experiments conducted in the 1960s by Stanley Milgram (1933-1984), which demonstrated that a majority of participants would continue to deliver what they believed were painful electric shocks to another person (in reality a confederate who received no shocks) when an authority figure instructed them to, even after that person complained of heart trouble, stopped answering, and for all practical purposes might be dead. Almost everyone who hears about the study denies that they would “go all the way,” complying with the experimenter until the session is curtailed. However, up to two-thirds of people in this situation do go all the way. The situation is extremely powerful even though people do not see it, and there are few indicators in a person’s personality that reliably predict whether that person will comply with or defy the command to shock another person who has stopped answering.
People also make errors because they rely on quick heuristics to reach their judgments, according to the work of Amos Tversky (1937-1996) and Daniel Kahneman. One such example is the availability heuristic, in which people judge the odds, frequency, or truthfulness of an event based on how quickly examples of it spring to mind. For example, if one asks how many English words have the letter n in the second-to-last position, people tend to say that there are not many. However, if asked how many words end in -ing, people say quite a few, mostly because such words are easily brought to mind. Of course, every word ending in -ing also has n in its second-to-last position, so the former type of word, paradoxically, must be the more frequent.
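The set-inclusion logic behind this example can be checked mechanically: any word ending in -ing must, by construction, also carry an n in the second-to-last position. The sample words below are arbitrary illustrations.

```python
# Words ending in -ing form a strict subset of words whose second-to-last
# letter is n, so the n-pattern must occur at least as often as -ing does.
ing_words = ["running", "singing", "walking", "being"]
n_words = ing_words + ["find", "want", "pint"]  # n second-to-last, not -ing

for w in ing_words:
    assert w.endswith("ing")
    assert w[-2] == "n"  # the -ing ending forces an n into this position

for w in n_words:
    assert w[-2] == "n"

# Any such sample: the n-pattern count can only match or exceed the -ing count.
assert len(n_words) >= len(ing_words)
```

People answer the question with retrieval ease rather than this subset relation, which is why the judgment reverses.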
People also rely on the representativeness heuristic, judging the odds, frequency, or truthfulness of an event based on how well it matches a schema in their head. For example, suppose you were told that Linda is politically liberal and a philosophy major. Which of the following descriptions is most likely to be true, and which least: that Linda is a feminist, that she is a bank teller, or that she is both a feminist and a bank teller? Most people rate “feminist” as most likely and “bank teller” as least likely, although that is necessarily an error. Mathematically, the least likely description must be that Linda is both a feminist and a bank teller. Why? Every feminist bank teller is already a bank teller, and the category of bank tellers also includes some who are not feminists. Thus, the single description “bank teller” must be at least as probable as the combination of the two.
This conjunction fallacy (i.e., rating a combination of two events as more likely than one of its two individual component events) is caused by the representativeness heuristic. People form a schema of Linda and then quickly compare the various events (e.g., is a bank teller) to this stereotype. If the event matches the schema (e.g., is a feminist), it is seen as probable. If it does not (e.g., is a bank teller), it is seen as improbable. However, in using this heuristic, people commonly violate the simple mathematics inherent in the situation, and thus reach conclusions that cannot be right.
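The probability rule that the Linda problem violates can be made concrete with arithmetic. The numbers below are hypothetical, chosen only to show that a conjunction can never exceed either of its components.

```python
# Minimal numeric sketch of the conjunction rule behind the Linda problem.
# All probabilities are invented for illustration, not taken from any study.
p_feminist = 0.80                   # hypothetical chance Linda is a feminist
p_teller_given_feminist = 0.05      # hypothetical chance she is a teller, if a feminist
p_teller_given_not_feminist = 0.10  # hypothetical chance she is a teller otherwise

# P(teller AND feminist): there is only one way for the conjunction to happen.
p_both = p_feminist * p_teller_given_feminist

# P(teller): bank tellers include both feminist and non-feminist tellers.
p_teller = p_both + (1 - p_feminist) * p_teller_given_not_feminist

# The conjunction can never be more probable than either component event.
assert p_both <= p_teller
assert p_both <= p_feminist
```

Whatever values are substituted above, the two final assertions hold; the representativeness heuristic leads people to rank the events in an order that no assignment of probabilities can produce.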
In relying on these heuristics, people also ignore other valuable information that would lead them to more accurate predictions. For example, people tend to neglect the base rates of events, even though these rates have a large impact on what will happen. Base rates refer to the commonness of an event. A high base rate means an event is common (e.g., people tend to have ten toes); a low base rate means that an event is rare (e.g., people tend not to have more than ten toes, although some do). The overall base rate of an event is a valuable indicator of whether or not it will occur in the future, but people, relying on availability and representativeness heuristics, tend not to factor base rates into their judgments and predictions. For example, suppose you know someone who is over seven feet tall and athletic. Is he more likely to be an NBA basketball player or an accountant? Most people quickly predict that this person is an NBA player, because that fits their schema of a professional basketball player (the representativeness heuristic at work), but they should actually predict that he is an accountant, because accountants far outnumber NBA players. That is, the base rate of being an accountant is many times higher than the base rate of being an NBA player. Because being an accountant is the much more common event, it is the event one should predict.
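The base-rate logic of the tall-athlete example can be sketched with expected counts. Every number below is a hypothetical round figure, not a real statistic, chosen only to show how a large base rate can swamp a strong resemblance.

```python
# Illustrative base-rate calculation for the tall-athlete example.
# All counts and proportions are hypothetical, for illustration only.
n_accountants = 1_500_000          # hypothetical number of accountants
n_nba_players = 450                # hypothetical NBA roster size
p_tall_given_nba = 0.10            # hypothetical share of NBA players over 7 ft
p_tall_given_accountant = 0.0001   # hypothetical share of accountants over 7 ft

# Expected number of seven-footers in each group.
tall_nba = n_nba_players * p_tall_given_nba                  # 45 players
tall_accountants = n_accountants * p_tall_given_accountant   # 150 accountants

# Being tall is far more typical of NBA players, yet under these numbers a
# random seven-footer is still more likely to be an accountant, because the
# base rate of accountants is so much higher.
assert tall_accountants > tall_nba
```

Representativeness compares the person only to the schema (the conditional probabilities), while the correct prediction also weighs the group sizes.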
In another predominant theme, social cognitive work has also increasingly recognized that people possess two very different modes of thought. System 1 is a rapid mode of thought, in which people reach their judgments quickly through simple associations and heuristics, like the availability heuristic. System 2 is slower, conscious, deliberate, effortful, rule-based, and analytical.
Anyone who has solved a complex math problem is familiar with system 2. This is the system in which people consciously apply rules to compute some sort of judgment. People may not be as familiar with the operation of system 1. Indeed, at its extreme, system 1 may work so rapidly that a person is not even aware of its operation.
System 1 thinking is often associated with being automatic. There are many senses in which thought can be automatic. First, automatic thought can be quick. For example, people recognize the faces of their friends and family in an instant, without conscious deliberation. Second, automatic thought can be efficient, in that it does not detract from other tasks that people apply themselves to. For example, people can drive a car along a familiar route while fully engaged in other tasks, such as listening to the car stereo or talking to a passenger. Third, automatic thought can be completed without monitoring. People can form perfectly grammatical sentences, for example, without consciously monitoring the construction of each single phrase. Fourth, automatic can mean that the thought is outside of the control of the individual—that it just happens. Indeed, it often requires no conscious goal to set itself in motion. For example, few Americans can hear the date “September 11” without reflexively thinking of terrorism.
Finally, and perhaps most importantly, automatic thought can occur without one being aware of it. When this occurs, the thought is usually described as nonconscious (i.e., below awareness) or preconscious (i.e., occurring before any thought reaches consciousness). Ultimately, this means that the conclusions people reach can be shaped by influences they are not aware of.
These influences are most directly shown in studies of priming, in which people are exposed to incidental material that later shapes their conclusions about some seemingly irrelevant situation. For example, if people complete a sentence-completion task that contains such words as hostile, mean, and unfriendly, they will judge a person they encounter soon afterward as more unpleasant and aggressive than they would if exposed to the words kind, generous, and sociable. The influence of priming can occur even when people are not aware of the prime. For example, John A. Bargh, Mark Chen, and Lara Burrows (1996) exposed college students to words associated with elderly people (e.g., wisdom, Florida) so quickly that the students were not aware that they had been shown any words at all; they thought they were merely seeing flashes on a computer screen. Despite this, exposure to these primes caused the students to walk more slowly (a stereotypical attribute of the elderly) to the elevator as they left the experiment.
System 1, and the automatic thoughts that come with it, produces wide-ranging consequences. For example, the accuracy of people’s judgments, as described above, is heavily influenced by rapid use of availability and representativeness heuristics. Some forms of system 1 thinking can also be shown to trump system 2 thought. Norbert Schwarz and colleagues in a 1991 article showed how system 1 elements can have more influence than the actual content of conscious thoughts. In one study, they asked college students to write down six examples of their own assertive behavior. Students found this task easy. Another group was asked to write down twelve examples and found this task difficult. Later, when asked to rate their assertiveness, the first group saw themselves as more assertive than the second group—even though the second group had generated a greater number of examples indicating that they were assertive. Schwarz and colleagues argued that the first group had perceived themselves as more assertive because they were relying on the availability heuristic. Generating six examples was so easy and available that it tended to convince students that they were assertive. System 1 (the availability heuristic) in this case was a more powerful influence than system 2 (the actual number of examples in conscious thought).
The impact of system 1 thought is also evident in social attribution. People appear to reach attributions about others quickly and spontaneously, through system 1, even without a conscious goal of trying to understand those people. For example, if you mention that Janice helped the elderly woman carry her groceries to the car, many people will rapidly and unknowingly classify the behavior as helpful. (There is an ongoing debate about whether people think of the behavior or the person, Janice, as helpful.) Indeed, if cued with the word helpful later, people will be more likely to remember the sentence that inspired the thought.
Such spontaneous system 1 attributions may explain the fundamental attribution error. Daniel T. Gilbert and colleagues (1988) have proposed that people make rapid attributions to another person’s personality. Once made, people correct these quick personal attributions by considering the impact of the situation in a more effortful, conscious, system 2 way. In support of this idea, Gilbert and colleagues have shown that people make greater attributions to someone’s personality if they are distracted by some other task, because they are deprived of the cognitive capacity necessary to correct for the quick personal attributions produced by system 1.
System 1 also carries consequences for stereotyping. Patricia G. Devine (1989) has suggested that people apply stereotypes in their judgments of others in a quick, system 1 way. Importantly, these stereotype-inspired thoughts even occur to those who wish not to be influenced by them. People who consciously deny stereotypes based on gender, race, or age know that those stereotypes exist and what they are—and these stereotypes will produce automatic, system 1 associations even among these people. In response, those who wish to avoid using stereotypes must apply more effortful system 2 thought to correct for the impact of those stereotypes. However, when people do not have the cognitive capacity to perform this system 2 correction, they will commit stereotypical thinking even though they wish to prevent it. This may happen when they are tired or distracted by some other task.
System 1 also influences attitudes and persuasion. As Shelly Chaiken and Yaacov Trope (1999) have pointed out, people can be persuaded to hold an attitude via two different routes. Through a heuristic route, people can be persuaded in a system 1 way through rapid associations and rules of thumb. For example, people can be persuaded of a viewpoint if the person trying to persuade them is physically attractive, or has an impressive title, or just rattles off a large number of arguments. This type of persuasion occurs when people are not motivated to think deeply about what they are being told. However, when people are motivated, they more effortfully and consciously deliberate over what they are told. This is the systematic route to persuasion, and it depends on whether people find the arguments they are given to be strong. (Richard E. Petty and John T. Cacioppo’s elaboration likelihood model offers a similar treatment of system 1 and system 2 routes to persuasion.)
The presence of system 1 also means that people may hold multiple, and sometimes contradictory, attitudes about social groups and issues. For example, at a conscious, explicit level, people may harbor no negative attitudes toward people from other racial groups, or the elderly, or the political party opposite their own. However, at an implicit level, below conscious awareness or control, people may hold such prejudices. That is, they may hold automatic negative associations to those groups that they are not aware they have.
Much research has shown how attitudes at the implicit level may differ from those at an explicit level. For example, people tend to deny explicitly having any negative opinions of racial groups different from their own. However, if placed in a performance task that assesses their automatic, implicit associations, such negative links are often found. For example, in one version of the implicit association task, people are asked to complete two tasks simultaneously. In one task, they are asked whether a face is of a European American or an African American, pressing a button with their left hand if the face is European and with their right if the face is African. Intermixed with this task, they are also shown words (e.g., puppy, disease) and asked if each is positive or negative in nature, using their left hand to indicate the former and their right the latter. They then perform these two intermixed tasks again, but this time the hands indicating positive and negative words are switched.
Most European Americans find this second version of the task to be more difficult, in that they are using the same hand to indicate European and negative (an association they may not have at the automatic level) and the other hand to indicate African and positive (again, an association they may not possess at the automatic level). African American participants find the second version of the task to be easier than the first, presumably because it matches associations (e.g., African and positive) that they possess at an implicit level.
Work in social cognition has moved vigorously in several directions since 1990. For example, the work described so far paints social perceivers as cold, machinelike calculators, calmly using systems 1 and 2 to compute judgments about their social worlds. Work since the 1990s has recognized that cognition need not only be “cold”; it can also be “hot,” involving vivid and full-blooded emotions. Thus, recent work in social cognition increasingly focuses on the role of emotions in social thought. For example, research has shown that emotional arousal prompts people to attend more to the evaluative charge of information in their environment—that is, to whether it is positive or negative—over other aspects of that information. Fear and anxiety also narrow attention to central and salient aspects of a situation, at the expense of more peripheral features.
Emotions also lead people to make different assumptions about a situation. When people become fearful, for example, they perceive themselves to lack control over a situation. Thus, they become more pessimistic and reluctant to take on risks. However, when people are angry, they perceive themselves as more in control and more likely to seek risks out. This was directly shown in a survey taken after the terrorist attacks in the United States on September 11, 2001. Those asked first to describe how the attacks made them fearful perceived the United States to be more at risk for future attacks than did those asked to describe how the attacks made them angry.
Work in social cognition since 1990 has also begun to explore the role played by culture, taking pains to study how social cognition operates around the world. In doing so, researchers have found that culture has a profound impact on the ways people think about their social world and the conclusions they then reach. Differences in culture also appear to extend even to perception of the physical world. For example, if people are shown a scene, North Americans are more likely to describe and remember central components of the scene. East Asian respondents, relative to their North American counterparts, are more likely to describe and remember the context surrounding those central components, better recalling peripheral details of a scene.
This different degree of attention paid to the center versus the context is echoed in social judgment. People from East Asia also tend to avoid the fundamental attribution error, more frequently emphasizing situational factors that may have produced a person’s behavior, in contrast to what is emphasized by their North American counterparts. In essence, people from East Asia tend to emphasize the surrounding situational context in their explanations for the behavior of other people, whereas North Americans tend to emphasize the central actor.
Finally, social cognition research since 2000 has increasingly delved into the neurophysiology of social thought, examining the brain structures that support judgments and decision making. By using such techniques as fMRI (functional magnetic resonance imaging) or ERP (event-related brain potentials), psychologists can determine which neural structures in the brain are active as people reach decisions. For example, people who possess negative implicit attitudes toward ethnic groups different from their own (as measured by the implicit association test described above) show more activation of the amygdala, a part of the brain associated with emotional learning and evaluation.
Studies of this type have also begun to map different neural routes that people take to reach conclusions about their social world. In a study on moral judgment by Joshua D. Greene and colleagues (2001), participants were asked how they would respond to the following two moral dilemmas. One dilemma concerned whether participants would switch the track that a train was traveling on to keep it from hitting and killing five people, knowing that the train on its new track would unfortunately kill one other individual. The second dilemma concerned whether participants would push a person in front of a train, killing him, in order to stop the train from killing five people further down the tracks. Although the two scenarios share the same overall structure (i.e., sacrificing one person to save five), people tend to reach different decisions about how to act, being more likely to switch the train track in the first scenario than to push the person onto the tracks in the second.
Participants also tend to reach these decisions via different neural routes. Participants in this study appeared to solve the first dilemma in a calculated system 2 way, analyzing the benefit of switching the train track. In support of this observation, fMRI measurements suggest that as people considered the scenario their parietal lobes, as well as the right middle frontal gyrus, were active—areas associated with the “working memory” people use as they think through a decision. In contrast, people appeared to solve the second dilemma by going with their initial emotional reaction, with brain areas associated with emotion (e.g., the right and left angular gyrus, the bilateral posterior cingulate gyrus, and the bilateral medial frontal gyrus) being most active.
Social cognition is a vibrant area of research. Its influence is also increasingly felt in other scientific and professional disciplines, as scholars in medicine, law, business, education, and philosophy comb its insights to provide knowledge to address questions in those areas of endeavor. In a sense, this vibrancy should not come as a surprise. Every day, people expend a great deal of effort in social cognition, trying to make sense of themselves and the people around them. It is a safe bet that they will never find a point in their lifetime when they can stop doing this task. If this is true of people in everyday life, then it must also be true of the social cognition researcher, who, after all, is also just trying to make sense of what other people do.
SEE ALSO Attitudes; Attribution; Cognition; Decision-making; Perception, Person; Prototypes; Stereotypes
Bargh, John A., Mark Chen, and Lara Burrows. 1996. Automaticity of Social Behavior: Direct Effects of Trait Construct and Stereotype Priming on Action. Journal of Personality and Social Psychology 71: 230-244.
Chaiken, Shelly, and Yaacov Trope, eds. 1999. Dual-Process Theories in Social Psychology. New York: Guilford.
Chapman, Loren J., and Jean P. Chapman. 1967. Illusory Correlation as an Obstacle to the Use of Valid Psychodiagnostic Signs. Journal of Abnormal Psychology 72: 193-204.
Devine, Patricia G. 1989. Stereotypes and Prejudice: Their Automatic and Controlled Components. Journal of Personality and Social Psychology 56: 5-18.
Gilbert, Daniel T., Brett W. Pelham, and Douglas S. Krull. 1988. On Cognitive Busyness: When Person Perceivers Meet Persons Perceived. Journal of Personality and Social Psychology 54: 733-740.
Gilovich, Thomas, Dale Griffin, and Daniel Kahneman, eds. 2002. Heuristics and Biases: The Psychology of Intuitive Judgment. New York: Cambridge University Press.
Greene, Joshua D., R. Brian Sommerville, Leigh E. Nystrom, et al. 2001. An fMRI Investigation of Emotional Engagement in Moral Judgment. Science 293: 2105-2108.
Greenwald, Anthony G., David E. McGhee, and Jordan L. K. Schwartz. 1998. Measuring Individual Differences in Implicit Cognition: The Implicit Association Test. Journal of Personality and Social Psychology 74: 1464-1480.
Hamilton, David L., and Robert K. Gifford. 1976. Illusory Correlation in Interpersonal Perception: A Cognitive Basis of Stereotypic Judgments. Journal of Experimental Social Psychology 12: 392-407.
Kunda, Ziva. 1999. Social Cognition: Making Sense of People. Cambridge, MA: MIT Press.
Lerner, Jennifer S., Roxanne M. Gonzalez, Deborah A. Small, and Baruch Fischhoff. 2003. Effects of Fear and Anger on Perceived Risks of Terrorism: A National Field Experiment. Psychological Science 14: 144-150.
Moskowitz, Gordon B. 2005. Social Cognition: Understanding Self and Others. New York: Guilford.
Nisbett, Richard E. 2004. The Geography of Thought: How Asians and Westerners Think Differently … and Why. New York: Free Press.
Phelps, Elizabeth A., Kevin J. O’Connor, William A. Cunningham, et al. 2000. Performance on Indirect Measures of Race Evaluation Predicts Amygdala Activation. Journal of Cognitive Neuroscience 12: 729-738.
Schwarz, Norbert, Herbert Bless, Fritz Strack, et al. 1991. Ease of Retrieval as Information: Another Look at the Availability Heuristic. Journal of Personality and Social Psychology 61: 195-202.
"Social Cognition." International Encyclopedia of the Social Sciences. . Encyclopedia.com. (July 26, 2017). http://www.encyclopedia.com/social-sciences/applied-and-social-sciences-magazines/social-cognition
The basic goal of social cognition is to understand how people make sense of themselves, others, and events in everyday life. Research from the perspective of adult development and aging has focused on broadening the understanding of cognitive aging to include how life experiences and changes in pragmatic knowledge, social expertise, and values influence age-related differences in how people think. To address these issues, one must consider both the basic cognitive architecture of the aging adult and the functional architecture of everyday cognition in a social context. Even if certain basic cognitive mechanisms decline (such as memory recall or the speed at which information is processed), older adults may still possess the social knowledge and skills that allow them to function effectively.
Social cognition and cognitive mechanisms
In the mainstream social cognition literature, researchers use an information-processing approach to examine social cognitive processes. In particular, they examine how the accuracy of social perceptions can be impaired by cognitive load (i.e., attending to too many cognitive activities at one time). A heavy cognitive load depletes the resources required to make an accurate judgment or assessment of a situation. A good illustration of impaired social perception accuracy is how a person's behavior is explained. If a person is observed behaving in an anxious way, it may erroneously be inferred that he or she is, in general, an anxious person. The judgment would have been more accurate, however, if the observer had considered the situational information: in this case, the person was waiting to give an important speech to an audience of over a hundred individuals. This is an example of a "correspondence bias," in which the cause of a person's behavior is attributed to a predisposed characteristic and the observer fails to attend to compelling extenuating circumstances. Dan Gilbert and colleagues found that the propensity to commit the correspondence bias was exacerbated when individuals had to attend to another task at the same time (i.e., under increased cognitive load). In other words, they lacked the cognitive resources to deliberate on and adjust their initial judgments about people and events because they were busy thinking about something else. What implication does this have for the aging adult? Because older adults typically have fewer cognitive processing resources (e.g., they are slower at processing information), their social judgment processes may be affected.
The literature on aging supports this notion. Fredda Blanchard-Fields found in a number of studies that older adults consistently exhibit the correspondence bias. In these studies, individuals are presented with stories in which a main character is associated with a situation's negative outcome. For example, Doug insists on continuing to work long hours despite his wife's protests, which results in their divorce. Older adults blamed Doug more than young adults did. Older adults relied more on dispositional information (personality characteristics, the character of the individual) to explain the behavior and ignored compelling situational information (such as the wife's pressure). Her studies have repeatedly shown that older adults tend to blame the main character in relationship conflicts with negative outcomes despite the existence of situational causal factors. In a number of studies, Thomas Hess and colleagues have shown that older adults tend to rely on easily accessible knowledge rather than engaging in more elaborative processing. They have found that older adults do not modify their first impression of an individual when presented with new information, especially when positive information follows an initially negative portrayal of the individual. For example, when older adults are initially given a description of a person portrayed as dishonest and subsequently receive information about that person performing honest behaviors, they do not adjust their initial impression regarding honesty. Overall, it appears from these studies that limitations on processing resources could play an important role in understanding why older adults produce biased social judgments.
Social cognition and social knowledge
In contrast to the influence of limitations on processing resources on social judgment biases, social cognition research also suggests that when strong beliefs and knowledge are activated automatically, they can influence social judgments in general, and invite social judgment biases in particular. Walter Mischel suggests that there are individual differences in the strength of social representations of rules, beliefs, and attitudes that are associated with specific situations. Thus, when individuals encounter specific situations, their belief systems trigger emotional reactions and goals that are closely linked to those situations, which in turn drive social judgments.
Blanchard-Fields and colleagues suggest that the dispositional biases described above might occur when older adults' strongly held beliefs about how a particular character should have acted in a specific situation are violated. For example, in the case of Doug and his wife, older adults may have blamed Doug not because they lacked the cognitive capacity to consider the situational information, but because he violated the strongly held belief that marriage comes before career. Accordingly, the investigators examined the degree to which limitations on processing resources and/or strong social beliefs and social rules accounted for the dispositional biases observed in older adults. They found that older adults produced more dispositional biases when placed under the cognitive constraint of a time limit to respond. However, they also found that older adults produced more social beliefs and rules pertaining to the main character than young adults did, and this accounted for age differences in dispositional biases above and beyond the influence of time constraints. This provides evidence against a cognitive resource limitation explanation. It appears that the degree to which a social rule has been violated determines when a dispositional bias will be made. Such findings suggest that a social knowledge-based explanation of social judgments is a viable alternative to limitations on cognitive resources.
Social cognition and processing goals
Change in the relative importance of social goals as people grow older profoundly influences how individuals interpret and use social information. Laura Carstensen and colleagues suggest that emotional goals become increasingly important and salient across the adult life span. They have demonstrated that older adults pay more attention to emotional information in text, and thus remember it better than neutral information. Another motivational factor shown to influence social information processing is cognitive style, or how one approaches problem solving. For example, an individual with a high need for closure, such as the need to come to quick and decisive answers without deliberation, is more likely to commit the correspondence bias. Hess and colleagues found that need for closure did not influence judgment biases in young and middle-aged adults, but did predict social judgment biases in older adults. Because of age-related changes in personal resources (both social and cognitive), motivational factors (such as need for closure) oriented toward conserving resources may become more important to the older adult.
Stereotypes and cognitive functioning
Finally, the literature suggests that age-related stereotypes have a negative impact on cognitive functioning in older adults. Such stereotypes represent a set of socially shared beliefs about the personal attributes and behaviors of older adults. Studies examining age-related differences in the content and structure of stereotypes find that older and young adults hold similar negative age-related stereotypes, such as slow-thinking, senile, incompetent, and feeble. However, older adults display more complex representations of the category "older adults," including both positive (e.g., wise, dependable) and negative stereotypes.
Studies have examined under what conditions stereotypes are activated and, if they are, how they affect behavior and social judgments. Claude Steele and colleagues found that stigmatized groups such as African Americans and women are vulnerable to fears of being judged in accordance with negative stereotypes about the group to which they belong. This in turn impairs performance relevant to the stereotype associated with their group, such as academic ability. Similarly, Becca Levy found that automatically activated stereotypes about aging and memory adversely affect the cognitive performance of older adults. Although this finding still needs replication, it does suggest that social factors such as negative stereotypes may contribute to decline in cognitive performance, though they do not account for all of it.
In conclusion, research on social cognition and aging underscores the importance of social factors as potential mechanisms contributing to age-related differences in cognitive functioning. In addition, it suggests that it is important not to limit explanations of changes in social cognition to cognitive processing variables alone. The social factors highlighted above influence social information processing in important ways, including how, when, and why older adults attend to specific information and how that information will be used.
See also Images of Aging; Intelligence; Memory; Motivation.
Blanchard-Fields, F. "Social Schematicity and Causal Attributions." In Social Cognition and Aging. Edited by T. M. Hess and F. Blanchard-Fields. San Diego: Academic Press, 1999. Pages 222–238.
Carstensen, L. L., and Turk-Charles, S. "The Salience of Emotion across the Adult Life Span." Psychology and Aging 9 (1994): 259–264.
Gilbert, D., and Malone, P. "The Correspondence Bias." Psychological Bulletin 117 (1995): 21–38.
Hess, T. M. "Cognitive and Knowledge-Based Influences on Social Representations." In Social Cognition and Aging. Edited by T. M. Hess and F. Blanchard-Fields. San Diego: Academic Press, 1999. Pages 239–267.
Levy, B. "Improving Memory in Old Age through Implicit Stereotyping." Journal of Personality and Social Psychology 71 (1996): 1092–1107.
Mischel, W., and Shoda, Y. "A Cognitive-Affective System Theory of Personality: Reconceptualizing Situations, Dispositions, Dynamics, and Invariance in Personality Structure." Psychological Review 102 (1995): 246–268.
Steele, C. "A Threat in the Air: How Stereotypes Shape Intellectual Identity and Performance." American Psychologist 52 (1997): 613–629.
"Social Cognition." Encyclopedia of Aging. . Encyclopedia.com. (July 26, 2017). http://www.encyclopedia.com/education/encyclopedias-almanacs-transcripts-and-maps/social-cognition
SOCIAL COGNITIVE THEORY
The self-management of health requires development of self-regulatory skills. This is achieved through self-regulatory subfunctions that provide guides and motivators for self-directed change. People have to keep track of their health habits. Self-monitoring provides the information needed for setting realistic goals and for evaluating one's progress toward them. People motivate themselves and guide their behavior by the goals and challenges they set for themselves. Goals motivate by enlisting self-evaluative involvement in the activity. The evaluative self-reactions provide the means by which personal standards regulate courses of action.
The self-management system operating through self-monitoring, goal setting, and self-reactive influence is rooted in beliefs of personal efficacy. This core belief system is the foundation of human motivation and action. Unless people believe they can produce desired effects by their actions, they have little incentive to act or to persevere in the face of difficulties.
In social cognitive theory, perceived efficacy is a key determinant because it affects lifestyle habits both directly and by its influence on other determinants. The stronger the perceived efficacy, the higher the goals people set for themselves, the more they expect their efforts to produce desired outcomes, and the more they view obstacles and impediments to personal change as surmountable.
Development of self-regulatory capabilities requires instilling a resilient sense of efficacy as well as imparting skills. Experiences in exercising control over troublesome situations serve as efficacy builders. If people are not convinced of their personal efficacy, they rapidly abandon the skills they have been taught when they fail to get quick results or suffer reverses. Efficacy beliefs affect every phase of personal change: whether people even consider changing their health habits; whether they enlist the motivation and perseverance needed to succeed; their facility to recover from setbacks; and how well they maintain the habit changes they have achieved. The self-efficacy belief system operates as a common mechanism through which psychosocial treatments affect different types of health outcomes.
PUBLIC HEALTH APPLICATIONS
People see little point in even trying if they believe they do not have what it takes to succeed. In community-wide health campaigns, people's preexisting efficacy beliefs and the efficacy beliefs instilled by the campaign contribute to the adoption of health-promoting habits. This calls for a change in emphasis from trying to scare people into health to enabling them to achieve self-directed change.
Effective self-management models inform people of the health risks and benefits of different lifestyle habits; develop the self-regulatory skills needed to translate informed concerns into health-promotive actions; build a resilient sense of efficacy to support control in the face of difficulties; and enlist social supports for desired personal changes. The guiding principles, implementation practices, and empirical documentation of effectiveness are reviewed in some detail in Self-Efficacy: The Exercise of Control (Bandura, 1997). By combining the high individualization of the clinical approach with the large-scale applicability of the public health approach, health self-management systems ensure high social utility.
It is easier to prevent detrimental health habits than to try to change them after they have become deeply entrenched as part of a lifestyle. The social cognitive model provides a valuable public health tool for societal efforts to promote the health of its youth. Preventive programs often produce weak results because they are heavy on didactics but meager on personal enablement. Health knowledge can be conveyed readily, but changes in values, attitudes, and health habits require greater effort. Health promotion programs that encompass the essential elements of the self-regulatory model achieve greater success.
The quality of health of a nation is a social matter, not just a personal one. It requires changing the practices of social systems that impair health rather than just changing the habits of individuals. People's beliefs in their collective efficacy to accomplish social change by perseverant group action play a key role in the policy and public health approach to health promotion. Given that health is heavily influenced by behavioral, environmental, and economic factors, health promotion requires emphasis on the development and enlistment of collective efficacy for socially oriented initiatives.
(see also: Behavior, Health-Related; Behavior Change; Communication for Health; Enabling Factors; Health Promotion and Education; Mass Media; Predisposing Factors; Psychology, Health; Social and Behavioral Sciences)
Bandura, A. (1986). Social Foundations of Thought and Action: A Social Cognitive Theory. Englewood Cliffs, NJ: Prentice-Hall.
—— (1997). Self-Efficacy: The Exercise of Control. New York: Freeman.
Holden, G. (1991). "The Relationship of Self-Efficacy Appraisals to Subsequent Health Related Outcomes: A Meta-Analysis." Social Work in Health Care 16:53–93.
Maibach, E.; Flora, J.; and Nass, C. (1991). "Changes in Self-Efficacy and Health Behavior in Response to a Minimal Contact Community Health Campaign." Health Communication 3:1–15.
"Social Cognitive Theory." Encyclopedia of Public Health. . Encyclopedia.com. (July 26, 2017). http://www.encyclopedia.com/education/encyclopedias-almanacs-transcripts-and-maps/social-cognitive-theory