Risk and Emotion

Technologies, particularly if they are new, often give rise to emotional reactions that are based on perceived risks. Recent examples of such technological risks involve cloning and genetically modified food; the use of nuclear energy continues to spark heated and emotional debates. Empirical research has shown that people rely on emotions in making judgments about what constitutes an acceptable risk (Slovic 1999). However, this does not answer the question of whether judgments that are based on emotions can provide a better understanding of the moral acceptability of risks than do judgments that do not take the emotions into consideration. Many scientists dismiss the emotions of the public as a sign of irrationality. Should engineers, scientists, and policy makers involved in developing risk regulation take the emotions of the public seriously?

Emotions and Moral Judgments

There are two major traditions in modern moral theory that deal with the role of emotions, going back to the Enlightenment thinkers David Hume (1711–1776) and Immanuel Kant (1724–1804). For the Scottish philosopher Hume, ethics is based not on reason but on the emotions, particularly the sentiment of benevolence, with reason serving only to help the sentiments achieve their goals. In opposition to that view, the German philosopher Kant maintained that ethics depends on the rational determination of human conduct, with the emotions tending to function as distractions. In neither case, however, are the emotions understood to function in a cognitive manner to reveal something about the world. They are either the noncognitive source of moral value or a noncognitive distraction from moral rationality.

A quite different minority tradition in moral theory, however, grants the emotions cognitive value. This line of thought goes back to Aristotle (1925), who argued that through emotions we perceive morally salient features of concrete situations. In Hume's time the economist Adam Smith (1723–1790) suggested in The Theory of Moral Sentiments (1759) that emotional sympathy for others, through imaginative identification with their pleasures and pains, can provide knowledge about how other people experience the world. For Max Scheler the emotions are the motivators of decent behavior; they reveal the basic moral facts of life (Scheler 1913–1916).

In the 1970s such theories of the cognitive power of the emotions were given new support by developments in neurobiology, psychology, and the philosophy of the emotions. For scholars as diverse as Ronald De Sousa (1987), Robert Solomon (1993), Antonio Damasio (1994), and Martha Nussbaum (2001) emotions and cognitions are not mutually exclusive. Rather, to have moral knowledge, it is necessary to experience certain emotional states.

To have moral knowledge, a person must know, or be able to imagine, how it feels to be in a certain situation and to be treated by others in certain ways, as well as how it feels to be humiliated and hurt or cherished and embraced. These emotions are fundamental features of human life that point to what morality is really about. It is not possible to understand moral life without knowing these emotions and without having the ability to feel sympathy and compassion for others. Hence, only beings with the ability to have emotions can make justified moral judgments. The moral point of view implies that people can feel with others, or at least imagine what their emotions might be like, and that people care about morally important aspects of the lives of others (Schopenhauer 1969, Scheler 1970).

Emotions and Judging the Acceptability of Risks

A cognitive theory of emotions provides new insights about emotions toward acceptable risks. With the traditional picture one would have to choose between the horns of the Hume-Kant dilemma: either take emotions seriously but forfeit claims to rationality or emphasize rationality at the expense of the emotions. With a cognitive theory of emotions, however, one can argue for taking emotions seriously in order to achieve a more comprehensive rationality, particularly with respect to the moral acceptability of technological risks.

As an example, if people are forced against their will to do something they consider dangerous, this is likely to provoke anger or frustration. Yet that is a completely reasonable response. A prima facie injustice has been done to them, and only if they can be persuaded that there are good reasons why they should undergo this specific risk will their anger subside. If, by contrast, no good explanation can be given, they will remain upset. In fact, one might find a person irrational who was not upset by such an injustice. One would judge a person confused who said, "I know company X is not respecting my rights by building this chemical plant in my neighborhood without informing me or asking my consent, and I think it is not fair, but I don't care." A moral judgment that does not lead to an appropriate emotion is seriously flawed.

Some cognitive theories of emotions would take this analysis even further and claim that without certain feelings or emotions a person is unable to have appropriate moral judgments (e.g., De Sousa 1987, Solomon 1993, Damasio 1994, Nussbaum 2001). When people fail to become outraged in response to abridgments of their autonomy, they may not fully grasp the injustice being done to them.

Moreover, people find it morally reasonable not only for the victim of an injustice to be outraged but also for witnesses to be affected in the same way. People even expect that those who inflict an injustice on others should be forced to reassess their actions if they truly care about those they harm. When such agents are unmoved by feelings of sympathy, they are thought of as hard-hearted and egoistical. Emotions thus help assess not only one's own situation but that of others as well as one's own actions in relation to others. In such ways emotions may lead to fairer social arrangements concerning technological risks.

Evaluation of Emotions Concerning Risks

The idea that emotions are useful pathways to moral knowledge concerning risks does not entail the idea that emotions are infallible as normative guides. Emotions also can be wrongheaded or misguided. Emotions can help people focus on certain salient aspects, but they also can lead people astray. Engineers may be enthusiastic about their products and overlook certain risks. The public may be ill informed and thus focus only on risks and overlook certain benefits. Both parties may be biased, and their emotions may reinforce those biases.

In such situations followers of Hume might claim that emotions should rule. Followers of Kant, by contrast, might argue that emotions should be set aside in favor of purely rational analysis. Those who adopt a cognitive theory of the emotions would defend the emotions as a potential source of new knowledge. Not only can reason be brought to bear critically on the emotions, but the emotions may also be used as a basis for critical assessments of reason. Indeed, the emotions themselves may be played off against each other in pursuit of mutual emotional assessment. One example would be the development of affective appreciation through sympathy with opposing perspectives. Engineers might try to make an emotional identification with the perspectives of the public, and vice versa, and those who benefit from technology might try to appreciate the perspectives of those who incur its costs. Without emotions being brought into the mix, well-founded judgments about the moral acceptability of technological risks are unlikely.


SEE ALSO Emotion; Emotional Intelligence; Hume, David; Kant, Immanuel; Risk; Risk Assessment; Risk Perception; Risk Society.


BIBLIOGRAPHY

Aristotle. (1925). Nicomachean Ethics, trans. W. D. Ross. London: Oxford University Press. A classic source of emotional cognitivism in ethics.

Damasio, Antonio. (1994). Descartes' Error: Emotion, Reason, and the Human Brain. New York: Putnam. Readable classic in recent neuropsychology, arguing that without emotions, people cannot be practically rational.

De Sousa, Ronald. (1987). The Rationality of Emotion. Cambridge, MA: MIT Press.

Nussbaum, Martha. (2001). Upheavals of Thought. Cambridge, UK: Cambridge University Press. A strong defense of cognitivism in the philosophy of emotions.

Scheler, Max. (1913–1916). Der Formalismus in der Ethik und die materiale Wertethik. Published in English as Formalism in Ethics and Non-Formal Ethics of Values: A New Attempt toward the Foundation of an Ethical Personalism (1973), trans. Manfred S. Frings and Roger L. Funk. Evanston, IL: Northwestern University Press.

Scheler, Max. (1970). The Nature of Sympathy, trans. Peter Heath. New York: Archon Books. A philosopher from the continental phenomenological tradition argues that people acquire moral knowledge through sympathy.

Schopenhauer, Arthur. (1969). The World as Will and Representation, trans. E. F. J. Payne. New York: Dover. Originally published as Die Welt als Wille und Vorstellung (1819). Defends the importance of sympathy in ethics.

Slovic, Paul. (1999). "Trust, Emotion, Sex, Politics, and Science: Surveying the Risk-Assessment Battlefield." Risk Analysis 19: 689–701. Psychologist who studies the role of the emotions in assessing risk.

Smith, Adam. (1976). The Theory of Moral Sentiments. Oxford: Oxford University Press. Originally published in 1759. Defends the importance of moral emotions.

Solomon, Robert. (1993). The Passions: Emotions and the Meaning of Life. Indianapolis: Hackett. Solomon's work sparked renewed interest in emotional cognitivism in philosophy in the 1970s.