# Prisoner's Dilemma

The Prisoner's Dilemma is one of the simplest yet most widely applicable situations studied in game theory. It was discovered by Merrill Flood and Melvin Dresher at the RAND Corporation in 1950, but its name comes from the following story, supplied shortly afterward by the Princeton mathematician Albert W. Tucker. The story and its analysis have been used in various ways to draw out ethical implications.

## The Basic Story

Two men are caught committing an illegal act. If neither one confesses, there is enough evidence to ensure that each man will get one year in jail. If both confess, each one gets five years in jail. However, if one confesses and the other does not, the man who does not confess gets ten years in jail but the confessor who incriminates his partner gets off free. This is a special case of the normal form game illustrated in matrix form in Figure 1.

The *normal form* specifies a *strategy set* for each player and a *payoff* for each player as a function of the choice of strategy by each player. In this matrix player 1, the row player, chooses a row, and player 2, the column player, chooses a column. In general the Prisoner's Dilemma requires that *T* > *R* > *P* > *S*. This means that if both players defect, each one receives *P*, whereas if they both cooperate, each one receives *R* > *P*. If one player cooperates and the other defects, the defector gets *T*, which is the largest of the four numbers, and the cooperator gets *S*, which is the smallest. In Kuhn's example cooperate means "not confess," defect means "confess," and the four payoffs are *T* = 0, *R* = -1, *P* = -5, and *S* = -10. As a mnemonic device one can say that *T* is the temptation to defect, *R* is the reward for mutual cooperation, *P* is the penalty for mutual defections, and *S* is the sucker's payoff for cooperating when one's partner is defecting.

Note that whatever player 1 does, the best response for player 2 is to confess. This is because *T* > *R* (0 > -1 in our example), so player 2 does better to confess if player 1 cooperates, and *P* > *S* (-5 > -10 in our example), so player 2 does better to confess if player 1 confesses. Therefore, if both players are self-interested and rational (i.e., they maximize their payoffs), both players will defect, and so their payoffs will be (*P, P*), which is (-5, -5) in this example.
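The dominance argument can be checked mechanically. The following is a minimal Python sketch using the payoffs *T* = 0, *R* = -1, *P* = -5, *S* = -10 from the example above; the function and dictionary names are illustrative, not part of any standard library.

```python
# Payoffs to player 2, indexed by (player 1's move, player 2's move),
# using T = 0, R = -1, P = -5, S = -10 from the example above.
T, R, P, S = 0, -1, -5, -10
payoff2 = {
    ("cooperate", "cooperate"): R,  # both stay silent
    ("cooperate", "defect"): T,     # player 2 confesses, player 1 does not
    ("defect", "cooperate"): S,     # player 2 stays silent, player 1 confesses
    ("defect", "defect"): P,        # both confess
}

def best_response(p1_move):
    """Player 2's best reply to a fixed move by player 1."""
    return max(("cooperate", "defect"),
               key=lambda p2_move: payoff2[(p1_move, p2_move)])

# Whatever player 1 does, player 2's best response is to defect (confess):
assert best_response("cooperate") == "defect"  # because T > R
assert best_response("defect") == "defect"     # because P > S
```

By symmetry the same holds for player 1, so mutual defection is the unique outcome for self-interested, rational players.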

The Prisoner's Dilemma also can be described in an *extensive form,* which involves displaying the various moves of the players, as well as the payoffs, using a *game tree,* as is shown in Figure 2.

Perhaps the most important application of the Prisoner's Dilemma is to increase one's understanding of the role of market competition in promoting efficiency, growth, and material wealth. Although traditional economic theory posits the ability of markets to "get the prices right" and thus achieve allocational efficiency, a more important effect of market competition is to subject producers in the same industry to Prisoner's Dilemma–like situations in which mutual defection means that each producer chooses to produce high quality at a low price.

Consider, for instance, an industry with two firms. If they cooperate, they will choose a common price that maximizes total profits (the so-called *monopoly price*) and split total sales. However, each has an incentive to undercut the other's price to increase its own profits by taking sales away from the other. Thus, each producer will "defect" by charging the competitive price no matter what the other producer does (Gintis 2000). In effect, market competition, at least when it is working properly, *disciplines* producers, forcing them to act in the public interest.
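The duopoly story can be cast in the same *T* > *R* > *P* > *S* form. The sketch below uses made-up per-firm profits (all numbers are illustrative assumptions, not from the text) to show that undercutting is a dominant strategy.

```python
# Illustrative duopoly pricing as a Prisoner's Dilemma.
# "monopoly" = cooperate (charge the collusive price);
# "competitive" = defect (undercut). All profit figures are assumed.
profits = {
    # (firm A's move, firm B's move): (A's profit, B's profit)
    ("monopoly", "monopoly"): (50, 50),        # collude, split monopoly profit (R)
    ("monopoly", "competitive"): (0, 80),      # B undercuts A (S for A, T for B)
    ("competitive", "monopoly"): (80, 0),      # A undercuts B (T for A, S for B)
    ("competitive", "competitive"): (20, 20),  # both charge competitive price (P)
}

T, R, P, S = 80, 50, 20, 0
assert T > R > P > S  # the Prisoner's Dilemma ordering holds

# Undercutting is dominant: whatever firm B does, firm A profits more
# by charging the competitive price.
for b_move in ("monopoly", "competitive"):
    a_payoff = {a: profits[(a, b_move)][0] for a in ("monopoly", "competitive")}
    assert max(a_payoff, key=a_payoff.get) == "competitive"
```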

## The Public Goods Game

When there are *n* players, the Prisoner's Dilemma is known as the Public Goods Game, which is described as follows. Suppose a team of *n* players can each contribute an amount *b* to the group at a cost *c* < *b* to each contributor. Each player decides independently of the others whether to cooperate (contribute) or defect (not contribute). Suppose at the end of the game the *n* players share their proceeds equally. Then if *m* of the players cooperated, each cooperator will earn *mb/n − c*, whereas each defector will earn *mb/n*. To see whether it pays to cooperate, consider one of the players, say, player A, and assume that *m* − 1 other players cooperate. By cooperating, player A earns *mb/n − c*, whereas by defecting, player A earns (*m* − 1)*b/n*. Comparing these two quantities, one can see that cooperating pays off more than defecting does precisely when *b* > *nc*. That is, a self-interested player A will cooperate only if A's share of the *b* that A contributes to the group, which is *b/n*, is greater than A's cost *c*. If *n* = 2 and *b* > *c* > *b*/2, the Public Goods Game becomes a Prisoner's Dilemma in which *T* > *R* > *P* > *S* becomes *b*/2 > *b* − *c* > 0 > *b*/2 − *c*.
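The payoff algebra can be verified directly. The following is a minimal Python sketch; the specific values *n* = 2, *b* = 10, *c* = 7 are chosen only to satisfy *b* > *c* > *b*/2.

```python
# Payoffs in the n-player Public Goods Game: with m cooperators,
# the pool m*b is shared equally, and each contributor also pays cost c.
def payoffs(n, b, c, m):
    """Return (payoff to each cooperator, payoff to each defector)."""
    return (m * b / n - c, m * b / n)

# Cooperating yields m*b/n - c versus (m-1)*b/n for defecting, so a
# self-interested player cooperates only when b/n > c, i.e., b > n*c.
def cooperation_pays(n, b, c):
    return b / n > c

# With n = 2 and b > c > b/2, the game is a Prisoner's Dilemma:
n, b, c = 2, 10, 7           # assumed numbers satisfying b > c > b/2
T = payoffs(n, b, c, 1)[1]   # defect while partner cooperates: b/2 = 5
R = payoffs(n, b, c, 2)[0]   # both cooperate: b - c = 3
P = payoffs(n, b, c, 0)[1]   # both defect: 0
S = payoffs(n, b, c, 1)[0]   # cooperate alone: b/2 - c = -2
assert T > R > P > S
```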

The Public Goods Game was made famous by Garrett Hardin (1915–2003), whose article "The Tragedy of the Commons" (1968) argued that all people have a collective interest in maintaining the natural environment, yet if all people are self-interested, each one will overexploit the environment, even though each one hopes that others will act to preserve the environment. For instance, if ten fishers share a lake, the number of tons of fish that can be harvested season after season (the so-called *sustainable yield* of the lake) may be 1,000, which is 100 tons per fisher. However, each individual fisher may prefer to take 200 tons even if this endangers the yields in future years. In this case, cooperate means "take 100 tons of fish" and defect means "take 200 tons of fish." A fisher who is self-interested will hope others cooperate but will defect no matter what the other fishers do.

Other examples of social situations that can be couched as *n*-player Prisoner's Dilemmas are (a) pollution, in which each firm hopes the others cooperate (refrain from polluting a river) but defects no matter what the others do; (b) population control, in which each family hopes the other families limit the number of children they bear but bears as many children as it can no matter what the others do; (c) community participation, in which all benefit when all contribute to community projects (schools, roads, public parks, and gardens) but each community member would rather stay home and let the others do the work; and (d) a situation in which a group of farmers share irrigated water; each gains from diverting a large amount of water from their common pool, but all benefit when the water is used in moderation.

Perhaps the most important aspect of the Prisoner's Dilemma is that empirical investigation shows that in real life communities have a variety of resources available to moderate the use of the commons in a reasonable way (Yamagishi 1986; Ostrom 1990). Both state control and privatization of common resources have been advocated, but neither the state nor the market has been uniformly successful in solving common pool resource problems. This is the case because state officials have priorities that often conflict with those of the local resource users and because privatization often concentrates power and wealth in the hands of the individual or group to which the common goods are assigned.

In contrast to the proposition of the tragedy of the commons argument, common pool problems sometimes are solved by voluntary organizations rather than by a coercive state. Among those cases are communal tenure in meadows and forests, irrigation communities and other water rights, and fisheries. These cases often involve local self-organizing regimes that rely on implicit or explicit principles, norms, rules, and procedures rather than the command and control of a central authority.

If agents were truly self-interested, it is not clear how such self-organization could work effectively. However, the fact is that when people play the Prisoner's Dilemma in the laboratory for real money, they very often prefer to cooperate rather than defect as long as their partners cooperate as well (Kiyonari, Tanida, and Yamagishi 2000). Thus, people are generally not well described by the self-interest principle, a fact that has opened up a new research area in human behavior in recent years (Gintis, Bowles, Boyd, and Fehr 2004). This human tendency to cooperate lies at the root of self-organized solutions to common pool resource problems.

## Ethical Implications

The Prisoner's Dilemma has important implications for ethical theory. It shows, for instance, that the philosopher Immanuel Kant's (1724–1804) categorical imperative is at best highly ambiguous and at worst fatally flawed. The categorical imperative states that one ought to "act according to that maxim which the actor would at the same time will to become a universal law" (*Critique of Practical Reason,* 1788). In the Prisoner's Dilemma each party would prefer that cooperating were a universal law because in that case the mutually desired outcome would be attained.

However, only in very special cases do players coordinate on the mutual cooperation outcome, and almost never does the duty to cooperate seem to be a defensible ethical commitment. For instance, producers in the same industry who cooperate on Kantian grounds would harm a market economy by colluding to maximize profits at the expense of the public. Similarly, if a person believes that his or her partner will defect, the first person nevertheless is obliged by the categorical imperative to cooperate. Although cooperating in this case may be a nice thing to do ("turn the other cheek"), it would be difficult to defend as a moral duty.

Of course, Kantian ethics is not the only ethical theory that is compromised by game theory in general and by the Prisoner's Dilemma in particular. Utilitarianism, for instance, prescribes that each person act to maximize the sum of utilities. In the case of the Prisoner's Dilemma this means that each player should cooperate no matter what the other player does, which also lacks plausibility as a general ethical principle.

HERBERT GINTIS

SEE ALSO *Decision Theory; Game Theory; Rational Choice Theory*.

## BIBLIOGRAPHY

Gintis, Herbert. (2000). *Game Theory Evolving.* Princeton, NJ: Princeton University Press.

Gintis, Herbert; Samuel Bowles; Robert Boyd; and Ernst Fehr. (2004). *Moral Sentiments and Material Interests: On the Foundations of Cooperation in Economic Life.* Cambridge, MA: MIT Press.

Hardin, Garrett. (1968). "The Tragedy of the Commons." *Science* 162: 1243–1248. Hardin's influential argument is expanded in John A. Baden and Douglas S. Noonan, eds., *Managing the Commons,* 2nd edition. (Bloomington: Indiana University Press, 1998), originally edited by Hardin and Baden (San Francisco, Freeman, 1977).

Kiyonari, Toko; Shigehito Tanida; and Toshio Yamagishi. (2000). "Social Exchange and Reciprocity: Confusion or a Heuristic?" *Evolution and Human Behavior* 21: 411–427.

Ostrom, Elinor. (1990). *Governing the Commons: The Evolution of Institutions for Collective Action.* Cambridge, UK: Cambridge University Press.

Poundstone, William. (1992). *Prisoner's Dilemma.* New York: Doubleday.

Yamagishi, Toshio. (1986). "The Provision of a Sanctioning System as a Public Good." *Journal of Personality and Social Psychology* 51: 110–116.
