Rationality

Rationality in its ordinary sense is reasonableness. It requires justified beliefs and sensible goals as well as judicious decisions. Scholars study rationality in many ways and adopt diverse views about it.

Some theorists adopt a technical definition of rationality, according to which it is just maximization of utility. This definition is too narrow. It considers only adoption of means to reach ends, that is, instrumental rationality. It also evades a major normative question, namely, whether rationality requires maximization of utility. The definition simply stipulates an affirmative answer.

A traditional theory of the mind takes reason as a mental faculty. It characterizes humans as rational animals because they have the faculty of reason, whereas other animals lack that faculty. According to this tradition, any behavior resulting from reasoning is rational. This account of rationality sets the bar low. Most accounts hold that the products of reasoning must meet certain standards before they qualify as rational. A conclusion, for example, must fit the evidence to be rational. It is not rational simply because it results from an inference. Reasoning must be good to yield rational beliefs reliably.

For simplicity, some theorists take being rational to be the same as being self-interested. Being rational differs from being self-interested, however. Promoting self-interest means doing what is good for oneself. Doing what is good for others promotes their interests, not one's own. Rationality may require some measure of self-interestedness but does not require exclusive attention to self-interest. It permits altruism, as Amartya Sen (1977) and Howard Rachlin (2002) explain.

Epistemologists treat justified belief. Under one interpretation, a justified belief is just a rational belief. However, other interpretations of justification are common because a conventional view takes knowledge to be true, justified belief. Making justification fit into that view of knowledge motivates taking justified belief to differ from rational belief. Children rationally believe many true propositions without having knowledge of them because the grounds for their beliefs do not amount to justification.

Rationality is a normative concept. Principles of rationality state how people should behave rather than how they behave. However, some fields assume that people are rational by and large and then use principles of rationality to describe and explain behavior. For instance, some economic theories assert that consumers make purchases that express their preferences. They take this as a fact about consumers' behavior rather than as a norm for it. Psychologists seeking to infer a person's beliefs and desires from the person's behavior may assume that behavior maximizes utility. The assumption simplifies inference of beliefs and desires. Several prominent representation theorems show that if a person's preferences concerning acts meet certain axioms, such as transitivity, then one may infer the person's probability and utility assignments (given a choice of scale) from the person's preferences, under the assumption that preferences concerning acts agree with their expected utilities. Richard Jeffrey ([1965] 1983) presents a theorem of this sort.
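
A minimal sketch can illustrate the idea behind such inferences, though not the representation theorems themselves. It assumes that preferences agree with expected utilities and, purely for illustration, that utility is linear in the small money amounts involved; the function name and numbers are hypothetical.

```python
# Illustrative sketch (not Jeffrey's theorem itself): inferring a subjective
# probability from a stated indifference, assuming preferences agree with
# expected utilities and that utility is linear in these small money amounts.

def inferred_probability(sure_amount, prize_if_event, prize_otherwise=0.0):
    """If an agent is indifferent between sure_amount for certain and a gamble
    paying prize_if_event when event E holds (else prize_otherwise), then under
    the stated assumptions
        U(sure_amount) = p(E) * U(prize_if_event) + (1 - p(E)) * U(prize_otherwise),
    which with linear utility solves to the ratio below."""
    return (sure_amount - prize_otherwise) / (prize_if_event - prize_otherwise)

# Example: indifference between $40 for sure and "$100 if E, else $0"
# reveals a degree of belief of 0.4 in E, given the assumptions.
print(inferred_probability(40, 100))  # 0.4
```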

PHILOSOPHY AND RATIONALITY

Philosophy treats rationality because it is the most important normative concept besides morality. Understanding how a person should conduct her or his life requires a thorough understanding of rationality. Being a rational person requires being sufficiently rational in the various aspects of one's life. Common principles of rationality attend to beliefs and desires and to the decisions they yield. Some principles evaluate character traits and emotions. They judge, for example, that some fears are rational and that others are irrational. Principles of rationality extend from individuals to groups. Committees may pass rational or irrational resolutions. Political philosophy evaluates social contracts for rationality. Mancur Olson (1965) investigates rational collective action and the conditions that foster it.

A traditional metaphysical question asks for the grounds of principles of rationality. What makes consistency a requirement of rationality? Are the principles' grounds conventions or something more universal? A common answer claims that natural properties realize normative properties; consistency, for example, increases prospects for true beliefs.

A traditional practical question asks for reasons to be rational. A common answer is that being rational yields the best prospects (with respect to one's evidence) for meeting one's goals and so achieving a type of success. Decisions that maximize expected utility are more likely to be successful than decisions that do not maximize expected utility.

Some philosophers hope to derive principles of morality from principles of rationality. Kantians, for example, hold that a rational person acts in accord with moral principles. Hobbesians hold that the legitimacy of a government derives from the rationality of the social contract that creates it. Rawlsians hold that principles of justice emerge from a hypothetical social contract rational to adopt given ignorance of one's position in society.

BELIEF AND INFERENCE

A general principle states that rationality is attainable. Its attainability follows from the familiar principle that ought implies can. Well-established principles of rationality also treat formation of belief and inference. Consistency is a noncontroversial requirement. Holding inconsistent beliefs is irrational unless some extenuating factor, such as the difficulty of spotting the inconsistency, provides an excuse. Perceptual beliefs are rational when the processes producing them are reliable. Vision in good light yields reliable judgments about the colors of objects. Logic describes in meticulous detail patterns of inference that are rational. For example, if one believes a conditional and believes its antecedent, then believing its consequent is a rational conclusion. Repeated application of rules of inference to prove theorems in logic requires sophistication that ordinary rationality does not demand. Rationality requires an ideal agent to believe each and every logical consequence of her or his beliefs. Its requirements for real people are less demanding. (For a sample of principles of rationality concerning belief, see Foley 1993; Halpern 2003; and Levi 2004.)

Rationality governs both deductive and inductive inference. Principles of statistical reasoning express principles of rational inductive inference. If one knows that an urn has exactly eighty red balls and twenty black balls, then it is rational to conclude that 80 percent is the probability that a random draw will yield a red ball. Given a statistical sample drawn from a population, principles of statistical inference attending to the size of the sample and other factors state reasonable conclusions about the whole population.
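
A small sketch can illustrate how such principles attend to sample size. It uses one familiar rule, the normal-approximation confidence interval for a proportion, which is only one of many statistical methods and stands in here for the principles the article mentions; the numbers are illustrative.

```python
import math

def proportion_interval(successes, n, z=1.96):
    """Approximate 95% confidence interval for a population proportion,
    using the normal approximation; larger samples give narrower intervals."""
    p_hat = successes / n
    half_width = z * math.sqrt(p_hat * (1 - p_hat) / n)
    return max(0.0, p_hat - half_width), min(1.0, p_hat + half_width)

# Drawing 80 red balls in 100 draws supports a narrower conclusion about the
# urn's composition than drawing 8 red balls in 10 draws.
print(proportion_interval(8, 10))    # roughly (0.55, 1.00)
print(proportion_interval(80, 100))  # roughly (0.72, 0.88)
```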

PREFERENCES

Preferences may arise from partial or complete consideration of relevant reasons. Common principles of rational preference apply to preferences held all things considered.

The principle of transitivity requires preferring A to C given that one prefers A to B and also prefers B to C. The principle of coherence requires having preferences among acts that may be represented as maximizing expected utility. The definition of preference affects the force of such principles. The ordinary sense of preference acknowledges the possibility of weakness of will and acting contrary to preferences. However, some theorists, for technical reasons, define preferences so that a person acts according to them; then telling a person to pick an option from the top of her or his preference ranking of options has no normative force, because she or he does that by stipulation.
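
As a minimal sketch, transitivity can be checked mechanically over a finite set of options; the cyclic preferences below are a hypothetical example of a violation.

```python
from itertools import permutations

def violates_transitivity(prefers, items):
    """Return a triple (a, b, c) with a preferred to b, b to c, but not a to c,
    if one exists; otherwise return None. prefers(x, y) means x is strictly
    preferred to y."""
    for a, b, c in permutations(items, 3):
        if prefers(a, b) and prefers(b, c) and not prefers(a, c):
            return (a, b, c)
    return None

# A cyclic preference A > B, B > C, C > A violates transitivity.
cycle = {("A", "B"), ("B", "C"), ("C", "A")}
print(violates_transitivity(lambda x, y: (x, y) in cycle, ["A", "B", "C"]))
```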

The principle of consumer sovereignty puts basic preferences beyond criticism. Some basic preferences are irrational, however. Forming preferences among ice cream flavors one has not tasted may be irrational. Having a pure time preference may be irrational. That is, it may be irrational to prefer the smaller of two goods just because it will arrive sooner than the larger good. Certainty of having the larger good if one waits for it is a strong reason for waiting.

The chief principle of rational decision making is to pick an option from the top of one's preference ranking of options. If some options are gambles, a supplementary principle says to prefer one option to another option just in case its expected utility is higher than the expected utility of the other option. J. Howard Sobel (1994) and Paul Weirich (2001) analyze such principles of rational choice.
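
A short sketch of the supplementary principle follows, assuming for illustration that utility is linear in money; the options and numbers are hypothetical, and a risk-averse utility function could reverse the verdict.

```python
def expected_utility(gamble, utility):
    """A gamble is a list of (probability, outcome) pairs."""
    return sum(p * utility(outcome) for p, outcome in gamble)

# Illustrative options; utility is taken to be linear in money here,
# which is an assumption, not part of the principle itself.
options = {
    "sure $3,000": [(1.0, 3000)],
    "80% chance of $4,000": [(0.8, 4000), (0.2, 0)],
}
best = max(options, key=lambda name: expected_utility(options[name], lambda x: x))
print(best)  # "80% chance of $4,000" under linear utility
```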

DECISION MAKING

Rationality evaluates free acts that an agent fully controls. Decisions are in this category; so are acts such as taking a walk. Rationality evaluates acts an agent controls directly by comparing them with rivals and evaluates acts an agent controls indirectly by evaluating their components. An agent directly controls a decision, and so rationality evaluates it by comparing it with its rivals. An agent indirectly controls taking a walk, and so rationality evaluates it by evaluating its directly controlled components. The rationality of a series of acts, such as having dinner and going to a movie, depends on the rationality of its temporal components.

Game theory, expounded in classic texts by John von Neumann and Oskar Morgenstern (1944) and R. Duncan Luce and Howard Raiffa (1957), addresses decisions people make in contexts where the outcome of one person's decision depends on the decisions that other people make. Strategic reasoning looks for combinations of decisions that form an equilibrium in the sense that each decision is rational given the other decisions. A common principle for such strategic situations recommends making a decision that is part of an equilibrium combination of decisions. Edward McClennen (1990), Robert Stalnaker (1998, 1999), Weirich (1998), Andrew Colman (2003), and Michael Bacharach (2006) conduct critical appraisals of principles of rationality widespread in game theory.
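
A minimal sketch of the equilibrium idea for pure strategies in a two-player, two-strategy game; the payoff numbers form a familiar prisoner's dilemma and are purely illustrative.

```python
# Payoffs (row player, column player) for a two-strategy game; the numbers
# here form a prisoner's dilemma and are purely illustrative.
payoffs = {
    ("cooperate", "cooperate"): (3, 3),
    ("cooperate", "defect"):    (0, 5),
    ("defect",    "cooperate"): (5, 0),
    ("defect",    "defect"):    (1, 1),
}
strategies = ["cooperate", "defect"]

def pure_equilibria(payoffs, strategies):
    """A pair of decisions is an equilibrium when each is a best reply to the other."""
    equilibria = []
    for r in strategies:
        for c in strategies:
            row_best = all(payoffs[(r, c)][0] >= payoffs[(r2, c)][0] for r2 in strategies)
            col_best = all(payoffs[(r, c)][1] >= payoffs[(r, c2)][1] for c2 in strategies)
            if row_best and col_best:
                equilibria.append((r, c))
    return equilibria

print(pure_equilibria(payoffs, strategies))  # [('defect', 'defect')]
```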

A principle of rationality may be controversial. A common pattern for controversy begins with a claim that in some cases thoughtful people fail to comply with the principle. Some respond that in those cases people are rational and the principle is faulty. Others respond that the principle is fine and people are irrational. Still others hold that people in the problem cases actually comply with the principle, contrary to the initial claim.

For example, Amos Tversky and Daniel Kahneman (1982) present cases in which people form judgments that fail to comply with the probability axioms. In their study a story describes a young woman as a political activist and a college graduate with a philosophy major. People asked to speculate about the woman's current activities may put the probability of her being a feminist and a bank teller higher than the probability of her being a bank teller only. This ignores the law that the probability of a conjunction cannot be higher than the probability of a conjunct. Some theorists may conclude that people are irrational in their probability judgments, others that people have in mind the probability that the woman is a feminist given that she is a bank teller rather than the probability of the conjunction that she is a feminist and is a bank teller. In this particular example, few dispute the law of probability concerning conjunctions.
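
Writing F for the proposition that the woman is a feminist and B for the proposition that she is a bank teller, the law in question and the contrasting conditional reading can be stated compactly:

```latex
P(F \wedge B) = P(F \mid B)\, P(B) \le P(B),
\qquad \text{whereas } P(F \mid B) \text{ may well exceed } P(B).
```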

Kahneman and Tversky (1979) also present cases in which it seems that people fail to comply with the principle to maximize expected utility. A person may prefer a gamble that pays a guaranteed $3,000 to a gamble that pays $4,000 with probability 80 percent and $0 with probability 20 percent. The same person may prefer a gamble that pays $4,000 with probability 20 percent and $0 with probability 80 percent to a gamble that pays $3,000 with probability 25 percent and $0 with probability 75 percent. Let U stand for utility. If the first preference agrees with expected utilities, it seems that U($3,000) > 0.80 U($4,000). If the second preference agrees with expected utilities, it seems that 0.20 U($4,000) > 0.25 U($3,000) and hence, multiplying both sides by 4, that 0.80 U($4,000) > U($3,000). Because the inequalities for the two preferences are inconsistent, it seems impossible that both preferences agree with expected utilities.
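
A small numerical check, taking U($0) = 0 and scanning a grid of candidate utility values, confirms that the two inequalities cannot hold together; the grid and the normalization are illustrative choices.

```python
# Check that no assignment of positive utilities satisfies both inequalities
# derived above, taking U($0) = 0 and scanning a grid of candidate values.
def both_hold(u3000, u4000):
    first = u3000 > 0.80 * u4000          # from the first preference
    second = 0.20 * u4000 > 0.25 * u3000  # from the second preference
    return first and second

solutions = [
    (u3, u4)
    for u3 in [i / 100 for i in range(1, 101)]
    for u4 in [i / 100 for i in range(1, 101)]
    if both_hold(u3, u4)
]
print(solutions)  # [] -- the second inequality is equivalent to u3000 < 0.80 * u4000
```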

One response is to reject the principle of expected utility maximization. Another response denies the rationality of having the pair of preferences. A third response claims that people care about factors besides monetary outcomes. They may, for instance, value certainty and the elimination of risk. Then the pair of preferences may agree with broadly based expected utilities without implying inconsistent inequalities.

RATIONAL CHOICE THEORY

Rational choice theory uses principles of rationality to explain behavior. The social and behavioral sciences and even literary interpretation employ it. Proponents claim that rational choice theory yields insightful analyses using simple principles of rational behavior. Critics claim that those simple principles are too austere to adequately characterize human behavior. This debate turns on the principles of rationality at issue. Some rational choice theorists may use only principles of instrumental rationality. In that case, evaluation of basic goals is missing. Other rational choice theorists use more comprehensive principles of rationality to extend the theory's scope. They provide for principles that evaluate basic goals.

Various applications of rationality yield distinct types of rationality, such as bounded, procedural, and substantive rationality. Herbert Simon (1982) is famous for treating these types of rationality. Principles of bounded rationality set standards for people and other nonideal agents with limited cognitive power. Contrasting principles set high standards for ideal agents with unlimited cognitive power. Rationality may require ideal agents to maximize utility, whereas it requires real people to satisfice, that is, to adopt the first satisfactory option discovered. The principle to satisfice is a principle of procedural rationality because it recommends a procedure for making a decision and does not characterize the content of the decision it recommends. A substantive principle may recommend making a decision that maximizes utility. Whether a decision maximizes utility depends on its content. It depends on the prospects of acting according to the decision. Compliance with a substantive principle of rationality, such as utility maximization, may require a procedure that is more trouble than its outcome justifies. Spending hours to make a move in a chess game may sap the game's fun. Sometimes thorough calculation is too costly, and one should make a quick decision. It may be sensible to adopt the first satisfactory course of action that comes to light instead of running through all options, calculating and comparing their utilities.
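
A minimal sketch of the contrast between the two kinds of rule; the options, their assigned worths, and the aspiration level are hypothetical.

```python
def satisfice(options, evaluate, aspiration):
    """Adopt the first option whose evaluated worth meets the aspiration level;
    a procedural rule that may stop well short of examining every option."""
    for option in options:
        if evaluate(option) >= aspiration:
            return option
    return None  # no satisfactory option found

def maximize(options, evaluate):
    """Examine every option and adopt one with the highest evaluated worth;
    a substantive standard that ignores the cost of the search itself."""
    return max(options, key=evaluate)

moves = ["develop a knight", "trade queens", "sacrifice a rook", "castle early"]
worth = {"develop a knight": 6, "trade queens": 5, "sacrifice a rook": 9, "castle early": 7}
print(satisfice(moves, worth.get, aspiration=6))  # "develop a knight"
print(maximize(moves, worth.get))                 # "sacrifice a rook"
```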

An evaluator may apply a substantive principle to an act already performed. The principle judges the act without regard for the process that produced it. The principle of utility maximization gives an optimal option high marks whether it arose from a thorough or a hasty review of options. The principle evaluates the option adopted and not the method of its adoption. In contrast, an agent applies a procedural principle to discover an act to perform. A rational procedure may culminate in an act that is not optimal. Rationality does not require calculating in all cases. In many cases, weighing pros and cons, computing utilities, and comparing all options is not a rational way to make a decision; spontaneity may be appropriate. A rational decision procedure takes account of circumstances. Brian Skyrms (1990), Ariel Rubinstein (1998), Gerd Gigerenzer (2000), Gigerenzer and Reinhard Selten (2000), Weirich (2004), and John Pollock (2006) pursue these themes.

Principles of rationality vary in the scope of their evaluations of acts. Some principles evaluate a decision for instrumental rationality, taking for granted the beliefs and desires that generate it. Others evaluate the beliefs and desires along with the decision. Principles of rationality also adopt conditions. A principle may evaluate a decision, assuming unlimited time and cognitive resources for reaching it. Idealizations play a crucial role by generating an initial theory with simplified principles of rationality. Relaxing idealizations later leads to more general principles and to a more realistic theory.

Principles of conditional rationality also provide a way of putting aside mistakes. A person's act may be rational given his or her beliefs, though his or her beliefs are mistaken and if corrected would support a different act. Evaluating his or her act for nonconditional rationality requires a complex assessment of the significance of the mistaken beliefs. Conditional rationality has an interesting structure resembling the structure of conditional probability. The rationality of an act given a background feature is not the rationality of the conditional that if the background feature holds then the act is performed. Nor is it the truth of the conditional that if the background feature holds then the act is rational.

Theoretical rationality treats belief formation, and practical rationality treats action. A theory of practical reasoning formulates rules of inference, leading to a conclusion that an act should be performed. It classifies reasons for acts and assesses their force. (For a survey, see Parfit 1984; Bratman 1987; Broome 2001; and Bittner 2001.)

Some arguments that degrees of belief should conform with the probability axioms point out that failure to comply leaves one open to a series of bets that guarantees a loss, that is, a Dutch book. This observation yields pragmatic reasons for compliance with the axioms. Some theorists hold that the probability axioms require a purely epistemic justification.
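
A minimal illustration of the pragmatic argument: suppose an agent's degrees of belief in a proposition and in its negation sum to more than 1, and the agent treats a bet on a proposition as fairly priced at degree of belief times stake. The numbers below are hypothetical.

```python
# Minimal Dutch book illustration. The agent's degrees of belief in a
# proposition A and in not-A are each 0.6 (they sum to 1.2, violating the
# axioms), and the agent accepts any bet priced at belief * stake as fair.
belief_A, belief_not_A, stake = 0.6, 0.6, 1.0

price_paid = (belief_A + belief_not_A) * stake  # the agent buys both bets: 1.2
for a_is_true in (True, False):
    payoff = stake  # exactly one of the two bets pays off, whichever way A turns out
    print(f"A is {a_is_true}: net outcome = {payoff - price_paid:+.2f}")
# Prints a guaranteed loss of 0.20 in both cases.
```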

The principle to maximize expected utility uses probability, and so there are grounds for holding that probability is not purely epistemic and that its axioms do not need a purely epistemic justification. In contrast, probability's role in assessing an option's prospects requires that it represent only the strength of evidence. If it is sensitive to an agent's goals, even cognitive goals, then using it to calculate an option's expected utility counts the agent's goals twice: one time in calculating the utilities of the option's possible outcomes and a second time in calculating the probabilities of the possible outcomes. A purely epistemic justification of the probability axioms may be required given probability's role in the calculation of an option's expected utility. It may be required because of probability's role as a guide to action.

Studies of rationality are multidisciplinary because several fields have a stake in their outcomes. Progress with theories of rationality is broadly rewarding, and many scholars are contributing.

SEE ALSO Altruism; Behavior, Self-Constrained; Collective Action; Economics, Experimental; Epistemology; Expected Utility Theory; Game Theory; Information, Economics of; Kant, Immanuel; Logic; Maximization; Minimization; Morality; Optimizing Behavior; Philosophy; Probability Theory; Psychology; Random Samples; Rawls, John; Risk; Sen, Amartya Kumar; Simon, Herbert A.; Social Contract; Theory of Mind; Utility, Von Neumann-Morgenstern

BIBLIOGRAPHY

Bacharach, Michael. 2006. Beyond Individual Choice: Teams and Frames in Game Theory. Eds. Natalie Gold and Robert Sugden. Princeton, NJ: Princeton University Press.

Bittner, Rüdiger. 2001. Doing Things for Reasons. Oxford: Oxford University Press.

Bratman, Michael. 1987. Intention, Plans, and Practical Reason. Cambridge, MA: Harvard University Press.

Broome, John. 2001. Normative Practical Reasoning. Proceedings of the Aristotelian Society 75 (supp.): 175-193.

Colman, Andrew. 2003. Cooperation, Psychological Game Theory, and Limitations of Rationality in Social Interaction. Behavioral and Brain Sciences 26: 139-198.

Foley, Richard. 1993. Working without a Net. New York: Oxford University Press.

Gigerenzer, Gerd. 2000. Adaptive Thinking: Rationality in the Real World. New York: Oxford University Press.

Gigerenzer, Gerd, and Reinhard Selten. 2000. Rethinking Rationality. In Bounded Rationality: The Adaptive Toolbox, eds. Gerd Gigerenzer and Reinhard Selten, 1-12. Cambridge, MA: MIT Press.

Halpern, Joseph. 2003. Reasoning about Uncertainty. Cambridge, MA: MIT Press.

Jeffrey, Richard. [1965] 1983. The Logic of Decision. 2nd ed. Chicago: Chicago University Press.

Kahneman, Daniel, and Amos Tversky. 1979. Prospect Theory: An Analysis of Decision under Risk. Econometrica 47: 263-291.

Levi, Isaac. 2004. Mild Contraction: Evaluating Loss of Information due to Loss of Belief. Oxford: Clarendon.

Luce, R. Duncan, and Howard Raiffa. 1957. Games and Decisions: Introduction and Critical Survey. New York: Wiley.

McClennen, Edward. 1990. Rationality and Dynamic Choice: Foundational Explorations. Cambridge, U.K.: Cambridge University Press.

Mele, Alfred, and Piers Rawling, eds. 2004. The Oxford Handbook of Rationality. New York: Oxford University Press.

Olson, Mancur. 1965. The Logic of Collective Action: Public Goods and the Theory of Groups. Cambridge, MA: Harvard University Press.

Parfit, Derek. 1984. Reasons and Persons. Oxford: Clarendon.

Pollock, John. 2006. Thinking about Acting: Logical Foundations for Rational Decision Making. New York: Oxford University Press.

Rachlin, Howard. 2002. Altruism and Selfishness. Behavioral and Brain Sciences 25: 239-296.

Rescher, Nicholas. 1988. Rationality: A Philosophical Inquiry into the Nature and the Rationale of Reason. Oxford: Clarendon.

Resnik, Michael. 1987. Choices. Minneapolis: University of Minnesota Press.

Rubinstein, Ariel. 1998. Modeling Bounded Rationality. Cambridge, MA: MIT Press.

Sen, Amartya. 1977. Rational Fools. Philosophy and Public Affairs 6: 317-344.

Simon, Herbert. 1982. Behavioral Economics and Business Organization. Vol. 2 of Models of Bounded Rationality. Cambridge, MA: MIT Press.

Skyrms, Brian. 1990. The Dynamics of Rational Deliberation. Cambridge, MA: Harvard University Press.

Sobel, J. Howard. 1994. Taking Chances: Essays on Rational Choice. Cambridge, U.K.: Cambridge University Press.

Stalnaker, Robert. 1998. Belief Revision in Games: Forward and Backward Induction. Mathematical Social Sciences 36: 31-56.

Stalnaker, Robert. 1999. Knowledge, Belief, and Counterfactual Reasoning in Games. In The Logic of Strategy, eds. Cristina Bicchieri, Richard Jeffrey, and Brian Skyrms, 3-38. New York: Oxford University Press.

Tversky, Amos, and Daniel Kahneman. 1982. Judgments of and by Representativeness. In Judgment under Uncertainty: Heuristics and Biases, eds. Daniel Kahneman, Paul Slovic, and Amos Tversky, 84-98. Cambridge, U.K.: Cambridge University Press.

Von Neumann, John, and Oskar Morgenstern. 1944. Theory of Games and Economic Behavior. Princeton, NJ: Princeton University Press.

Weirich, Paul. 1998. Equilibrium and Rationality: Game Theory Revised by Decision Rules. Cambridge, U.K.: Cambridge University Press.

Weirich, Paul. 2001. Decision Space: Multidimensional Utility Analysis. Cambridge, U.K.: Cambridge University Press.

Weirich, Paul. 2004. Realistic Decision Theory: Rules for Nonideal Agents in Nonideal Circumstances. New York: Oxford University Press.

Paul Weirich
