Unintended Consequences


Human activities often produce consequences very different from those intended. Indeed, this is a theme of classical tragedy and of much premodern argument about the indeterminacy of human affairs. Sociologist Robert K. Merton was one of the first to subject "The Unanticipated Consequences of Purposive Social Action" (1936) to systematic analysis, noting the influences of the need to act in spite of uncertainties, the allocation of scarce resources such as time and energy, and how personal interests shape perspectives and decisions. Advances in science and technology seem particularly likely to change the world in unanticipated ways. Innovations are by definition something new and are likely to involve unknowns. Innovations may be used in unplanned ways that trigger surprising results. The more complex a system, the harder it is to anticipate its effects. Unintended consequences can shift the cost-benefit analysis of a new technology, theory, or policy; distribute costs and benefits inequitably; or lead to other direct or indirect social problems. Such consequences raise questions of responsibility and liability; decision making under uncertainty; equity and justice; and the role of individual citizens, corporations, universities, and governments in managing science and technology.


Types of Unintended Consequences

Unintended consequences occur in many forms, although the categories are neither entirely discrete nor universally recognized. Accidents are usually immediate and obvious, and result from problems such as mechanical failure or human error; the disastrous 1986 explosions, fires, and releases of radiation at the nuclear reactor in Chernobyl, Ukraine (then part of the Soviet Union), are a classic example.


Side effects are additional, unanticipated effects that occur along with intended effects, such as gastrointestinal irritation resulting from aspirin taken to relieve pain. Double effects, meaning simply two effects, often refer to simultaneous positive and negative effects, as in the aspirin example. Many medical side effects are well documented, such as the devastating effects of diethylstilbestrol (DES) and thalidomide and the ability of bacteria to develop resistance to antibiotics (Dutton et al. 1988).


Surprises could apply to any unintended consequence, but the term is more specifically used, along with false alarms, to describe errors in prediction. A false alarm occurs when a predicted event fails to materialize, such as the feared millennium (Y2K) computer bug, whereas a surprise is an unexpected event, such as the 2004 Indian Ocean tsunami (Stewart 2000).

Henry N. Pollack (2003) refers to inadvertent experiments, in which human actions unwittingly allow and sometimes force society to consider the effects of its actions. He cites the hole in the ozone layer and climate change as classic examples. Historians of science and technology also have noted the occasional benefits of serendipity in both discovery and invention.


More provocatively, science and technology sometimes have the reverse of their intended effects. In the 1970s Ivan Illich (1973), among others, argued that scientific and technological development, after crossing a certain threshold, may exhibit a counterproductivity, producing new problems even as it solves old ones. Extending this notion into political theory, Ulrich Beck (1986) argues that unintended consequences in the form of boomerang effects are transforming politics into a concern for the just distribution not of goods but of risks.

With a more individualist focus, Edward Tenner identifies revenge effects as the "ironic unintended consequences of mechanical, chemical, biological, and medical ingenuity" or, more anthropomorphically, as "the tendency of the world around us to get even, to twist our cleverness against us" (Tenner 1997, p. 6). He further divides revenge effects into rearranging effects, which shift the locus or nature of a problem, such as urban air-conditioning making the outside air hotter; repeating effects, which have people "doing the thing more often rather than gaining time to do other things"; recomplicating effects, such as the annoying loops of voice mail systems; regenerating effects, in which a proposed solution such as pest control makes a situation worse; and recongesting effects, such as the human ability to clog space with debris from space explorations (Tenner 1997, p. 10).


Direct effects are those that occur fairly quickly, with no intervening factors. Indirect effects are likely to take longer to develop and may involve interactions with other factors; latent side effects also refer to impacts that occur later in time. Secondary effects are the next level of impacts resulting from direct effects; they generally impact people or places other than those a product or activity is intended to affect; these may also be called ripple effects. The secondary effects of smoking on non-smokers have been well documented. N-order effects are even more removed from the direct effects. Cumulative effects are additive. Combinations of substances, particularly pesticides or medicines, are sometimes called cocktail effects, especially in the United Kingdom. Interaction effects are those resulting from a combination of two or more factors that act on or influence each other to produce a result different from either acting alone.

The military uses the term collateral damage to describe injuries to people and property other than intended targets, such as the destruction of the Chinese Embassy during the 1999 North Atlantic Treaty Organization (NATO) bombing campaign in Yugoslavia. Civilian casualties are often framed as collateral damage because ethical principles of noncombatant immunity proscribe the deliberate injury of civilians.

Economists often refer to unintended consequences as externalities: "An action by either a producer or a consumer that affects other producers or consumers, yet is not accounted for in the market price" (Pindyck and Rubinfeld 1998, p. 696). Pollution is usually considered an externality, as its effects on human health, safety, and quality of life are often not factored into industrial costs. Externalities may require management, such as government-imposed regulations, subsidies, or market-based mechanisms, to prevent economic inefficiencies. Externalities such as pollution or hazardous wastes often impose unequal burdens on the poor or powerless, raising questions about equity and environmental justice.
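
The arithmetic behind this definition can be made concrete with a minimal sketch, shown below in Python with purely hypothetical figures and variable names: an external cost drives a wedge between the private cost reflected in the market price and the full social cost borne by society.

    # Hypothetical illustration of an externality (all figures invented).
    private_cost_per_unit = 10.0    # production cost reflected in the market price
    external_cost_per_unit = 3.0    # pollution-related harm borne by third parties, unpriced
    units_produced = 1000

    market_cost = private_cost_per_unit * units_produced
    social_cost = (private_cost_per_unit + external_cost_per_unit) * units_produced

    # The gap is the burden shifted onto others; a corrective tax or regulation
    # equal to the external cost per unit would "internalize" it into the price.
    unpriced_burden = social_cost - market_cost
    print(f"Market cost: {market_cost:,.0f}")
    print(f"Social cost: {social_cost:,.0f}")
    print(f"Unpriced burden on society: {unpriced_burden:,.0f}")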

Unintended consequences are different from unanticipated consequences, in which effects may be suspected or known to be likely but are not part of the intended outcome. Some anticipated consequences may be ignored if they interfere with the interests of decision makers or seem relatively minor; cumulative or interactive effects may make them more serious. Knowledge about effects, or effects that should have been anticipated, may be important in deciding who, if anyone, should be held legally, politically, or morally responsible for unintended outcomes.


Causes and Effects

Unintended consequences of science and technology can have many causes. Design flaws may lead to project failure. Materials may not meet expectations. Assumptions may prove incorrect.

Human factors frequently trigger unintended consequences. Human errors, sometimes interacting with technical failures and environmental stresses, often cause accidents, such as the 1984 release of poisonous gas from Union Carbide's pesticide plant in Bhopal, India (Jasanoff 1994). People often use science and technology in unexpected ways. What appears to be operator error may be the result of an overly complex or inherently unsafe technology. Additionally, safety measures such as seat belts may actually increase hazards as people compensate by taking more risks, illustrating a phenomenon known as risk homeostasis.

Unintended consequences may have social, economic, or behavioral as well as physical causes and impacts, especially when transferred from one culture to another. Anthropologists, for instance, have thoroughly documented the often unintentionally destructive outcomes of technology transfer across cultures (Spicer 1952). The movie The Gods Must Be Crazy (1981) depicts a comic version of this phenomenon. Effects may be catastrophic, even when the transfer is only from laboratory to marketplace.

Richard A. Posner (2004), for instance, distinguishes four types of catastrophe, all but one resulting from the unintended consequences of science and technology. The exception is a natural catastrophe. The other categories are accidents from the products of science and technology, such as particle accelerators or nanotechnology; unintended side effects of human uses of technology, such as global climate change; and the deliberate triggering of destruction made possible by dangerous innovations in science and technology, which can be considered technological terrorism. Posner also notes "the tendency of technological advance to outpace the social control of technology" (Posner 2004, p. 20), an instance of cultural lag.

Not all unintended consequences are bad; many innovations have beneficial side effects, and effects can be mixed. For example, 2004 studies on some pain relievers, such as Vioxx or Celebrex, suggest that they may reduce cancer risks while increasing risks of heart attacks. From the perspective of social scientist Michel de Certeau, creative, unintended uses may actually serve as a means for the assertion of human autonomy; using products in ways unintended by the designer is a way of resisting technological determination. Some writers see occasional benefits even in negative unintended consequences. Fikret Berkes and Carl Folke suggest that in some cases, "breakdown may be a necessary condition to provide the understanding for system change," although crisis cannot be allowed to reach the point where it imperils the survival of the system (Berkes and Folke 1998, p. 350). Complexity theorists have even argued that new forms of spontaneous order can emerge from unintended chaotic situations.


Managing Unintended Consequences

How should unintended consequences be managed? Some impacts may be avoided with more careful planning in the design and implementation of innovations, but many writers assume that unexpected negative consequences are inevitable, normal accidents (Perrow 1984), and advocate systems that either minimize such effects or try to manage them.

Unintended consequences often cross temporal and spatial boundaries. When effects cross physical or political barriers, unintended consequences raise questions about responsibility. Indeed, one ethical response to such technological changes in the scope and reach of human action is to argue for the articulation of a new imperative of responsibility (Jonas 1984). How does one country hold another responsible when pollution or other effects cross borders? This is a major question in climate change, where industrialized countries have been the major human source of greenhouse gases but developing countries are expected to suffer the most severe impacts, such as sea-level rise and more frequent and prolonged regional droughts. In some limited cases national tort law provides compensation for injuries caused by actions taking place outside the borders of the sovereign state. International law is even more problematic, since there is no sovereign providing enforcement, and countries must rely on their ability to reach international agreements to deal with novel and intractable problems such as the hole in the ozone layer.

Conventional methods of dealing with risk, such as insurance, legal remedies, and emergency procedures, were not designed to deal with the current spread of side effects. When effects occur much later in time they affect future generations, raising issues of intergenerational equity. Is it fair to leave a seriously degraded and hazardous world for future generations?

Three types of errors may be made at the more mundane level of managing unintended consequences (Tenner 1997). Type I errors are those where unnecessary preventive measures are taken, such as keeping a safe and effective product off the market. Type II errors occur when an important protective measure is not taken, such as allowing the use of a very harmful product. Type III errors involve displaced risks, new risks created by protective measures, such as the economic effects of unnecessary environmental regulations.
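
Read as a decision rule, this taxonomy can be sketched in a few lines of Python; the function and argument names below are hypothetical conveniences for illustration, not a formal scheme from Tenner.

    # Illustrative sketch of the three error types described above.
    def classify_error(product_harmful: bool, measure_taken: bool,
                       measure_creates_new_risk: bool) -> str:
        """Label a regulatory outcome with the error type it illustrates, if any."""
        if measure_taken and not product_harmful:
            return "Type I: unnecessary preventive measure (a safe product kept off the market)"
        if not measure_taken and product_harmful:
            return "Type II: protective measure not taken (a harmful product allowed)"
        if measure_taken and measure_creates_new_risk:
            return "Type III: displaced risk (the protective measure creates a new risk)"
        return "No error of these three types"

    # Example: a hazard is controlled, but the control itself creates a new risk.
    print(classify_error(product_harmful=True, measure_taken=True,
                         measure_creates_new_risk=True))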

David Collingridge describes the essential problem with technology, the dilemma of control: "Attempting to control a technology is difficult, and not rarely impossible, because during its early stages, when it can be controlled, not enough can be known about its harmful social consequences to warrant controlling its development; but by the time those consequences are apparent, control has become costly and slow" (Collingridge 1980, p. 19). He proposes "a theory of decision making under ignorance" to make decisions more "reversible, corrigible, and flexible" (p. 12). He works within the fallibilist tradition, which "denies the possibility of justification, and sees rationality as the search for error and the willingness to respond to its discovery" (p. 29). Collingridge advocates a decision process that allows errors to be identified quickly and managed inexpensively. Options should be kept open so that changes can be made as new information becomes available, but this becomes more difficult the longer a technology is in use.

Others have suggested similar systems. Aaron Wildavsky talks about the resilience of systems and advocates a gradual system of response as new information becomes available. Steve Rayner (2000) also stresses the importance of developing resilience to improve society's ability to deal with surprises. Sheila Jasanoff (1994) advocates planning in both the anticipation of and the response to disasters. Kai Lee (1993) and Berkes and Folke (1998) propose using adaptive management to build resilience into the management of natural resources.

Arguing that science and technology themselves can play multiple roles, not only as a source of risks but also as means to help identify and prevent problems and to develop adaptation measures that ease negative impacts, Posner (2004) recommends the use of cost-benefit analysis to evaluate risks, calling it an essential component of rational decisions. He also recognizes that uncertainties create many ethical, conceptual, and factual problems and suggests several methods for coping. Some application of the precautionary principle, or the better-safe-than-sorry approach to decisions, may be appropriate as a variation of cost-benefit analysis in which people choose to avoid certain risks.
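
In its simplest expected-value form, this kind of reasoning can be sketched as follows; the figures, the risk-aversion weighting, and the variable names are hypothetical illustrations rather than Posner's own calculations, and they deliberately set aside the deeper uncertainties he discusses.

    # Minimal expected-value sketch of cost-benefit reasoning about a
    # low-probability, high-consequence risk (all figures hypothetical).
    probability_of_catastrophe = 1e-4   # assumed annual probability
    harm_if_it_occurs = 5e12            # assumed monetized harm
    annual_mitigation_cost = 2e8        # assumed cost of the preventive measure

    expected_annual_harm = probability_of_catastrophe * harm_if_it_occurs

    # A precautionary variant might weight the worst case more heavily than its
    # bare probability suggests, here with an arbitrary risk-aversion factor.
    risk_aversion_factor = 3.0
    weighted_harm = expected_annual_harm * risk_aversion_factor

    print(f"Expected annual harm:   {expected_annual_harm:.2e}")
    print(f"Annual mitigation cost: {annual_mitigation_cost:.2e}")
    print("Mitigate" if weighted_harm > annual_mitigation_cost else "Accept the risk")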

John D. Graham and Jonathan B. Wiener (1993) describe the risk tradeoffs that are inevitably faced in protecting human health and the environment; minimizing one risk may actually increase other, countervailing risks. In some cases, reducing one risk will cause other coincident risks to decrease as well. The authors propose a risk tradeoff analysis to reveal the tradeoffs likely in any decision and examine ethical as well as scientific issues. Factors to be considered in evaluating risks include "magnitude, degree of population exposure, certainty, type of adverse outcome, distribution, and timing" (Graham and Wiener 1993, p. 30). Consideration of these factors before making a decision may make it possible to reduce, but not eliminate, surprise effects.
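
The listed factors can be thought of as fields in a simple record used to set a target risk beside the countervailing risk an intervention creates. The sketch below is only an illustrative piece of bookkeeping with invented values, not Graham and Wiener's own framework.

    from dataclasses import dataclass

    # Illustrative record of the evaluation factors quoted above (values invented).
    @dataclass
    class RiskProfile:
        name: str
        magnitude: float          # severity of the adverse outcome (0-1 scale here)
        population_exposed: int   # degree of population exposure
        certainty: float          # confidence that the risk is real (0-1)
        outcome_type: str         # type of adverse outcome
        distribution: str         # who bears the risk
        timing: str               # immediate vs. delayed

    target = RiskProfile("target risk being reduced", 0.8, 1_000_000, 0.9,
                         "respiratory illness", "urban residents", "immediate")
    countervailing = RiskProfile("risk created by the intervention", 0.5, 200_000, 0.4,
                                 "groundwater contamination", "rural residents", "delayed")

    # Placing the two profiles side by side is the point of the exercise; no single
    # number settles the ethical questions the distribution and timing fields raise.
    for risk in (target, countervailing):
        print(risk)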

Corporations, think tanks, universities, or other private institutions may not consult the public about their scientific and technological decisions. Even government-sponsored research and regulation typically involve little public participation. Yet the public is usually the intended user of innovations and bears many of the benefits and burdens of both intended and unintended consequences. Questions for a democratic society include whether the public should play a larger role in decisions regarding science and technology, how meaningful public involvement can be achieved, and how public opinions should be balanced with scientific expertise. Greater public involvement would increase the diversity of interests and values brought to an analysis of and debate about the risks and benefits of innovations in science and technology.

Science and technology funding raises questions about the optimal allocation of public and private funds. Funding rarely is devoted to assessing the risks of innovations, and funding to develop solutions to one problem may end up creating other unintended consequences. Should funding agencies require more analysis of the possible consequences of funded projects, and should the agencies be held partially responsible for those consequences?


Conclusion

The unintended consequences of science and technology are ubiquitous and complex in the contemporary world. They raise important questions about the kind of society in which humans choose to live, including issues relating to the allocation of scarce societal resources; the types and levels of risks society is willing to tolerate; the attribution of responsibility and liability; the right to compensation for injury; the equitable distribution of societal costs and benefits; and the role of individuals, corporations, governments, and other public and private institutions in the control of science and technology.


MARILYN AVERILL

SEE ALSO Enlightenment Social Theory; Normal Accidents; Precautionary Principle; Uncertainty.

BIBLIOGRAPHY

Beck, Ulrich. (1986). Risikogesellschaft: Auf dem Weg in eine andere Moderne [Risk society: Towards a new modernity]. Frankfurt am Main: Suhrkamp. English version, translated by Mark Ritter (London: Sage Publications, 1992). Beck maintains that the changing nature of risks is transforming the developed world from an industrial society into a risk society, in which the side effects of industrialization increase in frequency and severity and force society to focus more on risk production than on wealth production.

Berkes, Fikret, and Carl Folke, eds. (1998). Linking Social and Ecological Systems: Management Practices and Social Mechanisms for Building Resilience. Cambridge, UK: Cambridge University Press. This edited volume presents case studies of the interactions of social and ecological systems in response to environmental change. The authors focus on the nature of the problems faced, responses to those problems, and lessons to be learned.

Collingridge, David. (1980). The Social Control of Technology. New York: St. Martin's Press. Focusing on the social effects as well as the social control of technology, Collingridge describes the problem of control when undesirable effects are not known until change has become more difficult. He proposes a theory of decision making under ignorance so that decisions can be modified as new information becomes available.

De Certeau, Michel. (1984). The Practice of Everyday Life, trans. Steven F. Rendall. Berkeley: University of California Press.

Dutton, Diana D.; Thomas A. Preston; and Nancy E. Pfund. (1988). Worse than the Disease: Pitfalls of Medical Progress. Cambridge, UK: Cambridge University Press.

Graham, John D., and Jonathan Baert Wiener. (1993). Risk vs. Risk: Tradeoffs in Protecting Health and the Environment. Cambridge, MA: Harvard University Press. This book describes the many tradeoffs that must be made in attempting to manage risks, including risks relating to unintended consequences. It provides case studies of tradeoffs that have been made, analyzes the problems inherent in such tradeoffs, and suggests ways to manage risks more effectively and democratically.

Illich, Ivan. (1976). Medical Nemesis: The Expropriation of Health. New York: Pantheon Books. In this work, Illich criticized the medical establishment as a major threat to health, claiming that even though more money is being spent on healthcare, fewer benefits are being realized.

Jasanoff, Sheila. (1994). "Introduction: Learning from Disaster." In Learning from Disaster: Risk Management after Bhopal, ed. Sheila Jasanoff. Philadelphia: University of Pennsylvania Press. Authors from a variety of disciplines seek to identify lessons to be learned from the 1984 lethal gas leak from a plant in Bhopal, India. The book discusses the problems with the transplantation of technologies across cultures and the way culture, politics, and other human factors shape approaches to risk management.

Jonas, Hans. (1984). The Imperative of Responsibility: In Search of an Ethics for the Technological Age. Chicago: University of Chicago Press. Viewed as Jonas's most important work, it reflects on the challenges posed to society by nuclear weapons, chemical pollution, and biomedical technologies.

Lee, Kai. (1993). Compass and Gyroscope: Integrating Science and Politics for the Environment. Washington, DC: Island Press. Sustainable development, civic science, and adaptive management are discussed within the context of the controversies over salmon populations in the Columbia River Basin. Lee recommends that science be coupled with public debate in addressing environmental problems.

Merton, Robert K. (1936). "The Unanticipated Consequences of Purposive Social Action." American Sociological Review 1(6): 894–904.

Perrow, Charles. (1999). Normal Accidents: Living with High-Risk Technologies. Princeton, NJ: Princeton University Press. Originally published in 1984 (New York: Basic Books).

Pindyck, Robert S., and Daniel L. Rubinfeld. (1998). Microeconomics, 4th edition. Upper Saddle River, NJ: Prentice Hall.

Pollack, Henry N. (2003). Uncertain Science ... Uncertain World. Cambridge, UK: Cambridge University Press.

Posner, Richard A. (2004). Catastrophe: Risk and Response. Oxford: Oxford University Press. Posner addresses risks that could threaten the future of human life on earth; several of these potential disasters are the unintended outcomes of innovations in science and technology. He calls for interdisciplinary policy responses to address these potential catastrophes.

Rayner, Steve. (2000). "Prediction and Other Approaches to Climate Change Policy." In Prediction: Science, Decision Making, and the Future of Nature, ed. Daniel Sarewitz, Roger A. Pielke Jr., and Radford Byerly Jr. Washington, DC: Island Press.

Spicer, Edward H., ed. (1952). Human Problems in Technological Change: A Casebook. New York: John Wiley. Includes Lauriston Sharp's widely referenced case study "Steel Axes for Stone Age Australians."

Stewart, Thomas R. (2000). "Uncertainty, Judgment, and Error in Prediction." In Prediction: Science, Decision Making, and the Future of Nature, ed. Daniel Sarewitz, Roger A. Pielke Jr., and Radford Byerly Jr. Washington, DC: Island Press.

Tenner, Edward. (1997). Why Things Bite Back: Technology and the Revenge of Unintended Consequences. New York: Vintage Books. Unintended consequences are at the center of this book, which describes how and why technology, when combined with human behavior and institutions, sometimes seems to turn on the society that produced it. Extensive examples are provided relating to medicine, the environment, computers, and the spread of pests.