Double Effect and Dual Use
In moral philosophy the principle of double effect traditionally refers to conflict situations in which an action or series of actions will result in multiple effects, some good and some bad. It parallels the contemporary policy concept of dual use: the idea that scientific knowledge or technological artifacts can serve multiple purposes, depending on the context. Dual use targeting and dual use research sometimes raise ethical dilemmas about the production and use of scientific knowledge and technologies but on other occasions provide multiple justifications for a single policy. The principle of double effect is seldom referred to explicitly in those situations, but its general conditions may provide conceptual clarity with regard to moral permissibility. At the level of practical political decision making, however, activities such as risk assessment, technology assessment, and scenario building provide better guidance for handling the ethical problems posed by dual use situations than does double effect reasoning.
Still widely discussed in the bioethics literature, the principle of double effect originated in Catholic scholastic moral philosophy, specifically in the discussion by the theologian Thomas Aquinas (1224–1274) of killing in self-defense:
A single act may have two effects, of which only one is intended, while the other is incidental to that intention. But the way in which a moral act is to be classified depends on what is intended, not what goes beyond such an intention. Therefore, from the act of a person defending himself a twofold effect can follow: one, the saving of one's own life; the other the killing of the aggressor. (Summa theologiae, IIaIIae, q.64, a.7)
This raises the central distinction in double effect reasoning between intention and foresight (Aulisio 1995). In a morally acceptable case of killing in self-defense, the death of the aggressor is a foreseeable effect but the intention is to preserve one's own life. If, however, the killing was intended and not merely foreseen, it is considered homicide.
Originally formulated in slightly more complex terms, the principle of double effect is commonly stated as follows: An action with multiple effects, good and bad, is permissible if and only if (1) one is not committed to intending the bad effects either as the means or the end and (2) there is a proportionate reason for bringing about the bad effects (Bole 1991). The proportionality clause arises from Thomas's insistence that one should not use more violence than necessary in defending oneself: "An act that is properly motivated may, nevertheless, become vitiated, if it is not proportionate to the end intended" (Summa theologiae, IIaIIae, q. 64, a. 7). Subsequent interpreters saw this condition as referring more broadly to the overall balance of good and bad effects.
Paradigm applications of double effect in Catholic bioethics pertain to cases of maternal-fetal conflict and distinctions between palliative care and euthanasia. Double effect has also been used in debates about the use of embryos in medical research. Many theorists question the relevance of double effect reasoning outside the Catholic moral framework (Boyle 1991). Some have argued that although the distinction between intention and foresight is difficult to apply practically, double effect nonetheless applies in any of the multiple moral frameworks that incorporate deontological constraints (in the form of intention) on consequentialist considerations (Kagan 1989). (Deontology asserts that certain acts are intrinsically right or wrong, whereas consequentialism asserts that the rightness or wrongness of an act depends on its consequences.) Traces of double effect reasoning can be seen even in Anglo-American law, for example, in the distinction between first-degree murder and manslaughter.
Double Effect and Dual Use
The concept of dual use is not well formulated for general use but can be understood in light of the principle of double effect as referring to any activity, artifact, or body of knowledge that is intended to bring about good effects but also has foreseeable negative consequences. This definition, however, excludes one of its most common applications: cost-sharing research programs involving industry and the military. For example, the U.S. Department of Defense operates a Dual Use Science and Technology Program to fund, jointly with industry partners, technologies that can be of use both on the battlefield and in the market. Dual use in this sense is difficult to consider under the principle of double effect because there is no admitted or foreseen bad result, only multiple good ones. It merely refers to basic research with the potential for positive benefits in more than one sector of the economy and thus offers multiple justifications for governmental support. If political support for a research program cannot be marshaled with one argument (knowledge production alone), scientists often have few qualms about appealing to others, such as military or health benefits and economic competitiveness. In this case, however, ethical questions arise about whether both uses are equally sound or valid and whether rhetorical appeals to one may contaminate the other.
Insofar as dual use implies both good and bad outcomes, the concept presents even more fundamental challenges for social policies in regard to public support of science and technology. Stanley Rosen introduces the problem by noting that "all fundamental aspects of the natural sciences … may lead to the destruction of the human race. … Whereas no one would argue the wisdom of attempting to prevent a nuclear holocaust or the biochemical pollution of the environment, not many are prepared to admit that the only secure way in which to protect ourselves against science is to abolish it entirely" (Rosen 1989, p. 8). Taken to its logical conclusion, security would require not only the abolition of science but also the destruction of all children, because it is impossible to be certain who eventually may produce knowledge that threatens human existence. Rosen calls this the "ultimate absurdity of the attack against the enlightenment" (Rosen 1989, p. 9).
This absurdity follows from the notion of dual use because nearly all knowledge and artifacts, despite good intentions, could produce foreseeable bad effects. Examples can be as exotic as the "grey goo" (uncontrolled replication of nanotechnology) envisioned by Bill Joy (2000), as mundane as using a pen as a stabbing instrument, or as horrifying as the deadly use of commercial airplanes by terrorists on September 11, 2001. Rosen's point is that the only way to guarantee safety is to ban science and its technological products entirely.
Of course, society does not follow this absurd logic because most people feel that the benefits provided by science and technology (the intended good effects) make it worthwhile to risk some of the foreseeable bad effects. People seek a judicious regulation of scientific inquiry and technological progress, and it is in this middle ground that the major ethical questions are raised by dual use phenomena: Do the foreseeable bad effects outweigh the intended positive ones? Are there ways to minimize the negative effects without compromising the positive ones? Are there some foreseeable consequences that are so appalling that people should ban the production or dissemination of knowledge in a certain area altogether?
These questions show the importance of the proportionality condition of the principle of double effect. In practice, proportionality is assessed through activities such as risk assessment, technology assessment, and scenario building. Those activities involve processes of weighing the good and bad effects of research and technology in light of uncertainty about their relative probabilities. The distinction between intention and foresight is less difficult to apply, at least in theory, because if someone is attempting intentionally to bring about bad effects, say, by engineering a supervirulent pathogen, it seems obvious that there should be intervention to end that work. Indeed, in the realm of biotechnology dual use situations are difficult to deal with precisely because bad effects are not intended (cures, vaccines, and other good effects are intended) but nonetheless are foreseeable. Dual use situations present practical challenges for regulating research and ensuring the proper use of technology, and in such cases double effect analysis provides some insight and conceptual clarity. Dual use can be conceived more broadly than the conditions of double effect allow, however, because some bad effects of science and technology may be unforeseeable, let alone unintended.
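The weighing of effects described above is never this mechanical in real risk or technology assessment, but a crude caricature can make the structure of the proportionality condition concrete. The following sketch treats proportionality as an expected-value comparison; all function names, probabilities, and magnitudes are hypothetical illustrations, not anything drawn from this article or from actual assessment practice.

```python
# Illustrative sketch only: the proportionality condition caricatured as
# an expected-value comparison. All names and numbers are hypothetical.

def proportionate(good_effects, bad_effects):
    """Weigh foreseeable effects, each given as (probability, magnitude).

    Returns True when the expected good outweighs the expected bad --
    a crude stand-in for 'a proportionate reason for bringing about
    the bad effects'.
    """
    expected_good = sum(p * m for p, m in good_effects)
    expected_bad = sum(p * m for p, m in bad_effects)
    return expected_good > expected_bad

# A hypothetical research program: likely benefits, unlikely but grave harms.
benefits = [(0.9, 10.0)]   # e.g., probable new vaccines
harms = [(0.01, 500.0)]    # e.g., improbable misuse as a weapon
print(proportionate(benefits, harms))  # 9.0 vs. 5.0 -> True
```

The sketch also shows why such calculations cannot settle the hard cases: the probabilities and magnitudes are precisely what is uncertain and contested, which is why the article points to risk assessment, technology assessment, and scenario building rather than to any formula.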
Conduct of War and Biological Research
Precision-guided munitions and satellite-aided navigation have enhanced the accuracy of aerial bombardment. Although this has improved the ability of military planners to minimize collateral damage, it has raised an ethical dilemma: Military leaders are faced with questions of the legitimacy of dual use targeting, or the destruction of targets that affect both military operations and civilian lives. An example of such dual use targeting was the destruction of Iraqi electrical power facilities by the U.S. military in Operation Desert Storm in 1991.
Under the principle of double effect such activity would be deemed morally acceptable if the intention was not to harm or kill civilians (a bad effect that is foreseen but unintended) and the good effects outweighed the bad. This application of the principle of double effect relates to the idea of the just war, which can be traced back to the theologian Augustine (354–430). Thomas expanded Augustine's idea that one cannot be held accountable for unintended effects caused by chance by applying that principle even to foreseeable unintended effects that are not due entirely to chance. Like all formulations of morality in terms of principles or formulas, however, the principle of double effect only establishes basic guidelines, and the majority of the work lies in deciding how and by whom such judgments about good and bad effects should be made.
Nuclear science provides the paradigmatic case of dual use summarized in the tension between physicists' initial hopes of "atoms for peace" and the grim reality of international proliferation of nuclear weapons. The dual nature of civilian and military use of nuclear science and technology poses grave problems in international relations, as witnessed by suspicions that Iran and other nations were developing nuclear weapons while claiming that such research was intended for civilian use only. The added possibility that terrorists could acquire weapons-grade nuclear material raises the stakes even higher.
The same concerns have surfaced around nanotechnology but have taken on a more mature form in regard to biological research. In 2004 the U.S. National Research Council (NRC) issued a report titled Biotechnology Research in an Age of Terrorism. Presenting recommendations to minimize the misuse of biotechnology, the authors warned: "In the life sciences … the same techniques used to gain insight and understanding regarding the fundamental life processes for the benefit of human health and welfare may also be used to create a new generation of [biological warfare] agents by hostile governments and individuals" (U.S. National Research Council 2004, p. 19). Attention was paid to the risk that dangerous research agents could be stolen or diverted for malevolent purposes and the risk that research may result in knowledge or techniques that could facilitate the creation of novel pathogens. The report characterizes the central tension as one of reducing the risks of the foreseeable unintended bad effects while allowing for the continuation of the good effects yielded by biomedical research. One major dilemma is the trade-off between national security and scientific freedom of inquiry.
The distinction between intention and foresight and the proportionality condition are reasonable concepts for understanding the nature of this dual use situation. Clearly, mechanisms must be in place to ensure that researchers are not working intentionally toward bad effects, either directly in the laboratory or covertly by sharing information with terrorists or other enemies. The more difficult questions, however, remain even when it is assumed that no malevolent intentions exist.
The U.S. government established the National Science Advisory Board for Biosecurity (NSABB) to provide advice to federal departments and agencies on ways to improve biosecurity, which refers to practices and procedures designed to minimize the bad effects of biological research while maximizing the good effects. The USA PATRIOT Act of 2001 and the Public Health Security and Bioterrorism Preparedness and Response Act of 2002 established the statutory and regulatory basis for protecting biological materials from misuse. The NSABB develops criteria for identifying dual use research and formulates guidelines for its oversight and for the public presentation, communication, and publication of potentially sensitive research. It works with scientists to develop a code of conduct and training programs in biosecurity issues. NSABB rules apply only to federally funded research. A possible avenue for the oversight of dual use research is Institutional Biosafety Committees (IBCs) for case-by-case review and approval.
The mechanisms fashioned by the NSABB for the regulation of dual use research are a good example of how the general spirit of double effect analysis is manifested in specific actions, raising political issues such as the proper balance of self-regulation by the scientific community and outside intervention. Members of IBCs and those involved in implementing other NSABB rules face the challenge of interpreting and applying the general guidelines provided by the principle of double effect in the sense that they must wrestle with difficult ethical dilemmas posed by good intentions and their foreseeable bad effects.
Aulisio, Mark P. (1995). "In Defense of the Intention/Foresight Distinction." American Philosophical Quarterly 32(4): 341–354. Defends the distinction against challengers who claim that foresight of a probable consequence of one's actions is sufficient to consider that consequence part of one's intentions.
Bole, Thomas J. (1991). "The Theoretical Tenability of the Doctrine of Double Effect." Journal of Medicine and Philosophy 16: 467–473. Contends that the principle of double effect is relevant to different moral frameworks.
Boyle, James. (1991). "Further Thoughts on Double Effect: Some Preliminary Responses." Journal of Medicine and Philosophy 16: 467–473.
Joy, Bill. (2000). "Why the Future Doesn't Need Us." Wired 8(4): 238–262. Also available from http://www.wired.com/wired/archive/8.04/joy.html. A pessimistic outlook on the impending loss of human control as genetics, nanotechnology, and robotics become integrated research programs.
Kagan, Shelly. (1989). The Limits of Morality. New York: Oxford University Press. Argues that the ordinary understanding of limits imposed by morality and limits on what morality can demand of people cannot be defended adequately. Contains a section on intending harm.
Rosen, Stanley. (1989). The Ancients and the Moderns: Rethinking Modernity. New Haven, CT: Yale University Press.
U.S. National Research Council: Committee on Research Standards and Practices to Prevent the Destructive Application of Biotechnology. (2004). Biotechnology Research in an Age of Terrorism. Washington, DC: National Academies Press.