Terrorism and Science

When the U.S. Department of Homeland Security (DHS) was proposed in 2002, President George W. Bush (b. 1946) noted that "in the war against terrorism, America's vast science and technology base provides us with a key advantage." What he failed to mention is that science and technology are also major sources of vulnerability to terrorist attacks, requiring decisions about censorship of publication and restriction of access to sensitive areas and materials. Thus terrorism poses special problems for the scientific and technical community in two respects: how to limit terrorist access to sensitive knowledge and technology, and what scientific research and technological developments to pursue in the interests of countering terrorist threats. Although scientists and engineers must bring their professional ethical responsibilities to bear on both tasks, it is equally important that decision makers understand the related limitations of science and technology.

Limiting Terrorist Access

Because scientific knowledge and technological devices can serve multiple uses, terrorists can turn them to purposes other than those originally intended. Preventing such misuse presents policy makers and the scientific and engineering communities with two challenges. First, they must ensure that knowledge and information are not inappropriately disclosed. Second, they must secure existing and proposed technologies (e.g., nuclear power plants) and research materials (e.g., pathogens). In general, policies in the first case involve restricting the availability of sensitive information by the government, scientists, or both. Actions in the second case generally involve containment, monitoring, and restriction of access. Both actions raise tensions between the goals of security and scientific freedom and openness in the creation and exchange of knowledge and products. Striking the proper balance between these competing goods has taken on heightened importance since the terrorist attacks of September 11, 2001, and the responses by the governments of the United States and other nations.

The situation is made more complex by the argument that some degree of scientific freedom is itself necessary for national security, because it facilitates the creation of new knowledge and artifacts that may be useful in preventing or responding to terrorist attacks. Especially in the biomedical field, circumstances are further complicated by the twin potential effects of secrecy and restricted access. In some cases, these measures may protect public health by preventing terrorists from acquiring sensitive information or dangerous pathogens. In others, they may harm public health by preventing the development of cures and vaccines or inhibiting the coordination of responses to disease outbreaks. In some cases, the potential benefits of researching pathogens to mitigate the effects of terrorist attacks may not be worth the risks, which has sparked controversies about the creation and siting of biosafety laboratories that handle dangerous pathogens.

The free creation and exchange of knowledge by scientists can have dangerous, unintended consequences for society. A paper by Ronald Jackson and other researchers found that inserting the interleukin-4 (IL-4) gene into the mousepox virus resulted in near total immunosuppression (Jackson et al. 2001). This advanced valuable knowledge about immune system functioning, but it also evoked fears that terrorists could use such knowledge to engineer hypervirulent viruses. Similarly, the journal Science published a paper in 2002 showing how to assemble a poliovirus from readily available chemicals (Cello, Paul, and Wimmer 2002). The threat of terrorist acts has caused political leaders and members of the scientific community to question whether such knowledge should be created at all, and if so, how its publication and exchange should be regulated.

In New Atlantis (1627), Francis Bacon (1561–1626) imagined scientists censoring their own work, recognizing that politically authorized experimental science entails societal risks. The twentieth century provided several examples of tradeoffs between security and openness in the pursuit of knowledge. The Manhattan Project, which produced the first atomic bomb, cultivated a culture of secrecy, as did microwave radar research during World War II. During the Cold War, the U.S. government attempted to constrain information exchange in areas of mathematics and the physical sciences that might have aided Soviet nuclear weapons development (Monastersky 2002). Physicist Edward Teller (1908–2003) and others eventually persuaded policy makers that openness, rather than secrecy, was the better security policy for the Cold War.

In 1975 an international group of scientists held the Asilomar conference to debate the proper use and regulatory oversight of recombinant DNA research. During the late 1970s, the National Security Agency (NSA) sought to regulate cryptographers developing new algorithms; the two groups eventually agreed to a system of voluntary submission of papers for review. In 2002, the U.S. government began to withdraw from public release more than 6,600 technical documents dealing mainly with the production of germ and chemical weapons. In a controversial move after the September 11 attacks, U.S. national policy on the restriction of information that may threaten national security was altered to cover publication of federally financed research deemed "sensitive but not classified" (Greenberg 2002).

As these examples illustrate, limitations on research and the availability of technical knowledge can come in the form of self-imposed screening mechanisms by the scientific community or government regulation. The Asilomar conference, for example, led to a suite of self-policing mechanisms within the scientific community, including the decentralized system of Institutional Biosafety Committees (IBCs). This same mechanism has been proposed by the National Science Advisory Board for Biosecurity (NSABB) as a way to prevent the misuse of biological research by terrorists. The NSABB also works to develop codes of conduct for researchers and laboratory workers, which underscores the importance of ethical conduct by individuals, especially where no rules exist or where the precise meaning of rules is unclear. Some professional associations and journals, including Science and Nature, have instituted procedures to give special scrutiny to papers that raise security concerns (Malakoff 2003). Putting such control in the hands of journal editors has caused some to argue that an advisory group like the Recombinant DNA Advisory Committee (RAC) would be a better mechanism.

Mitchel Wallerstein (2002) points out that the dangers posed by terrorists acquiring sensitive science and technology information differ from the state-related threats that were of primary concern during World War II and the Cold War. Terrorists generally do not seek out and would not be able to use the results of most basic research, but states may possess the intellectual and financial capital necessary to turn basic research into weapons. Daniel Greenberg (2002) contends that terrorists do not rely on new science. Rather, readily accessible information that has long been available suffices to fulfill most of the goals of terrorist organizations.

Biological weaponry is the area of science that could most directly benefit terrorist organizations. Wallerstein writes, "Information that improves knowledge of dangerous pathogens, their safe handling, and their weaponization increases the likelihood that such weapons could be produced covertly on a small scale" (p. 2169). His general conclusion is that restrictions on scientific and technical communications need occur only on a much smaller scale than during the Cold War. In fact, many echo his conclusion that sensitive research is a very narrow slice of the scientific world, which allows for severe but highly targeted restrictions.

Restricting the publication of information deemed sensitive and controlling access to technologies and research materials can help achieve security goals, but not without costs (Knezo 2002a). Some impacts are relatively minor, such as new standards for the construction and management of laboratories. Others are more severe, including the effects of national security policy on the research process itself. Tightened laboratory access policies, publication rules, and visa restrictions may reduce the number of applications by foreign students to U.S. universities and colleges, which could hamper cross-cultural understanding. Under State Department rules, consular officials may deny students from countries listed as "state sponsors of terrorism" visas for study in the United States in the sixteen categories specified on the Technology Alert List. Additional exemptions to the Freedom of Information Act (FOIA) and the withdrawal of information from federal agency websites have also sparked concerns about constraints on legitimate scientific work and academic freedom.

Economic losses are a further concern raised by some legislative responses to the security risks posed by science and technology. Instituting security and tracking measures in academic laboratories entails additional costs for researchers, and restrictions on foreign researchers can hamper technological development and economic productivity. The U.S. Immigration and Customs Enforcement (ICE) agency operates "Project Shield America" to prevent the illegal export of sensitive munitions and strategic technology to terrorists; it is intended to prevent terrorism, but it may also entail losses to economic competitiveness.

Science and Technology to Counter Terrorism

Since the September 11 attacks, science and technology have increasingly been advertised as ways to prevent terrorist attacks as well as reduce vulnerabilities and minimize impacts of such attacks (e.g., Colwell 2002). This is in part a response by scientists and engineers to the sizeable increases in homeland security and counterterrorism research and development (R&D).

The National Research Council's Committee on Science and Technology for Countering Terrorism issued a report in 2002 that described the ways in which science and engineering can contribute to making the nation safer against the threat of catastrophic terrorism. It outlined both short-term applications of existing technologies and long-term research needs. The report recommended actions for all phases in countering terrorist threats, which can be roughly ordered as awareness, prevention, protection, response, recovery, and attribution. Different threats pose different challenges and opportunities across these phases. For example, nuclear threats must be addressed at the earliest stages, whereas biological attacks are more difficult to preempt, but more opportunities exist for technological intervention to mitigate their effects.

Scientific research and technological innovation can improve performance in all phases, from threat analyses and vulnerability assessments to post-attack investigations and restoration of services. For example, the Bush administration established BioWatch, a nationwide network of sensors to detect the presence of certain pathogens, along with a public-health surveillance system that monitors the databases of eight major cities for signs of disease outbreaks. Early warning systems can detect the presence of certain pathogens by using computer chips and antibodies or pieces of DNA (Casagrande 2002). Development of explosives-detection technologies has also accelerated since September 11, 2001, in order to bolster airline security.
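The logic of such disease surveillance can be made concrete with a small sketch. The following illustrative Python example is not the actual BioWatch or city-surveillance algorithm, and its data are invented; it shows one common form of aberration detection, flagging any day whose syndrome counts rise well above a recent baseline:

```python
# Illustrative syndromic-surveillance sketch (hypothetical data): flag a day
# whose count exceeds the recent mean by three standard deviations.
from statistics import mean, stdev

def flag_aberrations(daily_counts: list[int], baseline_days: int = 7) -> list[int]:
    """Return indices of days whose counts are anomalously high
    relative to the preceding baseline window."""
    flagged = []
    for i in range(baseline_days, len(daily_counts)):
        window = daily_counts[i - baseline_days:i]
        threshold = mean(window) + 3 * (stdev(window) or 1.0)
        if daily_counts[i] > threshold:
            flagged.append(i)
    return flagged

# Hypothetical emergency-room visit counts for one syndrome in one city.
counts = [12, 15, 11, 14, 13, 12, 16, 14, 13, 41]  # day 9 spikes
print(flag_aberrations(counts))  # [9]
```

A real system layers on many refinements (seasonal baselines, multiple data streams, spatial clustering), but the core idea is the same: automated screening of routine health data for statistically unusual spikes.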

Other examples include the use of biometrics (e.g., fingerprints and retinal signatures) to develop national security identity cards. The shipping industry is slowly adopting new security measures such as sophisticated seals and chemical sensors, and other researchers are developing strategies for securing information systems. Military infrared countermeasures for surface-to-air missiles may be adapted for civilian aircraft. Technologies for decontamination, blast-resistant walls, and protective gear for first responders are other components of research programs. Increasing flexibility and devising measures to isolate failing elements could improve the security of complex technical systems such as transportation and communication infrastructures, and broader applications of renewable energy could harden the energy infrastructure. Social scientists and psychologists contribute research on the causes and motivations of terrorists and on the dynamics of terrorist group formation. Some (e.g., Susser, Herman, and Aaron 2002) have argued that, because terrorists choose targets to maximize psychological impact, mental health must be considered a top response priority.

With all of these potential applications of science and technology, decision makers must address questions about how to coordinate, organize, prioritize, and evaluate investments to serve the goals of security and public health. Genevieve Knezo (2002b) reported that prior to September 11, 2001, the General Accounting Office (GAO) and other authorities had questioned whether the U.S. government was adequately prepared to conduct and use R&D to prevent and combat terrorism. Partially in response to the need to better coordinate counterterrorism efforts (including R&D), the cabinet-level Department of Homeland Security (DHS) was created by legislative act in 2002, incorporating half of all homeland security funding within a single agency. In addition to legislative activity, new advisory bodies such as the NSABB have been formed to guide the creation of new rules and institutions that maximize the benefits of science and technology while minimizing unintended negative impacts.

Since September 11, 2001, established institutions have benefited from significantly increased funding for homeland security and public health research. For example, in 2002 President Bush proposed a 2,000 percent increase over pre-September 11 levels in the budget of the National Institute of Allergy and Infectious Diseases (NIAID). Other institutions and agencies have either received additional funding (especially the National Institutes of Health) or sought to restructure their programs to take advantage of shifts in R&D funding priorities (Congressional Research Service 2002; American Association for the Advancement of Science 2004).

Investments in science to reduce terrorist threats raise several ethical issues. First, the scale of vulnerabilities outstrips resources to reduce them, which raises equity issues in the process of prioritizing investments. For example, bioweapons detectors are too expensive to deploy on every street corner, so locations must be prioritized. Likewise, not all areas pose equal risks from terrorist attacks, so efforts need to be targeted to match threats.
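To make the prioritization problem concrete, the sketch below ranks candidate detector sites by estimated risk averted per dollar and funds them greedily under a fixed budget. All site names, costs, and risk figures here are invented for illustration; no real deployment data are used.

```python
# Illustrative greedy allocation of a fixed security budget across sites.
from dataclasses import dataclass

@dataclass
class Site:
    name: str
    cost: float          # installation and operating cost, in millions of dollars
    risk_reduced: float  # estimated expected harm averted (arbitrary units)

def prioritize(sites: list[Site], budget: float) -> list[Site]:
    """Greedily fund the sites offering the most risk reduced per dollar."""
    funded = []
    for site in sorted(sites, key=lambda s: s.risk_reduced / s.cost, reverse=True):
        if site.cost <= budget:
            funded.append(site)
            budget -= site.cost
    return funded

# Hypothetical candidates; every name and number is invented.
candidates = [
    Site("subway hub", cost=4.0, risk_reduced=10.0),
    Site("stadium", cost=2.0, risk_reduced=4.0),
    Site("suburban mall", cost=1.0, risk_reduced=1.0),
]
for site in prioritize(candidates, budget=5.0):
    print(site.name)  # subway hub, suburban mall -- the stadium goes unprotected
```

Greedy ranking is only a heuristic (the exact version is a knapsack problem), and any real allocation must also confront the equity concerns noted above, since pure cost-effectiveness ranking systematically leaves some communities unprotected.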

Second, Arthur Caplan and Pamela Sankar (2002) note the increase in "research protocols that call for the deliberate exposure of human subjects to toxic and noxious agents" (p. 923). Such dilemmas are not new: many trials were conducted on U.S. Navy and Army personnel in the 1960s to document the effects of biological and chemical weapons, and many subjects were neither informed of the details of the studies nor issued protective gear (Enserink 2002). Such research needs clear guidelines and unequivocal justification of its relevance to national security. Professional ethical issues also arise when unemployed scientists and engineers face financial incentives to aid terrorist organizations (Richardson 2002).

Third, the integrated nature of sociotechnical systems raises considerations of equity and civil liberties. For example, forty percent of the containerized cargo arriving at the harbor at Long Beach in the Los Angeles area is destined for the U.S. interior. How should the burden of increased security costs be distributed? Furthermore, the process of hardening these systems can reduce access and curtail certain civil liberties (Clarke 2005).

Finally, several analysts have criticized dominant U.S. counterterrorism science policies as ineffective. Bruce Schneier, a security technologist and cryptographer, argues that managers too often seek technological cure-alls and rarely consider the consequences of system failures (Mann 2002). All security systems require secrets, for example, but the secrets should be the components most easily changed if the system's integrity is breached. Biometric identity devices that use fingerprints can centralize so many functions that they create "brittle" systems that fail badly when compromised: new bank account numbers can be issued in case of fraud, but not new fingerprints. Schneier contends that in airline security the only effective measures have been the low-tech solution of reinforcing cockpit doors and the nontechnical fact that passengers now know to fight back against hijackers. Both measures satisfy Kerckhoffs' principle, which holds that a system should remain secure even when everything about it except the key is public knowledge. Schneier also holds that security systems work best when final decision-making responsibility is given to humans close to the situation, not to computers. Security systems should be ductile, small-scale, and compartmentalized to mitigate the effects of inevitable failures.
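Kerckhoffs' principle can be illustrated with a brief sketch. In the hypothetical Python example below, which uses standard-library cryptographic primitives and an invented scenario, every part of the design is assumed public; security rests entirely on a random key, and the key, unlike a fingerprint, can be revoked and reissued the moment it is compromised:

```python
# A sketch of a design satisfying Kerckhoffs' principle: everything below is
# assumed public; security rests solely on the key, which is also the one
# component that is cheap to replace after a breach.
import hashlib
import hmac
import secrets

def issue_key() -> bytes:
    """Generate a fresh random key -- the only secret in the system."""
    return secrets.token_bytes(32)

def tag(message: bytes, key: bytes) -> bytes:
    """Authenticate a message with a public, well-studied algorithm (HMAC-SHA256)."""
    return hmac.new(key, message, hashlib.sha256).digest()

def verify(message: bytes, mac: bytes, key: bytes) -> bool:
    """The verifier's logic is public too; only the key matters."""
    return hmac.compare_digest(tag(message, key), mac)

key = issue_key()
mac = tag(b"cargo container 4417 cleared", key)
assert verify(b"cargo container 4417 cleared", mac, key)

# If the key leaks, recovery is trivial: revoke it and issue a new one, just as
# a bank reissues a compromised account number. Old credentials stop working.
key = issue_key()
assert not verify(b"cargo container 4417 cleared", mac, key)

# A fingerprint template used as the credential would break this design rule:
# it can be stolen like any other secret, but it can never be rotated.
```

The design choice mirrors Schneier's point: the secret is deliberately confined to the smallest, most easily replaced component, so a breach degrades the system gracefully rather than catastrophically.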

Stephen Flynn (2004) focused less on the inherent limitations of technology as a counterterrorism tool than on government R&D priorities. He argued that some high-tech solutions, such as digital photographs of container-loading processes, internal emissions sensors in cargo containers, and GPS tracking devices, can improve security but have not been given adequate funding.

The 2002 report by the Committee on Science and Technology for Countering Terrorism openly recognizes that science and technology are only one part of a broad array of strategies for reducing the threat of terrorism, alongside diplomacy, cross-cultural learning, and economic, social, and military policies. Furthermore, as the U.S. experience in the Vietnam War and the Soviet experience in the 1980s invasion of Afghanistan demonstrate, technological superiority does not guarantee victory. Success in the war on terror is measured by accomplishments, not R&D budget figures.

From communism to environmental problems to the challenges of a globalizing economy, science and technology have often been put forward as ways to protect national interests and secure prosperity (Jenkins 2002). Scientists, engineers, and politicians often define problems in ways that call for technical solutions, but they must be held accountable for those problem definitions. Scientists and engineers in particular must exercise ethical responsibility by not exaggerating claims that their research will serve societal goals.


Assessment

The two sections of this entry are interrelated in that increased scientific research on counterterror measures will create new knowledge and opportunities for terrorist exploitation, which will create new challenges for securing that knowledge. Given that security, health, and civil liberties are at stake in decisions about science and terrorism, it is important that measures be taken to involve and inform citizens. This entry has focused on actions by the U.S. government because it plays a leading role in matters of science and terrorism. But other countries and international coalitions face similar ethical dilemmas and policy choices. Private companies own many of the infrastructures that are targets for terrorist attacks, so regulations may be required to induce the private sector to invest in counterterrorism technologies that may not have commercial markets. Some scientific research, however, may have viable market applications, meaning that some of the R&D burden can be privatized, which raises other ethical issues that partially mirror those involved in the privatization of war.


ADAM BRIGGLE

SEE ALSO Biological Weapons; Building Destructions and Collapses; Chemical Weapons; Fire; Information Ethics; International Relations; Security; Terrorism.

BIBLIOGRAPHY

Caplan, Arthur L., and Pamela Sankar. (2002). "Human Subjects in Weapons Research." Science 298(5595): 923.

Casagrande, Rocco. (2002). "Technology against Terror." Scientific American 287(3): 83–87.

Cello, Jeronimo, Aniko V. Paul, and Eckard Wimmer. (2002). "Chemical Synthesis of Poliovirus cDNA: Generation of Infectious Virus in the Absence of Natural Template." Science 297(5583): 1016–1018.

Clarke, Richard A. (2005). "Ten Years Later." Atlantic Monthly 295(1): 61–77. Imagines future scenarios of terrorist attacks and government reactions.

Committee on Science and Technology for Countering Terrorism, National Research Council. (2002). Making the Nation Safer: The Role of Science and Technology in Countering Terrorism. Washington, DC: National Academy Press.

Enserink, Martin. (2002). "Secret Weapons Tests' Details Revealed." Science 298(5593): 513–514.

Flynn, Stephen. (2004). America the Vulnerable: How Our Government Is Failing to Protect Us from Terrorism. New York: HarperCollins.

Greenberg, Daniel S. (2002). "Self-Restraint by Scientists Can Avert Federal Intrusion." Chronicle of Higher Education October 11: B20.

Jackson, Ronald J., Alistair J. Ramsay, Carina D. Christensen, et al. (2001). "Expression of Mouse Interleukin-4 by a Recombinant Ectromelia Virus Suppresses Cytolytic Lymphocyte Responses and Overcomes Genetic Resistance to Mousepox." Journal of Virology 75(3): 1205–1210.

Jenkins, Dominick. (2002). The Final Frontier: America, Science, and Terror. London: Verso. Presents the argument that changes in technology have created a situation where all citizens are vulnerable to catastrophic terrorist attacks.

Malakoff, David. (2003). "Researchers Urged to Self-Censor Sensitive Data." Science 299(5605): 321.

Mann, Charles C. (2002). "Homeland Insecurity." Atlantic Monthly 290(2): 81–102.

Monastersky, Richard. (2002). "Publish and Perish?" Chronicle of Higher Education October 11: A16–A19. Focuses on the dilemma of scientific openness and national security.

Richardson, Jacques G. (2002). War, Science and Terrorism: From Laboratory to Open Conflict. London: Frank Cass. Sees the connection between science and terrorism largely as an outgrowth from the partnership between science and the military and asks to what extent science is promoted by actual or threatened armed conflict and whether war is an extension of science by other means.

Susser, Ezra S., Daniel B. Herman, and Barbara Aaron. (2002). "Combating the Terror of Terrorism." Scientific American 287(2): 70–77.

Wallerstein, Mitchel B. (2002). "Science in an Age of Terrorism." Science 297(5590): 2169.

INTERNET RESOURCES

American Association for the Advancement of Science. (2004). "Defense and Homeland Security R&D Hit New Highs in 2005; Growth Slows for Other Agencies." Available from http://www.aaas.org. A summary of AAAS estimates and analyses of appropriations in the FY 2005 omnibus bill and final FY 2005 appropriations bills for federal R&D.

Colwell, Rita R. (2002). "Science as Patriotism." Available from http://www.sciencecoalition.org/presskit/articles/puboped/colwell_jan302002.pdf. Paper presented at the Annual Meeting of the Universities Research Association, Washington DC.

Congressional Research Service. (2002). "Science and Technology Policy: Issues for the 107th Congress, 2nd Session." Available from http://www.ncseonline.org.

Knezo, Genevieve. (2002a). "Possible Impacts of Major Counter Terrorism Security Actions on Research, Development, and Higher Education." CRS Report for Congress. Available from http://www.fas.org.

Knezo, Genevieve. (2002b). "Federal Research and Development for Counter Terrorism: Organization, Funding, and Options." CRS Report RL31202. Abstract available from http://www.pennyhill.com/index.php?lastcat=200&catname=Terrorism&viewdoc=RL31202.
