Discussions of technology are often couched in terms of solving problems such as curing disease, providing for reliable food production, or affording efficient means of transportation. Indeed, technology has proved powerfully effective for solving any number of problems, from the massive project of sending people into space to the minor chore of fastening pieces of paper together. But in a 1966 article, atomic physicist Alvin M. Weinberg raised the following question: Are there some types of problems that cannot—or should not—be fixed by technology? Weinberg coined the term technological fix to describe the use of technology to respond to certain types of human social problems that are more traditionally addressed via political, legal, organizational, or other social processes. Although Weinberg advocated the use of technological fixes in some cases, the term has come to be used frequently as a pejorative by people critical of certain uses of technology.
Writing during the Cold War, Weinberg cited nuclear weapons as an example of a technological fix for war: the technological ability to unleash global devastation serves as a deterrent to international aggression. But critics argue that such a solution is at best tenuous, and at worst lessens people's resolve to work diplomatically at ameliorating the underlying clashes of ideology, economy, and culture that lead to war. Nuclear weapons also served as an alternative to maintaining a large standing army such as that of the Soviet Union, thus shifting social sacrifice from the less to the more democratically acceptable—from personal service to government investment in advanced technological weapons research and development. It is this aspect of technological fixes—their tendency to mask the symptoms of complex social problems without addressing their causes or true costs—that generally evokes ethical concern.
For example, if large numbers of children are being disruptive or having trouble concentrating in school, is the liberal prescription of psychotropic drugs a viable technological way to ease the problem, or does this simply allow parents and teachers to abdicate their responsibilities for good parenting and maintaining discipline, respectively? If employees are using company computers for personal business or entertainment, is installing software to monitor and curb such behavior a viable technological solution, or does this simply foster an atmosphere of distrust without addressing the causes of the problem, perhaps poor morale or inefficient tasking?
These are difficult questions because there are surely some children who could benefit from psychotropic drugs, and there are arguably certain situations in which an employer has a legitimate need to monitor an employee's use of the computer. But once such technological fixes become available, they run the risk of proliferating into universal easy ways out. Or they may simply shift the locus of the problem; in the case of company computers, monitoring software does not guarantee greater employee productivity, only that employees will not be unproductive in a particular way.
Despite these criticisms, sociologist Amitai Etzioni (1968) defended the use of what he called technological shortcuts. Etzioni argued that many of the concerns leveled against such shortcuts were based on conjecture rather than hard evidence. For example, when better lighting is installed on city streets in an effort to discourage crime, critics claim that this approach treats only the symptoms and does not do anything to address the underlying motivations for crime, nor does it necessarily reduce crime overall; rather, they claim, it just shifts the criminal activities to other locations. But while sounding plausible, such criticisms are typically unsupported by any definitive data. The questions to be asked in this example are, where do criminals go, and what do they do, when their previous stalking grounds are illuminated? "No one knows," writes Etzioni, but "[t]he one thing we do know is that the original 'symptom' has been reduced" (p. 45).
Etzioni also pointed to the deep-seated and intractable nature of many social problems, which suggests the near impossibility of ever implementing any comprehensive solutions via social transformation, particularly given fervent political disagreement about the propriety of various transformation strategies. Thus stopgap shortcuts may be the only recourse. "Often," writes Etzioni, "our society seems to be 'choosing' not between symptomatic (superficial) treatment and 'cause' (full) treatment, but between treatment of symptoms and no treatment at all" (p. 48).
The fundamental difficulty with technological fixes—or shortcuts—is the inherent incompatibility between problem and solution. Technologies are most useful for solving specific, well-defined, and stationary problems, such as how to get cars from one side of a river to the other (for example, using bridges). In contrast, social problems, such as crime, poverty, or public health, are broad, ill-defined, and constantly evolving. Weinberg, like Etzioni, was not naïve about this difficulty, writing, "Technological Fixes do not get to the heart of the problem; they are at best temporary expedients; they create new problems as they solve old ones" (p. 8).
BYRON P. NEWBERRY
SEE ALSO Science, Technology, and Society Studies.
Etzioni, Amitai. (1968). "Shortcuts to Social Change?" Public Interest 12: 40–51. A sociologist argues for the use of technological means to treat the symptoms of pressing social problems.
Weinberg, Alvin M. (1966). "Can Technology Replace Social Engineering?" Bulletin of the Atomic Scientists 22(10): 4–8. An atomic scientist discusses the merits of using technology to eliminate or attenuate social problems as an alternative to pursuing the more difficult strategy of changing prevalent social attitudes.