Technological innovation has been a leading agent of social change, worldwide, since the late 1700s, serving as the conduit into society of developments in science and technology. As such, it has been at the center of ethical issues ranging from the morality and justice of the early Industrial Revolution to the consequences of genetic engineering, nanotechnology, and artificial intelligence (AI). In spite of its extraordinarily high social visibility, however, innovation is almost universally misunderstood and misrepresented, typically as synonymous with invention. Invention, in turn, is presented as a value-free, hence ethically neutral, application of new or existing technical knowledge. Treating innovations as inventions implies that ethical issues associated with their implementation derive not from factors intrinsic to innovations, but from how society chooses to implement them. Such an interpretation frees innovators from moral responsibility for the ethically problematic consequences of their activities, as well as buffering these activities from public assessment.
What Innovation Is
Innovation is a social process in which technical knowledge and inventions are selectively exploited on behalf of (corporate or government) institutional agendas driven by marketplace values or political policies. Inventions, and more broadly scientific and engineering expertise, are merely raw materials for technological innovation, which is the value-laden, ethically provocative process that determines whether an invention is introduced into a society, the form in which it is introduced, and the direction of its subsequent development as society responds to the innovation. The introductions of the automobile, television, nuclear power plants, and the Internet are examples of the value-laden innovation process, including how societal responses feed back into the course of innovation developments over time.
Conceptual Emergence and Practical Engagement
The beginning of the twentieth century saw leading economists focused on determining the conditions for supply-demand equilibrium. For Austrian economic theorist Joseph Schumpeter (1883–1950), however, what needed to be analyzed was not equilibrium but the disequilibrium created by economic growth. Looking back over the nineteenth century and the first decade of the twentieth, Schumpeter argued that entrepreneurship in combination with technological innovation—that is, risking capital by creating new businesses that transform inventions into innovations—was the engine of economic growth in modern societies. This combination of innovation and entrepreneurship created new wealth, destroyed old wealth, and created new concentrations of social and political power. Schumpeter defended what he called the creative destruction that often accompanied implementing innovations. The creation of the synthetic dye, electric power, and automotive industries, for example, undermined established industries based on natural dyes, steam and water power, and horse-drawn transportation. Businesses were indeed destroyed, jobs were lost, and people suffered, but, Schumpeter claimed, better businesses were created, employing more people in better jobs. Schumpeter eventually also defended the wasteful and often frivolous character of the combination of innovation and entrepreneurship in an industrial capitalist environment driven by opportunistic profit-seeking.
After World War I, individual thinkers, among them the American economist Thorstein Veblen (1857–1929) and future U.S. president Herbert Hoover (1874–1964), argued that technological innovation would be central to national security and industrial competitiveness. Only in Germany, however, was there a strong national commitment to an innovation-driven military and industrial agenda, initiated by Otto von Bismarck in the 1860s and developed further by all subsequent German governments, especially the National Socialists. In the United States and Great Britain, by contrast, calls for such national commitments were repeatedly rejected. For example, George Ellery Hale (1868–1938), one of the world's leading astronomers and the person responsible for maintaining America's leadership in telescopy from 1897 into the 1980s, failed in his attempt to win government acceptance of his plan to harness academic scientists to the nation's war effort during World War I. He failed again in his postwar attempt to create a national research foundation to be cosponsored by the federal government and major corporations.
World War II changed all this. The role that technology and science played in waging and winning the war for the Allies, especially the role of the U.S. Office of Scientific Research and Development (OSRD) headed by Vannevar Bush (1890–1974), led, if anything, to an overestimation of the power of innovation in the postwar period. In his report titled Science: The Endless Frontier (1945), Bush argued that U.S. industrial prosperity and military security would in the future be critically dependent on continuous science-based technological innovation. The federal government needed to create mechanisms for government-subsidized basic research, primarily at universities, to feed the commercial innovation process. For Bush, this was the lesson of such OSRD accomplishments as the Manhattan Project, of the Massachusetts Institute of Technology's (MIT) Radiation Laboratory or RadLab that produced a constant stream of electronic warfare and counterwarfare technologies, and of mass-produced cheap antibiotics and blood products. Yet as Bush later acknowledged, this push or linear model, in which basic research leads to applied science, which then leads to commercial technological innovations, overestimates the dependence of innovation on basic science. This view was confirmed by Project Hindsight (1966), a Department of Defense study of twenty weapons systems introduced since 1946, which concluded that basic science affected less than 10 percent of these systems. A follow-up study by the National Science Foundation (NSF), TRACES (Technology in Retrospect and Critical Events in Science), defended the basic research-driven model in the Bush report by looking back fifty years instead of twenty.
Since 1970 research by historians of technology has supported a version of the Project Hindsight conclusion. While basic research sometimes pushes innovation, innovation far more often pulls research, which may then enable further innovation. The exponential growth of innovation in the semiconductor and computer industries exemplifies this relationship.
Bush's report and its basic science push model nevertheless anchored postwar U.S. science and technology policy. For the first time in U.S. history, there was a mandate for large-scale federal support of basic as well as applied scientific research. The ethics of giving scientists public funds to do research on subjects of their choice gave rise to contentious political debates that delayed the creation of the NSF until 1950. But the NSF budget for basic research was then and has remained modest compared to the budgets for applied research linked to innovation, which until 1989 was driven primarily by Cold War military agendas and secondarily by the evolving war on cancer, war on AIDS, and Human Genome Project agendas of the National Institutes of Health (NIH) and the U.S. space program.
In the 1960s leading political figures, including Presidents John F. Kennedy, Lyndon B. Johnson, and Richard M. Nixon, promoted innovation as the key to U.S. economic growth. In 1962 President Kennedy explicitly identified industrial innovation as the source of new jobs and new wealth that would be shared by all. But it was only in the 1970s and after, in the wake of the Silicon Valley phenomenon and the astonishing pace of wealth creation in the semiconductor and computer industries, that a national consensus recognized the civilian economy as critically dependent on innovation for growth. It was in the 1960s and 1970s that Schumpeter's identification of innovation and entrepreneurship as engines of economic growth was rediscovered. It had sparked little interest when published in 1911 or even after Schumpeter's move to Harvard University in the 1930s. Nor did University of Chicago economist Frank Knight (1885–1972) stimulate interest in the link between innovation and entrepreneurship with his pioneering 1921 study of the dynamic role played by risk in creating new businesses. Knight coupled a penetrating analysis of the economics of innovation-driven entrepreneurship to a stinging moral critique of the wastefulness of innovation in a capitalist economy. The importance of the ideas of Schumpeter and Knight would be appreciated only when innovation had engaged the general political consciousness and conscience. Early-twenty-first-century American economist Paul Romer is an influential neo-Schumpeterian, arguing that growth is generated by ideas of which innovation is a symptom and defending the virtues of the unmanaged U.S. innovation model over the managed innovation models in Japan and East Asia.
The Ethics of Innovation
Recognition of the scale and scope of innovation-enhancing policies provoked broad criticism of the social and ethical implications of the dependence of society on innovation. Jacques Ellul in The Technological Society (1954), for instance, argued that such dependence reflected a gamble that would compel societies to transform themselves into vehicles for supporting continuous innovation at the expense of traditional personal and social values. Ellul's ethical and political critique of technology-based society attracted many followers who developed it further in the 1960s and 1970s, and were significantly responsible for the creation of university-based science, technology, and society (STS) studies programs as an academic response to the new institutionalization of innovation by government and industry. Alvin Toffler's Future Shock (1970) was a more popular caution against and criticism of the personal as well as social disorientation caused by continuous innovation. Its commercial success suggests that it struck a responsive chord of concern in the general public, which nevertheless embraced the flood tide of innovations affecting every aspect of personal and social life, locally, nationally, and globally, that poured into the marketplace during the last third of the twentieth century.
By the turn of the twenty-first century, that economic prosperity was keyed to continuous technological innovation in a global competitive environment was enshrined as an ineluctable fact, a principle of nature, a kind of categorical imperative. Innovate or stagnate not just economically, but culturally as well. Open to serious debate in principle were such questions as whether innovation-induced social change constituted true growth or was just change; whether such change was progressive, improving the quality of life, or just sound and fury busyness signifying nothing very deep. Yet public debates on such questions rarely took place. What was broadly recognized as inescapable, though, was that the innovation-driven economic growth process institutionalized after World War II and adopted globally by 2000 was characterized by a kind of positive feedback. Only continuous growth was possible; stasis, with the loss of the expectation of growth, threatened economic collapse.
Meanwhile the accumulated scholarship of the STS studies community generated new insights into the innovation process. Contrary to the inherited wisdom that technical knowledge was value-free, innovation is in fact ethically preloaded. Innovations enter the marketplace incorporating a broad range of value judgments primarily determined by the agendas of the commercial institutions and governmental agencies pursuing innovation on behalf of those agendas. The so-called negative externalities of innovation—including Schumpeter's creative destruction of superseded technologies along with their institutions, facilities, and people—also include negative environmental impacts, the introduction of new forms of personal and social life, and the creation of new vested economic, social, and political interest groups and power centers, each committed to perpetuating itself. All such concomitants of innovation raise ethical concerns that dwarf the public processes available for addressing them.
Organizational theorist and Nobel economics laureate Herbert Simon noted in the 1960s that complex systems are by definition ones whose behaviors include unpredictable outcomes. Technological innovations often lead society to implement complex systems to support them. As a result, even with the best of corporate, governmental, and public intentions, it is impossible to predict in advance all of the consequences, negative or positive, of innovations in, for example, antibiotics, television, the Internet, and cell phones. Such unpredictability motivated Bill Joy—a cofounder of Sun Microsystems, its chief scientist, and a cocreator of the Java programming language—to issue a passionate call in 2000 for a moratorium on innovation in biotechnology, nanotechnology, and robotics. Joy's argument was that these three technologies were converging and had the potential for unpredictable consequences that posed profound threats to human survival. Joy stumped the nation warning academic, industrial, and public audiences of the potential for catastrophic harm from continuing our postwar policy of unfettered innovation followed by catch-up attempts at regulation as problems arose.
A similar moratorium had been argued for in 1974 by Paul Berg, inventor of recombinant DNA technology. Berg's call, following a year-long cessation of research in his own lab, led to the 1975 Asilomar Conference, which substituted heightened laboratory safeguards for a moratorium, and subsequently sanctioned a biotechnology innovation free-for-all. In the 1980s, Jeremy Rifkin and others attempted to block innovation in genetically modified food crops and plants, to little if any avail. Joy's call did provoke a substantial response within the technology community. Raymond Kurzweil, an eminent engineer-inventor, debated Joy on a number of occasions, orally and in print, championing unrestricted innovation as both progressive and capable of containing any unanticipated harmful consequences of innovation. In spite of rapid commercial development of biotechnology and nanotechnology industries at the start of the twenty-first century, the public was not engaged in the ethical issues raised by innovations that were under research and development, in the prototype stage, or being introduced into the marketplace.
STEVEN L. GOLDMAN
Bush, Vannevar. (1980). Science: The Endless Frontier. New York: Arno Press. Originally published in 1945, the seminal U.S. science policy document.
Chandler, Alfred Dupont. (1980). The Visible Hand. Cambridge, MA: Belknap Press. A history of American industry's exploitation of innovation.
Ellul, Jacques. (1967). The Technological Society. New York: Vintage Books. The most influential attack on modern technology as antidemocratic and antihuman.
Erwin, Douglas H., and David C. Krakauer. (2004). "Insights into Innovation." Science 304: 1117–1118.
Galbraith, John K. (1998). The Affluent Society. Boston: Houghton Mifflin. Originally published in 1958, a still powerful critique of consumerism and greed.
Hughes, Thomas P. (1989). American Genesis: A Century of Invention and Technological Enthusiasm, 1870–1970. Chicago: University of Chicago Press. An excellent history of technological innovation American style.
Mowery, David C., and Nathan Rosenberg. (1998). Paths of Innovation. Cambridge, UK, and New York: Cambridge University Press. An in-depth examination of four industries created by innovation.
Rosenberg, Nathan. (1983). Inside the Black Box: Technology and Economics. Cambridge, UK, and New York: Cambridge University Press. An economist's analysis of the relation between invention, innovation, commerce, and society.
Schumpeter, Joseph. (1983). Theory of Economic Development: An Inquiry into Profits, Capital, Credit, Interest, and the Business Cycle, trans. Redvers Opie. New Brunswick, NJ: Transaction Publishers. Expands Schumpeter's focus on innovation to a comprehensive theory of business cycles as inevitable in a growing economy.