Ethical responsibility is one of the most commonly employed concepts in discussing the ethics of science and technology. Scientists have obligations for the "responsible conduct of research." The professional responsibility of engineers calls for attending to the public safety, health, and welfare consequences of their work. Entrepreneurs have responsibilities to commercialize science and technology for public benefit, and the public itself is often called on for the responsible support of science and technology. Consumers are admonished to be responsible users of technology. Yet the abstract noun responsibility is no more than 300 years old and has emerged to cultural and ethical prominence in association with modern science and technology from diverse legal, social, professional, religious, and philosophical perspectives.
Responsibility in Law

The legal term for responsibility is liability. Law makes explicit certain customary understandings of liability in two areas: criminal law and civil law. Criminal law deals with those offenses prosecuted and punished by the state. Civil law includes breaches of explicit or implicit contract in which injured parties may sue for compensation or damages.
Criminal liability was originally construed to follow simply from a transgression of the external forum of the law—doing something the law proscribes or not doing something it prescribes. But as it developed in Europe under the influence of a Christian theology of sin, which stresses the importance of inner consent, criminal liability was modified to include appreciation of the internal forum of intent. The result is a distinction between unintended transgressions such as accidental homicide and intentional acts such as first-degree murder; punishments for the former are less severe than for the latter.
In contrast to the historical development of restrictions on criminal liability, civil liability has expanded in scope through reductions in the requirements for intentionality. Civil liability can be incurred by contract or it can be what is called "strict liability." In the case of explicit or implicit contract, intentional fault or negligence (a kind of failure of intention) must be proved. In the case of strict liability there need be no fault or negligence per se.
The concept of strict or no-fault liability as a special kind of tort for which the civil law provides redress developed in parallel with modern industrial technology. In premodern Roman law, for instance, an individual could sue for damages only when losses resulted from intentional interference with person or property, or from negligence. By contrast, in the English common law case of Rylands v. Fletcher, decided on appeal by the House of Lords in 1868, John Rylands was held liable for damages caused by his industrial undertakings despite their unintentional and nonnegligent character. Rylands, a mill owner, had constructed a water reservoir to supply his mill. Water from the reservoir inadvertently leaked through an abandoned mine shaft to flood Thomas Fletcher's adjacent mine. Although he admitted that Rylands did not and perhaps could not have known about the abandoned mine shaft, Fletcher sued for damages. The eventual ruling in his favor argued that the building of a reservoir, which raised the water above its "natural condition," in itself posed a hazard for which Rylands must accept responsibility.
In the early twenty-first century, the most common kinds of civil liability are just such no-fault or prima facie liabilities related to "nonnatural" industrial workplaces and consumer products in which activities or artifacts in themselves, independent of intent, pose special hazards. In the United States one of the key cases establishing this principle was that of Greenman v. Yuba Power Products, Inc., decided on appeal by the California Supreme Court in 1963. In the words of Chief Justice Roger Traynor, writing for the majority:
A manufacturer is strictly liable in tort when an article he places on the market ... proves to have a defect that causes injury to a human being. ... The purpose of such liability is to insure that the costs of injuries resulting from defective products are borne by the manufacturers ... rather than by the injured persons who are powerless to protect themselves.
Responsibility in Theology

The term responsibility derives from the Latin respondēre, meaning "to promise in return" or "to answer." As such it readily applies to what is perhaps the primordial experience of the Judeo-Christian-Islamic tradition: a call from God that human beings accept or reject. Given this reference—together with its regular embodiment in the "responsorials" of liturgical practice—it is remarkable that the term did not, until the twentieth century, play any serious role in European religious-ethical traditions.
The discovery and development of religious responsibility have again paralleled the rising appreciation of ethical issues emerging from science and technology. It was in opposition, for instance, to notions of secularization and control over nature that the Protestant theologian Karl Barth (1886–1968) distinguished between worldly and transcendent relationships. God is the wholly other, the one who cannot be reached by scientific knowledge. There is thus a radical difference between the human attempt to reach God (which Barth calls religion) and the human response to God's divine revelation (a response Barth identifies as faith). In his Church Dogmatics (1932) Barth goes so far as to identify goodness with responsibility in the sense of responding to God.
Catholic theologians have been no less ready to make responsibility central to ethics. For the Canadian Jesuit Bernard Lonergan (1904–1984), "Be responsible" is a transcendental precept coordinate with duties to "Be attentive," "Be intelligent," and "Be reasonable." Responsibility also plays a prominent role in the documents of Vatican II. At one point, after referencing the achievements of science and technology, Gaudium et Spes (1965) adds that, "With an increase in human powers comes a broadening of responsibility on the part of individuals and communities" (no. 34). Later, this same document on the church in the modern world suggests that, "We are witnesses of the birth of a new humanism, one in which man is defined first of all by his responsibility toward his brothers and toward history" (no. 55).
The most sustained effort to articulate a Christian ethics of responsibility is, however, that of H. Richard Niebuhr's The Responsible Self (1963). In this work Niebuhr contrasts the Christian anthropology of the human-as-answerer to the secular anthropologies of human-as-maker and human-as-citizen. For human-as-maker, moral action is essentially consequentialist and technological. For human-as-citizen, morality takes on a distinctly deontological character. With human-as-answerer, the tension between consequentialism and deontology is bridged by responsiveness to a complex reality, by an interpretation of the nature of this reality—and by an attempt to fit in, to act in harmony with what is already going on. "What is implicit in the idea of responsibility is the image of man-the-answerer, man engaged in dialogue, man acting in response to action upon him" (p. 56). Niebuhr's ethics of responsibility is what might now be called an ecological ethics.
Responsibility in Philosophy
The turn to responsibility in philosophy, like that in theology, exhibits two faces: first, a reaction to the challenge posed by the dominance of scientific and technological ways of thinking; and second, an attempt to take into account the rich and problematic complexity of technological practice. The first is prominent in Anglo-American analytic discourse, the second in European phenomenological traditions of thought.
According to Richard McKeon (1957), interest in the concept of responsibility can be traced to diverse philosophical backgrounds, one of which is the Greek analysis of causality (or imputability) and punishment (or accountability) for actions. As McKeon initially notes: "Whereas the modern formulation of the problem [of responsibility] begins with a conception of cause derived from the natural sciences and raises questions concerning the causality of moral agents, the Greek word for cause, aitia (like the Latin word causa), began as a legal term and was then extended to include natural motions" (pp. 8–9). But it was in efforts to defend moral agency against threats from various forms of scientific materialism that the term became prevalent in analytic philosophy. For instance, H. L. A. Hart's (1968) distinctions among four kinds of responsibility—role, causal, liability, and capacity—are all related to issues of accountability as they arise in a legal framework, where they can help articulate a theory of punishment able to meet the challenges posed by modern psychology.
McKeon's general thesis is that the term responsibility appeared in late-eighteenth and early-nineteenth-century moral and political discourse—as an abstract noun derived from the adjective responsible—in coordination with the expansion of democracy. But there are also numerous historical connections between the rise of democracy and the development of modern technology. On the theoretical level, the possessive individualism of homo faber, developed by Thomas Hobbes and John Locke, prepared the way for democracy and the new industrial order. On the practical level, democratic equality and technology clearly feed off one another.
But the connection goes deeper. According to McKeon, responsibility was introduced into the political context because of the breakdown of the old social order based on hierarchy and duty, and the inability of a new one to function based strictly on equality and self-interest. Whereas the former was no longer supported by the scientific worldview, the latter led to the worst exploitative excesses of the Industrial Revolution. To address this crisis there developed the ideal of responsibility, in which individuals not only pursue their own self-interest but also try to recognize and take into account the interests of others.
Something similar was called for by industrial technology. Good artisans, who dutifully followed the ancient craft traditions, were no longer enough, yet neither should they just be turned loose to invent as they pleased. Thomas Edison, after creating a vote-recording machine for a legislature, in which he subsequently discovered the legislature had no interest, resolved never again to invent simply what he thought the world needed without first consulting the world about what it wanted. The new artisan must learn to respond to a variety of factors—the material world, the economy, consumer demand, and more. This is what turns good artisans into responsible inventors and engineers. As their technological powers increase, so will their need to respond to an increasing spectrum of factors, to take more into account. Carl Mitcham (1994) has described this as a duty plus respicere (from the Latin respicere, to look around at or take into account): a duty to include more in one's circumspection.
Another argument to this effect is provided by John Ladd (1981) who, in considering the situation of physicians, argues that the expansion of biomedical technology has increased the private practitioner's dependence on technical services and undermined professional autonomy. Moral problems concerning physicians and society can no longer rest on an ethics of roles but involve the ethics of power, "the ethical side of [which] is responsibility" (p. 42).
The metaphysical elaboration of responsibility has taken place primarily in European philosophy. Lucien Lévy-Bruhl's treatise The Idea of Responsibility (1884) is its starting point. After sketching a history of the idea from antiquity to the late nineteenth century, Lévy-Bruhl expresses surprise that a concept so basic to morality and ethical theory had not previously been subject to systematic investigation, especially since it is also manifested in a variety of ways across the whole spectrum of reality. There is responsibility or responsiveness at the level of physical matter, as atoms and molecules interact or respond to each other. Living organisms are further characterized by a distinctive kind of interaction with or responsiveness to their environments and each other.
Extending this metaphysical interpretation, Hans Jonas (1984), another philosopher in the European tradition, explored its implications for science and technology. Responsibility is not a central category in previous ethical theory, Jonas argued, because of the narrow compass of premodern scientific knowledge and technological power. "The fact is that the concept of responsibility nowhere plays a conspicuous role in the moral systems of the past or in the philosophical theories of ethics." The reason is that "responsibility ... is a function of power and knowledge," which "were formerly so limited" that consequences at any distance "had to be left to fate and the constancy of the natural order, and all attention focused on doing right what had to be done now" (p. 123).
All this has decisively changed. Modern technology has introduced actions of such novel scale, objects, and consequences that the framework of former ethics can no longer contain them. ... No previous ethics had to consider the global condition of human life and the far-off future, even existence, of the race. These now being an issue demands ... a new conception of duties and rights, for which previous ethics and metaphysics provide not even the principles, let alone a ready doctrine. (pp. 6 and 8)
The new principle thus made necessary by technological power is responsibility, and especially a responsibility toward the future.
What for Jonas functions as a deontological principle, Caroline Whitbeck (1998) has argued may also name a virtue. When children are described as reaching "an age of responsibility," this indicates that they are able to "exercise judgment and care to achieve or maintain a desirable state of affairs" (p. 37). Acquiring the ability to exercise such judgment is to become responsible. At the same time, the term responsibility continues to name distributed obligations to practice such a virtue derived either from interpersonal relationships or from special knowledge and powers. "Since few relationships and knowledge are shared by everyone, most moral responsibilities are special moral responsibilities, that is, they belong to some people and not others" (p. 39).
Consideration of the special responsibilities that belong to scientists and engineers has been a major theme in advancing discussions of science, technology, and ethics. Although overlapping, these two discussions have nevertheless mostly taken place among different professional groups.
Responsibility in Science

Efforts to define the social responsibility of scientists have involved a refinement of the representative Enlightenment view that science has the best handle on truth and is thus essentially and under all conditions beneficial to society. From such a perspective, the primary responsibility of scientists is simply to pursue and extend their disciplines.
Historically this responsibility found expression in Isaac Newton's hope for science as theological insight, Voltaire's belief in its absolute utility, and Benedict de Spinoza's thought that in science one possesses something pure, unselfish, self-sufficient, and blessed. A classic manifestation is the great French Encyclopédie (1751–1772), which sought "to collect all the knowledge that now lies scattered over the face of the earth, to make known its general structure to the men among whom we live, and to transmit it to those who will come after us." Such a project, wrote Denis Diderot, demands "intellectual courage."
The questioning of this tradition has roots in the Romantic critique of scientific epistemology and industrial practice, but did not receive a serious hearing among scientists themselves until after World War II. Since then one may distinguish three phases.
PHASE ONE: RECOGNIZING RESPONSIBILITIES. In December 1945 the first issue of the Bulletin of the Atomic Scientists led off with a statement of the goals of the newly formed Federation of Atomic (later American) Scientists. Members should "clarify ... the ... responsibilities of scientists in regard to the problems brought about by the release of nuclear energy" and "educate the public [about] the scientific, technological, and social problems arising from the release of nuclear energy." Previously scientists would have described their responsibilities as restricted to doing good science, not falsifying experiments, and cooperating with other scientists. Now, because of the potentially disastrous implications of at least one branch of science, scientists felt their responsibilities enlarge. They were called on to take into account more than the procedures of science; they must respond to an expanded situation.
The primary way that atomic scientists responded over the next decade to the new situation created by scientific weapons technology was to work to place nuclear research under civilian control in the United States and, beyond that, to subordinate national control to international control. They did not, however, oppose the unprecedented growth of science. As Edward Teller wrote in 1947, the responsibility of the atomic scientists was not just to educate the public and help it establish a civilian control that would "not place unnecessary restrictions on the scientist"; it was also to continue to pursue scientific progress. "Our responsibility," in Teller's words, "is [also] to continue to work for the successful and rapid development of atomic energy" (p. 355).
PHASE TWO: QUESTIONING RESPONSIBILITY. During the mid-1960s and early 1970s, a second-stage questioning of scientific responsibility emerged. Initially this questioning arose in response to the growing recognition of the problem of environmental pollution—a problem that could not plausibly be alleviated simply by demilitarizing science or by increasing democratic control. Some of the worst environmental problems are caused precisely by democratic availability and use—as with pollution from automobiles, agricultural chemicals, and aerosol sprays, not to mention the mounting burden of consumer waste disposal. Rachel Carson's Silent Spring (1962) was an early statement of the problem that called for an internal transformation of science itself. But an equally focal experience in this second-stage movement toward an internal restructuring of science was the Asilomar Conference of 1975, which addressed the dangers of recombinant DNA research.
After Asilomar, the dangers of recombinant DNA research turned out to be neither as immediate nor as great as feared, and some members of the scientific community became resentful of post-Asilomar agitation—although others actually argued for even more stringent guidelines than those proposed (Sinsheimer 1976, 1978). The increased potential consequences of research nevertheless again broadened the scope of what could be debated as the proper responsibility of scientists. Robert L. Sinsheimer, for instance, himself a respected biological researcher and chancellor of the University of California, Santa Cruz, argued that modern science was based on two faiths. One is "a faith in the resilience of our social institutions ... to adapt the knowledge gained by science ... to the benefit of man and society more than the detriment"—a faith that "is increasingly strained by the acceleration of technical change and the magnitude of the powers deployed" (Sinsheimer 1978, p. 24). But even more telling is
a faith in the resilience, even in the benevolence, of Nature as we have probed it, dissected it, rearranged its components in novel configurations, bent its forms, and diverted its forces to human purpose. The faith that our scientific probing and our technological ventures will not displace some key element of our protective environment, and thereby collapse our ecological niche. A faith that Nature does not set booby traps for unwary species. (Sinsheimer 1978, p. 23)
This new argument was commensurate with the development of what Jerome R. Ravetz (1971) saw as the replacement of "academic science" by "critical science"—which is in turn related to what others have termed public interest science. Or as William W. Lowrance (1985) argued, beyond responsibility in the first-stage sense, there is a need to incorporate in science itself what he referred to as principles of "stewardship."
PHASE THREE: REEMPHASIZING ETHICS. The attempt to transform science from within was overtaken in the mid-1980s by a new external criticism not of scientific products (knowledge) but of scientific processes (methods). A number of high-profile cases of scientific misconduct raised questions about whether public investments in science were being wisely spent. Were scientists simply abusing a public trust? Moreover, some economists began to question whether, even insofar as scientists did not abuse the public trust, but followed ethical research practices—which was surely mostly the case—scientific research was as much of a stimulus to economic progress as had been thought.
The upshot was that the scientific community undertook a self-examination of its ethics and its efficiency. Efforts to increase ethics education, or education in what became known as the responsible conduct of research, became required parts of science education programs, especially in the biomedical sciences at the graduate level. And increased efficiency in grant administration and management became issues for critical assessment. Since the 1990s scientists have increasingly been understood to possess social responsibilities that include the promotion of ethics and efficiency in the processes of doing science.
At the same time, scientists have also attempted to reemphasize the importance of science to national health care, the economy, environmental management, and defense. In the face of the AIDS epidemic, biomedical research presents itself as the only answer. Computers and biotechnologies are offered as gateways to new international competitive advantage and the creation of whole new sectors of jobs. Global climate change, it is argued, can be adequately assessed only by means of computer models and the science of complexity. Finally, especially since 9/11, new claims have been made for science as a means to develop protections against the dangers of international terrorism. The social responsibility of science is defended as the ethically guided production of knowledge that addresses a broad portfolio of social needs: the promotion of health, the creation of jobs, the protection of the environment, and the defending of Western civilization.
Responsibility in Engineering

Applied science professionals such as technologists and engineers are more subject than scientists to both external (legal, political, or economic) and internal (ethical) regulation. Indeed, engineers have since the early twentieth century attempted to formulate explicit principles of professional responsibility—precisely because of the technological powers they wield. Historically, similar discussions did not originate among scientists until the second half of the twentieth century, and scientific organizations remain in the early twenty-first century less likely to have formal codes of conduct than engineering associations.
Engineering associations aspire to the formulation of codes of conduct similar to those found in medicine or law. But unlike medicine, which is ordered toward health, or law, the end of which is justice, it is less obvious precisely what constitutes the engineering ideal that could serve as the basis for a distinctive internalist ethics of responsibility. The original engineer (Latin ingeniator) was the builder and operator of battering rams, catapults, and other "engines of war." Engineering was originally military engineering. As such, the power of engineers, no matter how great, was significantly less than the organized strength of the army as a whole. Moreover, as with all other soldiers, their behavior was guided primarily by their obligations to obey hierarchical authority.
The eighteenth-century emergence of civil engineering in the design of public works such as roads, water supply and sanitation systems, lighthouses, and other nonmilitary infrastructures did not initially alter this situation. Civil engineers were only small contributors to larger processes. But as technological powers in the hands of engineers began to enlarge, and the number of engineers increased, tensions mounted between subordinate engineers and their superiors. The manifestation of this tension is what Edwin T. Layton Jr. (1971) called the "revolt of the engineers," which occurred during the late nineteenth and early twentieth centuries. It is in association with this revolt and its aftermath that responsibility enters the engineering ethics vocabulary.
One influential if failed effort at formulating engineering responsibility led to what was known as the technocracy movement and its idea that engineers more than politicians should wield political power. Henry Goslee Prout, a former military engineer who had become general manager of the Union Switch and Signal Company, speaking before the Cornell Association of Civil Engineers in 1906, described the profession in just such leadership terms: "The engineers more than all other men, will guide humanity forward. ... On the engineers ... rests a responsibility such as men have never before been called upon to face" (quoted in Akin 1977, p. 8). At the height of this dream of expanded engineering responsibility, Herbert Hoover became the first civil engineer to be elected president of the United States, and an explicit technocracy movement fielded its own candidates for elective office. The ideology of technocracy sought to make engineering efficiency an ideal analogous to medical health and legal justice.
During World War II a different shift took place in the engineering conception of responsibility: not from company and client loyalty to technocratic efficiency but from private to public loyalty. A chastened version of responsibility nevertheless emphasized the potential for opposition between social and corporate interests. Having failed in trying to be responsible for everything, engineers came to debate the scope of more limited responsibilities—to themselves, to employers, and to the public. The need for this debate is still clearly dictated by the powers at their command and the problems such powers pose, even though it is not obvious that engineering entails responsibilities of any specific character.
With engineering under attack as a cause of environmental pollution, for the design of defective consumer goods, and as too willing to feed at the trough of the defense contract, one American engineer writing in the mid-1970s summed up the situation as follows. He first admitted that,
Unlike scientists, who can claim to escape responsibility because the end results of their basic research can not be easily predicted, the purposes of engineering are usually highly visible. Because engineers have been claiming full credit for the achievements of technology for many years, it is natural that the public should now blame engineers for the newly perceived aberrations of technology. (Collins 1973, p. 448)
In other words, engineers had oversold their responsibilities and were being justly criticized. The responsibilities of engineers are in fact quite limited. They have no general responsibilities, only specific or special ones:
There are three ways in which the special responsibility of engineers for the uses and effects of technology may be exercised. The first is as individuals in the daily practice of their work. The second is as a group through the technical societies. The third is to bring a special competence to the public debate on the threatening problems arising from destructive uses of technology. (Collins 1973, p. 449)
This debate, formalized in various technology assessment methodologies and governmental agencies, can be read as a means of subordinating engineers to the larger social order. In comparing responsibility in engineering with responsibility in science, it may thus appear that there has been more of a contraction than an expansion. Yet the issue of responsibility has so intensified that engineers now consciously debate the scope of their responsibilities in relationship to issues not previously acknowledged.
Too Much Responsibility?
One common worry about certain technologies is that they undermine human responsibility. For instance, reliance on computers in medical diagnostics or strategic missile defense systems transfers some decision-making responsibilities from human beings to computers. But the same computer systems that assume practical responsibility for diagnosis or defense call for the exercise of a higher ideal of responsibility in their design and deployment. It is precisely because modern technology calls for so much responsibility at the ideal level that observers can be so sensitive to the issue at the practical level. It is not at all clear, for instance, that computers have in any way deprived human beings of responsibilities they formerly had. What physicians of the early nineteenth century would have been responsible for diagnosing and then treating the array of obscure diseases for which twenty-first-century physicians are held accountable? It is more likely that new technologies create certain responsibilities that those same technologies can also be configured to help discharge.
But this raises a question: Are the responsibilities thus called forth truly reasonable? From the perspective of prudence, one should not take on, or give to another, too much responsibility. To do so is to invite failure if not disaster. Although exact boundaries are not easy to determine in advance, once overstepped they are difficult to reestablish. In light of this principle of prudence, then, one must ask: Can the principle of responsibility, and those who are called to live up to it, really bear the added burden being placed on it and them by contemporary science and technology?
BIBLIOGRAPHY
Akin, William E. (1977). Technocracy and the American Dream: The Technocrat Movement, 1900–1941. Berkeley: University of California Press.
Collins, Frank. (1973). "The Special Responsibility of Engineers." In "The Social Responsibility of Engineers," ed. Harold Fruchtbaum. Spec. issue, Annals of the New York Academy of Sciences 196(10): 448–450.
Durbin, Paul T., ed. (1987). Technology and Responsibility. Philosophy and Technology, vol. 3. Dordrecht, Netherlands: D. Reidel. Seventeen papers from a conference; includes an annotated bibliography on the theme.
Greenman v. Yuba Power Products, Inc., 59 Cal. 2d 57 (1963).
Jonas, Hans. (1984). The Imperative of Responsibility: In Search of an Ethics for the Technological Age, trans. Hans Jonas and David Herr. Chicago: University of Chicago Press.
Ladd, John. (1981). "Physicians and Society: Tribulations of Power and Responsibility." In The Law-Medicine Relation: A Philosophical Exploration, ed. Stuart F. Spicker, Joseph M. Healey Jr., and H. Tristram Engelhardt Jr. Dordrecht, Netherlands: D. Reidel.
Lévy-Bruhl, Lucien. (1884). L'idée de responsabilité [The idea of responsibility]. Paris: Hachette.
Lowrance, William W. (1985). Modern Science and Human Values. New York: Oxford University Press.
McKeon, Richard. (1957). "The Development and the Significance of the Concept of Responsibility." Revue Internationale de Philosophie 11(39): 3–32.
Mitcham, Carl. (1987). "Responsibility and Technology: The Expanding Relationship." In Technology and Responsibility, ed. Paul T. Durbin, pp. 3–39. Dordrecht, Netherlands: D. Reidel. The present analysis is based on this earlier work.
Mitcham, Carl. (1994). "Engineering Design Research and Social Responsibility." In Kristin Shrader-Frechette, Ethics of Scientific Research, pp. 153–168. Lanham, MD: Rowman and Littlefield.
Niebuhr, H. Richard. (1963). The Responsible Self. San Francisco: Harper and Row.
Ravetz, Jerome R. (1971). Scientific Knowledge and Its Social Problems. Oxford: Clarendon Press.
Rylands v. Fletcher, LR 3 HL 330 (1868).
Sinsheimer, Robert L. (1976). "Recombinant DNA—on Our Own." BioScience 26(10): 599.
Sinsheimer, Robert L. (1978). "The Presumptions of Science." Daedalus 107(2): 23–35.
Teller, Edward. (1947). "Atomic Scientists Have Two Responsibilities." Bulletin of the Atomic Scientists 3(12): 355–356.
U.S. Committee on Science, Engineering, and Public Policy. Panel on Scientific Responsibility and the Conduct of Research. (1992–1993). Responsible Science: Ensuring the Integrity of the Research Process. 2 vols. Washington, DC: National Academy Press.
Whitbeck, Caroline. (1998). Ethics in Engineering Practice and Research. Cambridge, UK: Cambridge University Press.
"Responsibility: Overview." Encyclopedia of Science, Technology, and Ethics. . Encyclopedia.com. (November 14, 2018). https://www.encyclopedia.com/science/encyclopedias-almanacs-transcripts-and-maps/responsibility-overview
"Responsibility: Overview." Encyclopedia of Science, Technology, and Ethics. . Retrieved November 14, 2018 from Encyclopedia.com: https://www.encyclopedia.com/science/encyclopedias-almanacs-transcripts-and-maps/responsibility-overview