One of the most startling and disturbing revelations from the Nuremberg Trials following World War II was the extent of the atrocities committed by the Nazi government in the name of genetics. Massive sterilization and euthanasia programs for those deemed genetically unfit ("lives not worth living" in Nazi phraseology) raised ethical issues about the use and abuse of science, and about the complicity of scientists, at a level equaled only by the development of the atomic bomb. By the end of the war, the participation of physicists in the development of such a massively destructive force had already brought moral and ethical issues to the fore in a painful way. In a similar way, biologists, geneticists in particular, who had participated in the worldwide eugenics movement between 1910 and 1940 faced the enormous ethical and moral consequences of their pursuits. The legacy for the genetics community continues to be felt in the twenty-first century. Since the end of World War II it has been impossible to maintain the conventional myth of science as an ivory-tower pursuit "following truth wherever it may lead." In Germany, many geneticists sought ways to work for the Nazi cause, whether in research on eugenics or on other biological problems such as germ warfare. In other countries, especially the United States, where eugenicists had struggled for over fifteen years to enact state compulsory sterilization laws for the "genetically unfit," there was widespread excitement (and a little envy) at the ease and rapidity with which German eugenicists and race hygienists passed a national eugenic sterilization law in 1933, only months after the Nazis took power.
This entry focuses on the ethical and moral issues raised by genetic technology during the whole of the twentieth century. The issues divide chronologically between the first and second halves of the century, associated with different genetic theories and technologies in each time period. Between 1900 and 1950, the science of genetics emerged as a professional field with enormous practical (agricultural, medical) and sociopolitical (race improvement) implications. This was the era in which heredity assumed center stage among the life sciences, beginning in 1900 with the rediscovery of Mendel's paper on hybridization in peas, originally published in 1866. As Mendel's generalizations were shown to apply to an ever-widening group of organisms, including humans, hopes ran high that at long last biologists would be able to solve the persistent problems of animal and plant breeding as well as understand the basis of many human physical and mental diseases. The new science was viewed with great esteem, as it also represented one of the first areas of biology to incorporate many of the characteristics of the physical sciences: experimentation, quantification, prediction, and mathematical analysis. Almost immediately, especially after 1910, the new Mendelian genetics was applied by a group of social reformers known as eugenicists to the solution of many seemingly intractable societal problems, from tuberculosis to "feeblemindedness," manic depression, alcoholism, criminality, pauperism, and sexual perversion.
In the second half of the twentieth century a new genetic technology, molecular genetics, deriving from elucidation of the molecular structure of DNA and the mechanisms of replication and protein synthesis, led to a revival of many of the same sorts of agricultural, medical, and social hopes that had inspired classical geneticists fifty years earlier. In both periods new and exciting work in the laboratory led to a view of genetics as a "magic bullet" that would solve a host of agricultural, medical, and social ills. The argument that in the early twenty-first century we are not in danger of falling into the errors of the past because "we now know so much more" has a certain validity, since we do in fact know a great deal more about genetic mechanisms and the genetic basis of many diseases than in the 1920s or 1930s. But it is also true that the use to which our knowledge—or our still partial knowledge—is put continues to confront the same social, ethical, and legal issues that were raised in the early twentieth century. Thus, an understanding of both the historical and philosophical underpinnings of genetics since that time will provide the basis for evaluating issues in the twenty-first century: What are valid claims for the genetic basis of various traits (Huntington's disease versus intelligence or criminality)? How should genetic information be used in the public and private spheres? What sorts of genetic information should be stored as parts of a person's medical record? Who should have access to that record and under what conditions? Should humans be cloned to supposedly replace lost loved ones? Should embryos be cloned solely for providing replacement organs? Should stem cells be cultivated for medical and genetic therapies? Many of the twenty-first century's issues involve technologies that could not have been imagined in the early twentieth century. 
Yet the experience of those who have enthusiastically embraced the genetic technology of the day (past or present) without considering the full social, moral, and ethical issues involved can serve both as a somber reminder that science must always be understood and used in its social context and as a guide for exploring current genetics.
Eugenics and the Ethical Issues of Selective Breeding (1900–1945)
The term eugenics, derived from the Greek eugenes, was first coined by the English mathematician and geographer Francis Galton (1822–1911) in his book Inquiries into Human Faculty and Its Development (1883) to refer to one born "good in stock, hereditarily endowed with noble qualities." As an intellectual and social movement in the early twentieth century, eugenics came to mean, in the words of one of its strongest American supporters, Charles B. Davenport (1866–1944), "the improvement of the human race by better breeding." For both Galton and Davenport, better breeding implied improving the hereditary quality of the human species by using the known scientific principles of heredity and selective mating. Eugenics movements were prominent in many countries of Western Europe, Canada, Latin America, and Asia, though they were strongest and most long-lived in the United States, Britain, Scandinavia, and Germany (where eugenic principles became one of the cornerstones of the Nazi state). Although some eugenics movements (France, Brazil) were based on non-Mendelian concepts of heredity (most notably neo-Lamarckism, or the inheritance of acquired characteristics), the majority operated within the new Mendelian paradigm: genes were seen as discrete hereditary units, each controlling a specific adult trait (such as eye color, height, or skin color, the so-called unit-character hypothesis). Although laboratory geneticists recognized by 1920 that genes often interacted with one another (epistasis), or that most genes affected a number of traits (pleiotropy), among eugenicists the unit-character concept held sway, especially in the United States, well into the 1930s.
Unlike genetic research on laboratory organisms (mice, fruit flies, corn), eugenic research involved studying inheritance in an organism that could not be bred under controlled conditions and that had so few offspring that statistically significant results were hard to come by. As a result, eugenicists were forced to use the correlation and family pedigree methods of investigation. Correlation studies were based on choosing particular traits, such as height, that could be correlated between groups of known genetic relatedness, for example parents and offspring. Strong correlations (0.7 or higher on a scale from 0 to 1.0) suggested a significant relationship, which eugenicists interpreted as demonstrating a strong genetic component. The problem with this method was that while it applied to groups, it provided no way of assessing the genetic influence in any individual case. Thus it was of little value in the long run for eugenical purposes, where the aim was to identify individuals who should be encouraged to breed (what was called positive eugenics) or discouraged or prevented from breeding (what was called negative eugenics). The alternative methodology, family pedigree studies, involved tracing a given trait (or traits) through numerous generations of a family line. The advantage of this method was that it provided data for a specific family, and if the family was large enough, some general predictions might be made for future reproductive decisions (especially for relatively clear-cut traits such as hemophilia or Huntington's disease). The disadvantage was that reliable information on the traits was often difficult to obtain, and families were often not large enough to provide statistically significant numbers. Where social and personality traits were concerned, both methods suffered from the problem of defining the conditions clearly and objectively, especially complex ones such as "criminality," "alcoholism," or "feeblemindedness."
In addition, neither method had any way to separate out the effects of environment (families share lifestyles, including diets, exercise, use of alcohol, and so on, as well as genes), and without knowing the details of both the genetic and environmental components, assessing the relative influence of each on the adult trait becomes exceedingly difficult. Such problems did not bother many eugenicists, however, who often made bold claims about the genetic basis of feeblemindedness (measured as scoring below 70 on the standard IQ tests of the day), alcoholism, criminality, even thalassophilia ("love of the sea").
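The limits of the correlation method are easy to see with a small numerical sketch. The figures below are hypothetical heights invented purely for illustration; the sketch computes a parent-offspring Pearson correlation and shows why even a very high value describes only the group as a whole, not how much of any one child's trait is due to genes rather than shared environment.

```python
from statistics import mean

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical mid-parent vs. offspring heights (cm); illustrative only.
parents  = [165, 170, 172, 168, 180, 175, 160, 178]
children = [167, 171, 175, 166, 182, 173, 163, 180]

r = pearson_r(parents, children)
# A high r describes the GROUP as a whole; it says nothing about how much
# of any single child's height is genetic rather than environmental.
print(round(r, 2))  # → 0.95
```

A eugenicist reading such a correlation as proof of a "strong genetic component" in any particular family was making exactly the leap from group statistic to individual case that the method cannot support.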
Eugenicists did not restrict their efforts solely to research. Many were actively interested or engaged in various educational and political activities. Many wrote popular articles or gave public lectures, and there were eugenics societies in most major industrialized countries. There were several major international eugenics organizations, and three international congresses (in 1912 in London, and in 1921 and 1932 in New York). On the political front, eugenicists lobbied successfully in Britain for the Mental Deficiency Act (1913) and in the United States for immigration restriction based on eugenic claims and for passage of compulsory eugenic sterilization laws in over thirty states. Eugenical sterilization became a major international eugenics crusade, including in Canada, Germany, and the Scandinavian countries. It is ironic that Germany was one of the last countries to adopt eugenical sterilization, doing so only after the Nazis came to power in 1933. By that time the United States was leading the world in the total number of eugenically based compulsory sterilizations (somewhere around thirty thousand). Germany ultimately sterilized over ten times that number before sterilization evolved into euthanasia. In all cases, however, one of the main (though not the only) arguments for sterilization was to prevent the individual from passing on his or her defective genes to the next generation. (Other frequently cited reasons included the claim that, especially in the case of those with mental defects, even if the child were normal the parent would not be fit to raise him or her.) In most countries, but especially in the United States and Germany, the rationale for sterilization (as opposed to segregation) was economic and social efficiency. It was argued that states (and countries) were spending millions of dollars a year to house defectives who "should never have been born."
The rational and efficient procedure would be to sterilize the adult defective and, where possible, return him or her to society, or at least, if the individual remained incarcerated, to ensure that reproduction was impossible. In Germany genetically defective or other nonproductive people were referred to as "useless eaters."
The ethical as well as legal dimensions of compulsory sterilization raised many questions both at the time and in subsequent years. The eugenical sterilization laws in countries other than Germany were aimed exclusively at institutionalized persons, including the mentally ill, paupers, criminals, and the mentally defective (in other words, those who were called "feebleminded" at the time). This group often represented the poorest and most vulnerable elements of society, those least able to defend themselves. All countries' sterilization laws specified some sort of due process by which the sterilization decision was made, allowed for a family member or someone else to appear on behalf of the patient, and had some provisions for appeal. In most cases that have been studied historically, however, it appears that the proceedings of such courts or due-process committees were often perfunctory. Particularly vulnerable were children in reformatories, who were often brought up for sterilization as they entered puberty. In some cases sterilization was a condition for being released; in others it was a condition for remaining in the institution (a particularly powerful threat for families unable or unwilling to deal with a retarded or recalcitrant child at home).
The ethical problems encountered with sterilization involve both biological and moral claims. On the biological side, the evidence that many of the traits for which people were sterilized (feeblemindedness, pauperism, criminality, sexual promiscuity, aggression) had a genetic basis was circumstantial at best, nonexistent at worst. Since the methods for determining human heredity could not effectively separate genetic from environmental effects, the claims that individuals who came from families with various defects would automatically "pass on" these traits were biologically unsound. The moral questions that arose revolved around whether sterilization was a "cruel and unusual punishment," whether tampering with reproductive capacity involved the state exceeding its prescribed powers, whether the state should decide who is "defective" or not, and whether having a child was a "right" (more or less the traditional view) or a "privilege" (the eugenical view). Religious groups, especially the Catholic Church, were among the strongest critics of eugenics, especially on the sterilization issue. In 1930 a papal encyclical, Casti Connubii, specifically targeted eugenical sterilization as violating Catholic doctrine. After Nuremberg, the issue became even clearer with regard to the state's right to determine who should and who should not be parents. The line has remained a fuzzy one, however, since it is generally accepted that the state has the right to regulate matters of public health: on the grounds that reproducing defective offspring constituted a public health hazard, it was argued that eugenic sterilization fell under a public health aegis. In setting out the majority opinion in the well-known 1927 Supreme Court case of Buck v. Bell, which tested the constitutionality of Virginia's eugenical sterilization law, Justice Oliver Wendell Holmes argued that compulsory sterilization of genetically defective individuals could be justified by the same public health principle that required compulsory vaccination. Eugenics stood for the right of the community to safeguard itself against certain wasteful expenditures over the automatic right of the individual to bring children into the world.
DNA, Genomics, and the New Ethical Dilemmas
In many respects the genetic technology that has grown up around the elucidation of the molecular structure of DNA since 1953 has raised even more ethical and moral questions than its predecessor in the classical Mendelian era. The rapid and exciting development of molecular genetics in the period from 1953 to 1970 provided the basis for understanding aspects of genetics at the molecular level that had only been imagined by prewar geneticists. Understanding how DNA replicates itself and how genes control cell function by coding for proteins that serve both structural and catalytic (enzymatic) roles, the nature and universality of the genetic code itself, and the way in which genes are controlled (turned on and off during development) all suggested that soon human beings would be able to engineer organisms in almost any conceivable direction. Indeed, the term genetic engineering was coined during the 1960s and early 1970s to express the new hope and excitement held forth by the understanding of molecular mechanisms of heredity.
As with the rapid advances in Mendelian and chromosome genetics in the 1920s and 1930s, so with molecular genetics: new genetic discoveries were being announced almost weekly. Although many theories of molecular genetics came and went, they were all subject to being tested and accepted or rejected. For example, initial claims about how transcription (forming messenger RNA from a DNA strand) and translation (using the messenger to synthesize a specific protein) work, based on prokaryotic (bacterial) systems, proved to differ in many respects from the same processes in eukaryotic cells (such as the cells of all higher organisms). Bacterial and viral chromosomes turned out to be organized quite differently from the chromosomes of the fruit fly or the human. The claim that "what is true for E. coli [a common bacterium used for molecular genetic research] is true for the elephant," as molecular biologist Jacques Monod put it, turned out to be not quite that simple. Yet there did appear to be an evolutionary unity among all forms of life on earth, one even more apparent at the molecular level than at the level of gross phenotype.
The application of the new genetics to practical concerns, both in agriculture and medicine, raised a number of social, political, and ethical issues, some of which overlapped with concerns from the classical era and some of which were quite new to the molecular era. At the agricultural level, one of the first great controversies to emerge concerned the technology for transferring genes from one organism to another. The common method for doing this was to use a bacterial or viral plasmid (a small chromosome-like element of DNA) as a "vector." An isolated segment of DNA from one type of organism could be inserted into the plasmid, which, because of its size, could be incorporated into another cell type and eventually integrated into the host cell's genome. This meant that the foreign, or transplant, DNA would subsequently be replicated every time the DNA of the host cell replicated. Characteristics such as insect, mold, and frost resistance could thus be genetically engineered by transferring DNA (genes) from a species that possessed one of these commercially valuable traits. The controversies arising from this technology reached significant proportions in the mid-1970s in Cambridge, Massachusetts, where much of the experimental work was being carried out by Harvard and Massachusetts Institute of Technology biologists. Fear that viral plasmids could "get loose" into the community through the massive use of the new technology sparked a series of public meetings and calls for a moratorium on all genetic engineering until safeguards could be assured. A meeting of many of the leading biotechnological researchers at Asilomar, California, in 1975 brought to the fore a discussion of the potential hazards of inserting genes from one kind of organism into the genome of another.
Although the meeting was hailed as one of the boldest exercises in social responsibility by scientists since World War II, interestingly enough, the most dangerous use of the new biotechnology, the creation of biological weapons, was never discussed. Later guidelines incorporated into all grants funded by the National Institutes of Health were based on some of the early decisions made by the molecular biologists themselves.
Especially in the agricultural realm, the issue of "genetically modified organisms" (GMOs) became a matter of global concern in the 1980s and 1990s. Although the use of viral and bacterial plasmids turned out not to pose as serious a threat as originally thought, critics of biotechnology argued that GMOs could have altered metabolic characteristics that might adversely affect the physiology of the consumer and the environment at large. One such case became a cause célèbre in 1999, when it was reported that corn genetically modified to carry a gene from the bacterium Bacillus thuringiensis (Bt), which made the corn insect resistant, was killing off monarch butterflies in localities in the United States where the corn was being planted. Indeed, as mega-corporations such as Monsanto turned aggressively to exploiting the GMO market, many countries, especially those in the European Union and Africa, began to place restrictions on, or even ban, the sale or importation of GMOs within their borders. The issue was less the effect on a specific species such as the monarch butterfly than the fact that the destruction of the monarch symbolized a major problem with GMOs: as a result of competitive pressure from rival companies, they were often rushed onto the market without thorough testing. Long-standing distrust of corporate agribusiness, where quick profits have been seen as taking precedence over human health and the quality of the environment, has fueled much of the negative response to GMOs worldwide.
Equally important has been the issue of using human subjects in genetic research. The problem of "informed consent," never something biologists routinely worried about prior to World War II (though some were scrupulous about informing their subjects about the nature of the research in which they were involved), became a central aspect of the ethics of all human-subject research protocols from the 1970s onward. All universities and hospitals engaged in any sort of human genetics (or other) research now have institutional review boards (IRBs) responsible for overseeing projects in which human subjects are involved. With regard to genetic information about individuals, the issue of consent is meant not only to ensure that individual subjects understand the nature of the research of which they are a part, and to ensure their safety, but also to place tight restrictions on who has access to the resulting information. A particular concern regarding genetic information about people in clinical studies is whether individual subjects could be identified from examining published or unpublished reports, notebooks, or other documents. Preserving anonymity has become a hallmark of all modern genetic research involving human subjects.
The question of the accessibility of genetic information has had ramifications in another aspect of medicine as well as in the design of research protocols. As testing for genes known to be related to specific human genetic diseases, such as sickle-cell anemia, Huntington's disease, or cystic fibrosis (CF), has become available to clinicians, two questions have loomed large, especially in the United States: the accuracy of the tests (that is, the incidence of false positives) and the question of who should have access to the information. Fears that genetic information might lead to job or health care discrimination have surfaced throughout the United States as genetic screening programs have become more technically feasible and thus more frequently employed. Perhaps the more general concern is the potential for insurance companies to obtain, or even require, genetic testing of adults as the basis for medical coverage, or, harking back to an almost eugenic view, to require testing of fetuses, with the threat of loss of coverage if a child with a known genetic defect is born. Medical insurance companies in the past have tried to classify genetic diseases as "prior conditions" that are thus exempt from coverage. Most of these attempts have not been carried through, but the threat is there and raises a host of legal as well as social and psychological concerns. As of 2004 a small number (seven) of states in the United States had passed legislation specifically prohibiting insurers from denying coverage to individuals based on genetic data.
Human Behavior Genetics
In the latter half of the twentieth century a field known as behavior genetics came to prominence, focusing largely on animal models (fruit flies, mice, honeybees, spiders). Specific behaviors, such as Drosophila mating dances, were observed to involve several different genetic components, a mutation in any one of which could alter the course and outcome of the mating response. Inevitably attempts were made to apply similar claims to complex human behavior, indeed, to many of the same behaviors that had been the subject of investigation by eugenicists a half century earlier. Starting in the late 1960s, claims about the genetic basis of traits such as IQ, alcoholism, schizophrenia, manic depression, criminality, violence, shyness, homosexuality, and more newly named traits such as attention deficit hyperactivity disorder (ADHD), obsessive-compulsive disorder (OCD), and "risk taking" became widespread in both clinical and popular literature. Many of these newer studies were carried out by psychologists and psychiatrists employing the more traditional methods of family, twin, or adoption studies, correlated with genetic markers, that is, marker regions of chromosomes or DNA. Many of these studies attracted controversy that cast doubt on much of the methodology on which current human behavior genetics is based.
Among the controversial studies were those published in 1969 by the Berkeley psychologist Arthur Jensen, claiming that IQ is 80 percent heritable, based on data collected over a half century by the British psychologist Cyril Burt (data later claimed to be spurious or even falsified); a study published in the late 1980s by the Minnesota psychologist Thomas Bouchard of identical twins raised apart, which claimed that traits such as liking John Wayne movies, having wives with the same names, or driving identical cars are genetic in origin (no similar results have been confirmed by other researchers); and a 1993 study by Dean Hamer that claimed to have found a genetic marker associated with homosexuality in thirty-three out of forty pairs of gay brothers but that could not be replicated by a separate laboratory using a different study population.
The same methodological problems that confronted eugenicists have confronted many of these current theories: difficulty in defining behaviors clearly; treating complex behaviors as if they were a single entity; the difficulty of separating familial and cultural inheritance from biological inheritance; the problematic use of statistics, especially "heritability"; and difficulty in replicating the results of one study using a different population. As in the eugenic period, critics of hereditarian studies have argued that despite the uncertainty of the conclusions, the widespread dissemination of the results as positive outcomes serves the social function of distracting attention from social and economic reforms that might go a long way toward altering the prevalence of certain "problem" behaviors. As Thomas Hunt Morgan (1866–1945), the first Nobel laureate in genetics (1933), stated in 1925: "In the past we could have bred for greater resistance to cholera or disease; but it was quicker and more satisfactory to clean up the environment." Many fear that modern claims for a genetic basis of many social problems will serve as a smokescreen, deflecting attention from "cleaning up the environment" by blaming societal problems on the "defective biology" of individuals.
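The "heritability" statistic at the center of these disputes can be illustrated with Falconer's classic twin-study formula, h² = 2(r_MZ − r_DZ), which estimates heritability from the gap between identical (MZ) and fraternal (DZ) twin correlations. The correlations below are hypothetical, and the formula rests on strong assumptions (notably that both kinds of twin pairs experience equally similar environments), which is precisely why critics treat such estimates as population statistics rather than verdicts on individuals.

```python
def falconer_h2(r_mz: float, r_dz: float) -> float:
    """Falconer's estimate of heritability from twin correlations:
    h2 = 2 * (r_MZ - r_DZ).  A population-level statistic, valid only
    under the equal-environments assumption for MZ and DZ pairs."""
    return 2.0 * (r_mz - r_dz)

# Hypothetical correlations for an IQ-like score (illustrative only).
h2 = falconer_h2(r_mz=0.80, r_dz=0.50)
print(round(h2, 2))  # → 0.6

# A population with a different environment yields a different estimate:
# heritability describes a population, not the trait itself.
h2_other = falconer_h2(r_mz=0.70, r_dz=0.55)
print(round(h2_other, 2))  # → 0.3
```

The second calculation makes the critics' point numerically: the same trait can show very different "heritability" in different populations, so a single figure such as Jensen's 80 percent cannot be read as a fixed biological fact.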
Cloning and Stem Cell Research
With increased research in molecular genetics per se, two related technologies have come to the fore as centers of ethical concern: cloning and research on embryonic stem cells. Cloning is the process of creating a new organism that is a carbon copy of an already existing organism. In one sense cloning is not a new technology, since bacterial and all other asexually reproducing organisms reproduce in clones (a bacterial colony is a clone, as are the tissue cultures that have been used for decades in biological research). What is new in the controversy about cloning in the twenty-first century is the prospect of producing a higher organism such as a mammal or human being from a single cell of an already existing adult organism. In 1997 Dolly the sheep became the most sensational and well-known example of a cloned organism. Dolly was produced by removing the nucleus from the egg cell of one female sheep and causing that enucleated cell to fuse with an adult cell (with its nucleus intact) from a different donor sheep. The new "hybrid" egg cell contained a full set of chromosomes from the donor and thus would develop into a genetic replica of the donor. While transplanting nuclei from one variety of animal to another had been accomplished with amphibians in the early 1950s, nothing of the sort had been accomplished with mammals until the mid-1990s. The advantage of cloning for agriculture is clear: the genetic composition of the offspring is completely predictable (which is not true in conventional breeding methods). The major biological question raised by Dolly was whether cloned organisms are as healthy and long-lived as ones produced by natural fertilization of an egg by a sperm. As it turned out, Dolly showed signs of early aging, perhaps because the donor cell came from a six-year-old ewe, already halfway through a sheep's natural life span.
The chromosomes of mammals gradually shorten at the ends (telomeres) with aging, a process that is not completely understood but which appears to have important implications for cloning from already aging adult cells.
The leap from Dolly to cloning humans and other mammals (businesses sprang up offering to clone family pets, for example) was quick in coming. Among the ethical problems raised here was the expectation that an organism with the same genetic composition would be a replica in every way of the adult from which the donor cell was taken. With humans or pets this meant, to many people, obtaining a new individual with the same personality and behavioral traits as the donor. Such expectations were based on a simplistic understanding of genetics, especially with regard to complex characteristics such as behavior and personality. One of the important lessons of modern genetic research has been that genes do not unfold automatically into an adult trait. Genes interact with other genes and with the environment to produce a range of variant outcomes, so that a genetic clone would no more behave exactly like its progenitor than would any two other organisms of the same species.
Cloning humans, of course, raises all sorts of other ethical issues, particularly those surrounding what has come to be known as fetal selection. What does it mean if parents want to control so completely all the physical and physiological, not to say psychological, characteristics of their children? Bioethicists raise the question of how far engineering human traits should be allowed to go. Would it be permissible to clone an embryo from a person who needs a kidney or liver transplant just to get an immunologically compatible organ? Is an embryo produced by cloning really a human being, since it has not been formed by union of egg and sperm? If the answer is yes, then should cloning and bringing the embryo to full term be allowed? Should we be able to clone a terminally ill child in order to provide a replacement? These are not simple questions, but as the technology becomes more certain and available (as it no doubt will) the social and ethical questions must be faced critically and squarely.
Stem cell research poses many of the same problems raised by cloning, but it has a more realistic and immediate medical application as well as some distinct ethical issues of its own. Stem cells are undifferentiated, embryo-like cells that are found in various tissues of the adult body. Among the earliest recognized and most prolific stem cells are those in bone marrow, but biologists have now found stem cells even in brain tissue, which had previously been regarded as incapable of regeneration. What has attracted so much attention about stem cells is that for the most part they have retained the ability to differentiate into a variety of other specialized cells. This is particularly true of embryonic stem cells, which are pluripotent (capable of differentiating into virtually all other body cell types). Research into how to culture and deliver stem cells to specific tissues in individuals suffering from particular diseases (for example, providing brain stem cells to a person with Alzheimer's disease or to someone who has suffered brain damage from a stroke) thus offers considerable potential for treating conditions now considered incurable. Creating embryos by cloning and growing them just long enough to harvest embryonic stem cells would provide the most ready source of such cells. But many ethicists and religious leaders claim that such embryos are truly human beings and that to grow them only for stem cells, like growing them only for organs and tissues, would amount to murder. Issues like this surfaced after biologists began using existing embryonic stem cell lines (derived from frozen embryos left over from fertility clinics) for research, with the result that U.S. President George W. Bush restricted federally funded research to already existing embryonic stem cell lines, effectively barring the creation of new embryos for the purpose of culturing stem cells (existing law had already prevented the use of tax dollars for research in which human embryos are destroyed).
The questions of how a human life is defined, at what point in the biological life cycle it becomes "human," and, however it is defined, how to form humane social and legal policies regarding early human embryos as research objects are all matters of considerable current disagreement. Biomedical researchers and many others think that early embryos (less than twelve weeks) should be available for research purposes, while many political conservatives and religious spokespeople oppose the use of any human embryo that has the capability of developing normally into a fetus. Many European countries, especially England, have adopted more liberal policies regarding embryonic stem cell research, with the consequence that some U.S. researchers have moved, or have contemplated moving, their laboratories abroad.
New genetic technologies, whether those associated with the classical genetics of the first half of the twentieth century or the molecular genetics and genomics of the second half, have always raised a wide variety of ethical issues within the larger society. Whether genetic knowledge is being used politically to place blame for social problems on "defective biology" or genetic engineering technologies are being used to produce "designer babies," geneticists have continually found themselves in the midst of highly controversial issues, ones that are often far more difficult and complex than those associated with other biomedical technologies. This may be in part a result of the long-standing, though mistaken, view that "genetics is destiny" and that knowing the genotype (genetic makeup) of an organism can lead to accurate predictions about its ultimate phenotype (that is, what actual traits will appear and in what form). But it is also in part due to Western society's optimistic faith that science and technology can provide answers to larger economic and social issues. This is an unrealistic view of the role that scientific and technological information can play in human life. There is no question that knowing the science involved in any given area of biomedicine (especially human genetics) is critical for making social and political decisions. But it is never enough. Even if scientists could predict with complete accuracy the exact clinical effects that would characterize a fetus with Down's syndrome or Huntington's disease, the decision about how to respond to that knowledge would involve social, political, economic, and philosophical considerations that lie outside the science itself. As much as anything else, consideration of the ethical and moral aspects of genetic technology should be a reminder that science itself is not, nor has it ever been, a magic bullet for the solution of social problems.
Nowhere has that been demonstrated more clearly than in the history of genetics in the twentieth century.
See also Behaviorism; Bioethics; Biology; Determinism; Eugenics; Genetics: History of; Health and Disease; Medicine: Europe and the United States; Nature.
Allen, Garland E. "The Ideology of Elimination: American and German Eugenics, 1900–1945." In Medicine and Medical Ethics in Nazi Germany: Origins, Practices, Legacies, edited by Francis R. Nicosia and Jonathan Huener, 13–39. New York: Berghahn Books, 2002.
Goodman, Alan H., Deborah Heath, and M. Susan Lindee, eds. Genetic Nature/Culture: Anthropology and Science beyond the Two-Culture Divide. Berkeley: University of California Press, 2003.
Kevles, Daniel J. In the Name of Eugenics. New York: Knopf, 1985.
Maienschein, Jane. Whose View of Life? Embryos, Cloning and Stem Cells. Cambridge, Mass.: Harvard University Press, 2003.
Paul, Diane B. Controlling Human Heredity, 1865 to the Present. Atlantic Highlands, N.J.: Humanities Press, 1995.
Weir, Robert F., Susan C. Lawrence, and Evan Fales, eds. Genes and Human Self-Knowledge: Historical and Philosophical Reflections on Modern Genetics. Iowa City: University of Iowa Press, 1994.
Garland E. Allen