
Genetic Research and Technology


The early twenty-first century is an era of genetics. Genetic science, genetic technologies, genetically based diseases, animal and human cloning, and genetically modified organisms are regular visitors to the news and entertainment culture. Together with the revolution in information technologies, and sometimes going hand in hand, the biotech revolution promises to transform the world. The well-known successes of molecular biology in the 1950s and 1960s have transformed biology and especially genetics. But because from the very beginning genetics has been intimately involved with human values, the revolutionary changes of this science and technology have challenged moral reflection.

A brief historical review of the development of genetics research will help place such challenges in context. For present purposes this history may conveniently be divided into three periods. The first, and longest, period was one of protogenetics, in which human values played a dominant role. The second period saw the emergence of genetics as a science and its revolutionary research successes. During this period, the science aspired to a complete independence of any specific moral interests that were not directly entailed by the pursuit of scientific knowledge itself. Finally, the third period, although still trying to promote an ideal of value neutrality, may be characterized as making some efforts to bridge science and ethics.

Protogenetics: From Premoderns to the Eighteenth Century

Humans have long interacted with plants and animals, seeking to improve human life through their manipulation. Thus, before there was a formal science of genetics, humans developed tacit or implicit knowledge of how to genetically alter plants and animals for human use. Human needs and values guided these manipulations and search for knowledge. Plants and animals were selectively bred for their usefulness, and microorganisms were used to make food items such as beverages, cheese, and bread.

Early farmers noted that they could improve each succeeding harvest by using seeds from only the best plants of the current crop. They noticed that plants that gave the highest yield, stayed the healthiest during periods of drought or disease, or were easiest to harvest tended to produce future generations with these same characteristics. Through several years of careful seed selection, farmers could maintain and strengthen such desirable traits.

The ancient Greeks also gave careful attention to the heredity of humans. The accounts given were largely speculative, and many aimed at the continuation of noble lineages. Plato (428–347 b.c.e.) in The Republic proposed strict laws governing human reproduction in order to perfect and preserve an ideal state. He presented what is known as the "noble myth," according to which rulers were fashioned from gold, those who would occupy the middle rung in the state were fashioned from silver, and the farmers and artisans were fashioned with bronze. Such an ideology would explain to people that differences between them were in their very nature and needed to be preserved by laws governing procreation.

The fourth century b.c.e. also brought the theory of pangenesis, according to which the reproductive material included atomic parts that originated in each part of the parental body. This theory was used to explain the transmission of traits from parents to children. Hippocrates (460–377 b.c.e.) held that the male contribution to a child's heredity is carried in the semen and argued that, because children exhibit traits from both parents, there must be a similar fluid in women. Aristotle (384–322 b.c.e.) rejected pangenesis, in part because traits often reappear after skipping generations, which the theory could not explain. He argued that an individual's development was determined by its internal nature, and that semen alone determined the baby's form; the mother merely provided the material from which the baby was made.

During Roman and medieval times in Europe, little was added to human understanding of reproduction and heredity. During the seventeenth century, a new conception of natural science began to develop, one focused on experimental design and empirical proof. The belief that the natural sciences were completely value free and, therefore, the best means to understand the natural world began to take root. In this context, the development of the natural sciences brought renewed attention to human reproduction and heredity. William Harvey (1578–1657) concluded that plants and animals alike reproduced sexually and defended the idea of epigenesis, according to which the organs of the body are formed and differentiated gradually during development. Opposing epigenesis, Marcello Malpighi (1628–1694) developed the idea of preformation, according to which new organisms are fully present and preformed within either the egg or the sperm. By the middle of the eighteenth century, however, the idea of preformation was called into question by a variety of scientists. Pierre-Louis Moreau de Maupertuis (1698–1759) rejected preformationism by appealing to observations about the blending of traits, and the development of a theory of the cell by Caspar Friedrich Wolff (1734–1794) further supported epigenesis.

The Rise of Modern Genetics: From Mendel to Watson and Crick

The late eighteenth century and the beginning of the nineteenth century in Europe saw the advent of vaccinations, crop rotation involving leguminous crops, and animal-drawn machinery. The growth of modern science and of scientific technologies further contributed to the idea that science should be pursued for its own sake.

MENDELIAN GENETICS. Throughout this period, a number of hypotheses were proposed to explain heredity. The one that would prove most successful was developed by the Austrian monk Gregor Johann Mendel (1822–1884). (The part of Austria where Mendel was born and lived is now located in the Czech Republic.) Through a variety of experiments, Mendel realized that certain traits showed up in offspring plants without any blending or mixing of the parents' characteristics: the traits were not intermediate between those of the two parents. This observation was important because it contested the leading theory in biology at the time. Most nineteenth-century scientists, including Charles Robert Darwin (1809–1882), believed that inherited traits blended from generation to generation.

Mendel used common garden pea plants for his research because they could be grown easily in large numbers and their reproduction easily manipulated. Pea plants have both male and female reproductive organs. As a result, they can either self-pollinate or cross-pollinate with another plant. In cross-pollinating plants that produce either yellow or green peas exclusively, Mendel found that the first offspring generation (F1) always had yellow peas. However, the following generation (F2) consistently had a 3:1 ratio of yellow to green (see Figure 1).

This 3:1 ratio occurred in subsequent generations as well. Mendel thus thought that this was the key to understanding the basic mechanisms of inheritance (See Figure 2). He came to four important conclusions from these experimental results:

  • that the inheritance of each trait was determined by "units" or "factors" that were passed on to descendants unchanged (now called "genes");
  • that an individual inherited one such unit from each parent for each trait (the principle of segregation);
  • that a trait might not show up in an individual, but could still be passed on to the next generation;
  • that the inheritance of one trait from a particular parent could be independent of inheriting other traits from that same parent (the principle of independent assortment).
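
The 3:1 ratio behind these conclusions can be reproduced by enumerating a Punnett square. The following Python sketch is illustrative only (the function name and allele notation are our own): it crosses two F1 hybrids, each carrying a dominant yellow allele (Y) and a recessive green allele (y).

```python
from itertools import product

def cross(parent1: str, parent2: str) -> list:
    """Enumerate the Punnett square for a single-gene cross.

    Each parent is a two-allele genotype, e.g. "Yy"; uppercase is the
    dominant allele (yellow peas), lowercase the recessive (green).
    """
    return ["".join(sorted(pair)) for pair in product(parent1, parent2)]

# F1 generation: a true-breeding yellow (YY) crossed with a green (yy)
f1 = cross("YY", "yy")          # every offspring is "Yy", hence yellow

# F2 generation: self-pollinating the Yy hybrids
f2 = cross("Yy", "Yy")          # ['YY', 'Yy', 'Yy', 'yy']
yellow = sum("Y" in g for g in f2)
green = len(f2) - yellow
print(f2, f"{yellow}:{green}")  # 3:1 ratio of yellow to green
```

Three of the four equally likely F2 genotypes contain at least one dominant Y and so show yellow peas, which is exactly the phenotypic ratio Mendel observed.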

Mendel's ideas were published in 1866. However, they remained unnoticed until 1900, when Hugo Marie de Vries (1848–1935), Erich von Tschermak-Seysenegg (1871–1962), and Karl Erich Correns (1864–1933) independently published research corroborating Mendel's mechanism of heredity.

POST-MENDEL DEVELOPMENTS. By the late 1800s, the invention of better microscopes allowed biologists to describe specific events of cell division and sexual reproduction. August Friedrich Leopold Weismann (1834–1914), who coined the term "germ-plasm," asserted that the male and female parent each contributed equally to the heredity of the offspring and that sexual reproduction generated new combinations of hereditary factors. He also argued that the chromosomes were the bearers of heredity. Edouard van Beneden (1846–1910) discovered that each species has a fixed number of chromosomes. He later discovered the formation of haploid cells during cell division of sperm and ova.

The publication of Darwin's The Origin of Species (1859), together with an incomplete understanding of human heredity, were used as grounds to support the idea of carefully controlling human reproduction to perfect the species. In 1883, Sir Francis Galton (1822–1911) coined the term eugenics to refer to the science of improving the human condition through "judicious matings." In the twentieth century, eugenics would be used to justify forced sterilization programs and immigration restrictions in the United States, and human experimentation in Nazi Germany.

After 1900, the pace of advance in genetic science and technology was rapid. During the first decade, William Bateson (1861–1926) coined the terms genetics, allelomorph (later allele), homozygote, and heterozygote. The cellular and chromosomal basis of heredity (cytogenetics) was identified by Theodor Heinrich Boveri (1862–1915) and others. And Sir Archibald Edward Garrod (1857–1936) developed the subspecialty of biochemical genetics by showing that certain human diseases were inborn errors of metabolism, inherited as Mendelian recessive characters.

During his investigations with the fruit fly Drosophila, Thomas Hunt Morgan (1866–1945) proposed that genes located on the same chromosome were linked together and could recombine by exchanging chromosome segments. Alfred Henry Sturtevant (1891–1970) drew the first genetic map, using cross-over frequencies between six sex-linked Drosophila genes to show their relative locations on the X chromosome. And in 1931, Harriet Creighton (1910–2004) and Barbara McClintock (1902–1992), and Curt Stern (1902–1981) working independently, found in cells under the microscope the first direct proof of crossing-over.

THE DISCOVERY OF DNA. In the 1940s, Oswald Theodore Avery (1877–1955), Colin Munro MacLeod (1909–1972), and Maclyn McCarty (1911–2005) offered evidence that DNA was the hereditary material. The challenge then was to determine the structure of this molecule. In 1953, James D. Watson (b. 1928) and Francis Crick (1916–2004) published in Nature the three-dimensional molecular structure of DNA, presenting what would be a breakthrough discovery in the biological sciences. They relied on the methods of Linus Pauling (1901–1994) for finding the helical structure in a complex protein and on unpublished x-ray crystallographic data obtained largely by Rosalind Elsie Franklin (1920–1958) and also by Maurice Wilkins (1916–2004). Watson and Crick determined that the DNA molecule was a double helix with phosphate backbones on the outside and the bases on the inside. They also determined that the strands were antiparallel and that there was a specific base pairing, adenine (A) with thymine (T), and guanine (G) with cytosine (C) (see Figure 3).
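
The complementarity and antiparallel orientation that Watson and Crick described can be captured in a few lines of code. In this Python sketch (the function name is a hypothetical of ours), each base is swapped for its partner and the strand is then reversed, because the complementary strand runs in the opposite direction:

```python
# Watson–Crick base pairing: A pairs with T, G pairs with C.
PAIRS = str.maketrans("ATGC", "TACG")

def reverse_complement(strand: str) -> str:
    """Return the antiparallel complementary strand, read 5' to 3'."""
    return strand.translate(PAIRS)[::-1]

print(reverse_complement("ATGCCT"))  # AGGCAT
```

Note that some sequences, such as GAATTC, are their own reverse complement; such palindromic sites turn out to matter for the restriction enzymes discussed below.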

It is difficult to overstate the importance of the discovery of the structure of the DNA molecule. It has not only revolutionized the field of biology, but has become a cultural icon. The metaphor of DNA as the "blueprint" of life has become ingrained in much talk about human traits, diseases, and development. And with it the ideology of genetic determinism, the idea that genes alone determine human traits and behaviors, has gained strength, even though practically all geneticists disavow it. Indeed, psychologist Susan Oyama has argued that genetic determinism is inherent in the way that what genes do is represented, because genes have been given a privileged causal status. It has become ever more difficult to describe and think about DNA in any way other than through this problematic representation of its power.

The Challenge of Genetic Knowledge and Power

The Watson-Crick model of DNA resulted in remarkable theoretical and technological achievements during the next decades. The genetic code was deciphered, the cellular components as well as the biochemical pathways involved in DNA replication, translation, and protein synthesis were carefully described, and the enzymes responsible for catalyzing these processes were isolated.

DNA RESEARCH. A striking result of these theoretical advances was the newfound ability to use a variety of techniques to control and manipulate DNA. The discovery of restriction enzymes was one of the most important steps in this direction. These enzymes are bacterial proteins that recognize and cleave specific DNA sequences; they function as a kind of immune system, protecting the cell from invading foreign DNA by serving as chemical scissors. The capacity to cut DNA into distinct fragments was a revolutionary advance: for the first time, scientists could segment the DNA that composed a genome into fragments small enough to handle. (Human chromosomes range in size from 50 million to 250 million base pairs, and thus are very difficult to work with whole.) Additionally, methods for synthesizing DNA and for using messenger RNA to make DNA copies provided reliable means for obtaining DNA.
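
As an illustration of how these "chemical scissors" behave, the following Python sketch models a digest on a single strand. The recognition site GAATTC and the cut position after the first base (G^AATTC) are those of the real enzyme EcoRI; the function itself is a hypothetical simplification that ignores the second strand and sticky ends.

```python
def digest(sequence: str, site: str = "GAATTC", cut_offset: int = 1) -> list:
    """Cut a DNA sequence at every occurrence of a recognition site.

    Defaults model EcoRI, which recognizes GAATTC and cleaves between
    the G and the first A. Only one strand is modeled here.
    """
    fragments, start = [], 0
    pos = sequence.find(site)
    while pos != -1:
        fragments.append(sequence[start:pos + cut_offset])  # up to the cut
        start = pos + cut_offset
        pos = sequence.find(site, pos + 1)
    fragments.append(sequence[start:])                      # trailing piece
    return fragments

print(digest("AAGAATTCGGGAATTCTT"))  # ['AAG', 'AATTCGGG', 'AATTCTT']
```

A sequence with no recognition site comes back as a single uncut fragment, which is why the number and sizes of fragments produced by a given enzyme can serve as a fingerprint of the DNA being analyzed.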

Moreover, scientists now had the opportunity to separate an organism's genes, remove its DNA, rearrange the cut pieces, or add sections from other parts of the DNA or from other organisms. The use of plasmids, extra-chromosomal genetic elements found in a variety of bacterial species, and of bacterial viruses as vectors or vehicles to introduce foreign DNA into living cells served as a major tool in genetic engineering. Once introduced into the nucleus, the foreign DNA is inserted, usually at a random site, into the organism's chromosomes by intracellular enzymes. On rare occasions, however, a foreign DNA molecule carrying a mutated gene is able to replace one of the two copies of the organism's normal gene. These rare events can be used to alter or inactivate genes of interest. This process can be done with stem cells, which will eventually give rise to a new organism with a defective or missing gene, or with somatic cells in order to compensate for a non-functioning gene.

No less important for the ability to understand and manipulate genetic material were the development of techniques to sequence DNA, the establishment of the methodology for gene cloning, and the development of the polymerase chain reaction (PCR). With these techniques it was possible to obtain and analyze unlimited amounts of DNA and RNA within a short period of time. Additionally, PCR would prove an invaluable method to identify mutations associated with genetic disease, to detect the presence of unwanted genetic material (for example in cases of bacterial or viral infection), and to use in forensic science.
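
The power of PCR comes from exponential amplification: in the ideal case each thermal cycle doubles every template, so n cycles yield 2^n copies per starting molecule. The sketch below is a minimal illustration of that arithmetic only (the function and its efficiency parameter are our own assumptions, not a model of real reaction kinetics, which plateau as reagents are exhausted).

```python
def pcr_copies(templates: int, cycles: int, efficiency: float = 1.0) -> int:
    """Ideal PCR yield: each cycle multiplies the copy count by
    (1 + efficiency), where efficiency is the fraction of strands
    successfully copied per cycle (1.0 = perfect doubling)."""
    copies = float(templates)
    for _ in range(cycles):
        copies *= 1 + efficiency
    return int(copies)

# A single DNA molecule after 30 perfect cycles:
print(pcr_copies(1, 30))  # 1073741824, roughly a billion copies
```

This is why a trace amount of DNA, from a drop of blood at a crime scene or a few viral particles in a patient sample, can be amplified into enough material to sequence or test within hours.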

Researchers working on organisms such as worms developed technologies that allow mapping of their genomes. These mapping techniques permitted the location of the positions of known landmarks throughout the organism's chromosomes. Furthermore, as these molecular techniques improved, their application to cancer studies became more and more common, leading to the discovery of viruses that were able to transform normal cells into cancer cells, the description of oncogenes, cancer suppressor genes, and a variety of other molecules and biochemical pathways involved in the development of cancer.

HUMAN GENOME PROJECT. This new venture traces its origins back to Los Alamos National Laboratory and the Manhattan Project. After the atomic bomb was developed and used on Hiroshima and Nagasaki, the U.S. Congress charged the Atomic Energy Commission and the Energy Research and Development Administration, the predecessors of the U.S. Department of Energy (DOE), with studying and evaluating genome damage and repair as well as the consequences of genetic mutations. There was a special interest in focusing the research on genetic damage caused by radiation and by the chemical by-products of energy production. From this research developed the plan to analyze the entire human genome.

The automation of DNA sequencing in the 1980s brought to the forefront of the scientific community the possibility of not just mapping the human genome, but also sequencing it. Thus, while gene mapping allowed researchers to determine the relative position of genes on a DNA molecule and the distance between them, sequencing let them identify one by one the order of bases along each chromosome.

It was in this context that discussions began about launching a human genome program. During a series of informal meetings, researchers and government officials attempted to assess the feasibility of different aspects of a project to map and sequence the entire three billion bases of the human genome. Although the majority of scientific opinion by the end of the 1980s was that sequencing the entire human genome was feasible, not all researchers were persuaded that such a project was a good idea. Many of them saw it as a massive work in data gathering rather than important research. Many scientists were also worried that the potential huge costs of such a project would diminish the funds dedicated to basic biological research.

In spite of the concerns, in 1990 the Human Genome Project (HGP) was formally launched as a fifteen-year plan coordinated by the U.S. Department of Energy and the National Institutes of Health. James Watson had been asked to head the project and did so until 1992, when he resigned because of his opposition to the patenting of human gene sequences. Francis Collins, who as of 2005 was still the director of the National Human Genome Research Institute (NHGRI), replaced him in 1993. The main goals of the project were to identify all the genes in human DNA and to determine the sequence of its three billion chemical base pairs. Other important objectives of this international project were to improve existing tools for data analysis and to store the information obtained about the genome in databases.

The main focus was the human genome. However, important resources were also devoted to sequencing the entire genomes of other organisms, often called "model organisms" and used extensively in biological research, such as mice, fruit flies, and flatworms. The idea was that such efforts would be mutually supportive because most organisms have many similar genes with like functions. Hence, the identification of the sequence or function of a gene in a model organism had the potential to explain a homologous gene in human beings, or in one of the other model organisms.

The International Human Genome Sequencing Consortium published the first draft of the human genome in the journal Nature in February 2001, with about 90 percent of the sequence of the entire genome's three billion base pairs completed. Simultaneously the journal Science published the human sequence generated by Celera Genomics Corporation headed by Craig Venter.

Although the original expected completion date for the project was 2005, in April 2003, coinciding with the fiftieth anniversary of the discovery of the DNA double helix, the full sequence was published in special issues of Nature and Science. The early completion of the program was the result of strong competition between the public program and the private one directed by Venter. His announcement in 1998 that his company would be able to sequence the entire human genome in just three years forced the leaders of the public program to increase their pace so as not to be left behind. The involvement of private capital in a project of this magnitude was a major turning point in science policy because it called into question the common belief, held since World War II, that only the federal government had sufficient resources to fund "big science."

In December 2003, the NHGRI announced the formation of its Social and Behavioral Research Branch. This new branch has as its purpose developing approaches to translating discoveries from the completed human genome into interventions for health promotion and disease prevention. The launching of this branch is evidence of the NHGRI's shift from genome sequencing to behavioral genetics.

ETHICAL, LEGAL, AND SOCIAL ISSUES. Because of the well-known abuses of eugenics during the beginning decades of the twentieth century in the United States and then in Nazi Germany, there was an unprecedented decision to attend to the possible consequences of the research into the human genome. Thus a significant goal of the HGP was to support research on the ethical, legal, and social issues (ELSI) that might arise from the project. Funds were dedicated to the examination of issues raised by the integration of genetic technologies and information into health care and public health activities and to explore the interaction of genetic knowledge with a variety of philosophical, theological, and ethical perspectives. Similarly, part of the ELSI budget was dedicated to supporting research exploring how racial, ethnic, and socioeconomic factors affect the use, understanding, and interpretation of genetic information; the use of genetic services; and the development of public policy.

Of course, the HGP, and the scientific and technological advances that permitted it, are extremely significant because of the theoretical knowledge produced on, for example, how genes work and what their contribution to health and disease is. It is difficult, however, to clearly separate theory and practice in molecular genetics, given that this science is very technique intensive. In any case, the research supported by the HGP is also noteworthy because it has grounded the development of a variety of now-common biotechnologies. Hence, genetic tests and screening for several human diseases such as Tay-Sachs disease, sickle cell anemia, Huntington's disease, and breast cancer are now part of medical practice. Agricultural products such as corn plants genetically modified to produce selective insecticides, or tomatoes engineered to prevent expression of a protein involved in ripening, are common in food markets. Animal cloning does not make the front page anymore. Gene therapy and pharmacogenetics are more and more often presented as the new medical miracles. And, of course, discussions of genetic enhancement and the hopeful, or frightening, possibility of designer babies are regular features of the news and entertainment media.

Given the increased presence of biotechnologies in people's lives and the significance of the genetic sciences, it is not surprising then that both the so-called theoretical research on human genetics and the practical applications of such knowledge have raised heated debates about ethical, legal, and social implications. Consider, for example, the following issues that have emerged in discussions of medical and agricultural biotechnologies.

GENETIC INFORMATION. The increasing use of genetic knowledge and genetics technologies in medical practice has been a subject of concern, though to different degrees, for both those who support such use and those who are skeptical of its benefits. One of the topics that has attracted the most attention among bioethicists working on ELSI issues is related to the availability and possible abuse of genetic information. Hence, the availability of genetic information has opened discussions about privacy and confidentiality. Questions have arisen about whether medical practitioners have an obligation to inform the family members of a patient with a genetic disease, or whether such information should be available to insurers and employers, for example. The concern for the possibility of genetic discrimination has been such that many states have proposed and passed legislation prohibiting insurers from discriminating on genetic grounds. Similarly, given past experiences with eugenics, there are good reasons to have some concern about the possible stigmatization of individuals due to their genetic makeup.

GENETIC DIAGNOSIS AND HUMAN RESPONSIBILITY. The use of genetic diagnosis for a variety of medical conditions has received no less attention. Concern about fair access to these technologies, the reliability and usefulness of the tests, the training of health care professionals, the psychological effects they might have on people, and the consequences for family relationships are common. Similarly, many of the tests being developed, and some of the ones already in use, point to genetic susceptibilities or test for complex conditions that are linked to multiple genes and gene-environment interactions. Thus, such tests provide information not of a present or even a future disease, but of an increased risk of suffering such a disease. In many cases, these tests reveal possibilities of disorders, such as Huntington's disease, for which no available treatments exist. Given these issues, concerns about regulation of these tests, whether they should be performed at all, or whether parents have a right or an obligation to test their children for late-onset diseases are certainly justifiable. Moreover, the use of genetic diagnosis techniques in reproductive decision-making can also have serious implications for reproductive rights, our view of human beings, the expectations people might impose on their offspring, and the way we might treat people with disabilities.

The emphasis on people's genetic makeup might also have implications for their ideas of human responsibility, views regarding control of behavior and health status, their notions of health and disease, and their conceptions of treating a disorder or enhancing a trait. Such emphasis also has consequences for the kind of public policies people support regarding education, health promotion, disease prevention, and environmental regulations.

AGRICULTURAL BIODIVERSITY. Discussions about agricultural biotechnologies focus not just on the effects these technologies might have on human beings, but also on the consequences for animals and the natural environment. Genetic recombination techniques are used to create genetically modified organisms (GMOs) and products. These technologies enable the alteration of the genetic makeup of living organisms such as animals, plants, or bacteria, by modifying some of their own genes or by introducing genes from other organisms. GM crops, for example, are now grown commercially or in field trials in more than forty countries and on six continents. Some of these crops, including soybeans, corn, cotton, and canola, are genetically engineered to be herbicide-tolerant and insect-resistant. Other crops grown commercially or field-tested include a sweet potato resistant to a virus that could decimate most of the African harvest, rice with increased iron and vitamins, and a variety of plants able to survive weather extremes. Research is being conducted to create bananas that produce human vaccines against infectious diseases such as hepatitis, fish that mature more quickly, fruit and nut trees that yield years earlier, and plants that produce new plastics with distinctive properties. It is unclear at this point how many of these research lines will be successful.

Questions about whether genetically modified organisms and products are safe for humans, whether they might produce allergens or transfer antibiotic resistance, whether they are safe for the environment, whether there might be an unintended transfer of transgenes through cross-pollination, and whether they might have unknown effects on other organisms or result in the loss of floral and faunal biodiversity are at the forefront of these debates. But the use of these technologies has also raised concern about possible implications for people's conceptions of other animals and the environment, their views of agricultural production, and their relationships with natural objects. Thus, many have wondered whether the use of these techniques constitutes a violation of natural organisms' intrinsic value, whether humans are unjustifiably tampering with nature by mixing genes among species, or whether the use of animals exclusively for human purposes is immoral. Debates have also been sparked about access to these technologies and the effects such access might have on non-industrialized countries. Some have questioned whether the domination of world food production by a few companies might be putting food production at risk and making poor farmers in poor countries increasingly dependent on industrialized nations. Issues about the commercialization of these products through the use of patents, copyrights, and trade secrets are also relevant when analyzing the implications of these technologies. Thus, many have called attention to the accessibility of data and materials.


It is important to point out that although the ELSI program of the HGP has certainly had a significant effect on the understanding and evaluation of the consequences of new genetic technologies, the prevalent idea that humans must attend exclusively to the consequences of scientific or technological advances might itself be a reason for concern. A focus on consequences reinforces the incorrect view that science and technology are value-neutral. Issues about scientific or technological advances are thus framed as questions related to the implementation of scientific knowledge or technological practices. Hence, under the presumption that such practices are not the problem, but that the use people make of them might be, an evaluation of the scientific practices themselves appears illegitimate. This prevents researchers from analyzing the values that might underlie the current focus on genes, or from proposing different value assumptions to guide scientific research. Moreover, the emphasis on consequences directs attention to the analysis of means and away from an evaluation of ends. Scientists are thus encouraged to evaluate whether a particular technology is well suited to solving certain problems, but not to analyze the goals for which the technology was developed in the first place. Technical discussions of biotechnology that focus on impacts presuppose that these goals are unquestionable. Assessments of new technologies therefore require not only discussions of risks and benefits (that is, of means) but also reflection about ends. Of course, these issues apply to a variety of bioethical problems and not just to ELSI work.


SEE ALSO Bioethics; Biotech Ethics; Fetal Research; Gene Therapy; Genethics; Genetic Counseling; Health and Disease; Human Genome Organization; In Vitro Fertilization and Genetic Screening; Medical Ethics; Playing God; Privacy.


Buchanan, Allen; Dan W. Brock; Norman Daniels; and Daniel Wikler. (2001). From Chance to Choice: Genetics and Justice. Cambridge, UK: Cambridge University Press.

Davies, Kevin. (2001). Cracking the Genome: Inside the Race to Unlock Human DNA. Baltimore: Johns Hopkins University Press.

Dunn, Leslie C. (1965). A Short History of Genetics: The Development of Some of the Main Lines of Thought: 1864–1939. New York: McGraw-Hill.

Kass, Leon. (2002). Life, Liberty, and the Defense of Dignity: The Challenges for Bioethics. San Francisco: Encounter Books.

Kevles, Daniel J., and Leroy Hood. (1992). The Code of Codes: Scientific and Social Issues in the Human Genome Project. Cambridge, MA: Harvard University Press.

Keller, Evelyn Fox. (2000). The Century of the Gene. Cambridge, MA: Harvard University Press.

Kristol, William, and Eric Cohen, eds. (2002). The Future Is Now: America Confronts the New Genetics. New York: Rowman and Littlefield.

Lee, Keekok. (2003). Philosophy and Revolutions in Genetics: Deep Science and Deep Technology. New York: Palgrave Macmillan.

Mahowald, Mary Briody. (2000). Genes, Women, Equality. New York: Oxford University Press.

Mayr, Ernst. (1982). The Growth of Biological Thought: Diversity, Evolution, and Inheritance. Cambridge, MA: Belknap Press.

Nelkin, Dorothy, and Susan Lindee. (2004). The DNA Mystique: The Gene as Cultural Icon, 2nd edition. Ann Arbor: University of Michigan Press.

Oyama, Susan. (1985). Ontogeny of Information: Developmental Systems and Evolution. Cambridge, UK: Cambridge University Press.

Sherlock, Richard, and John D. Morrey, eds. (2002). Ethical Issues in Biotechnology. Lanham, MD: Rowman and Littlefield.

Stubbe, Hans. (1972). History of Genetics: From Prehistoric Times to the Rediscovery of Mendel's Laws, trans. Trevor R. W. Waters. Cambridge, MA: MIT Press.

Tudge, Colin. (2000). The Impact of the Gene: From Mendel's Peas to Designer Babies. New York: Hill and Wang.

Watson, James, and Andrew Berry. (2003). DNA: The Secret of Life. New York: Knopf.

Watson, James, and Francis Crick. (1953). "A Structure for Deoxyribose Nucleic Acid." Nature 171: 737–38.

Watson, James; Michael Gilman; Jan Witkowski; and Mark Zoller. (1992). Recombinant DNA, 2nd edition. New York: W.H. Freeman.

Wright, Susan. (1986). "Recombinant DNA Technology and Its Social Transformation 1972–1982." Osiris 2: 303–60.


U.S. National Human Genome Research Institute, National Institutes of Health. Available from

U.S. Department of Energy, Office of Science. "Human Genome Project Information." Available from

"Genetic Research and Technology." Encyclopedia of Science, Technology, and Ethics. Retrieved September 24, 2018.
