Sociological Ethics


Sociology, or the scientific study of society, social institutions, and social relationships, is one of the most important social sciences and may include in its concerns anthropology, economics, history, political science, and psychology. As a field of study it is inherently intertwined with ethics. Because any society is dependent on common assumptions about what is acceptable and unacceptable behavior among its members, sociological analysis has to include descriptions of those ethical beliefs and practices. Indeed, the society constituted by sociologists may be defined by its internal ethical commitments. At the same time, insofar as sociologists do research in and on society, they produce knowledge about moral values and their social functions, and questions arise about the proper guidelines for their work, especially when that work may conflict in various ways with accepted social norms.

The Sociology of Ethics

Morals and values entered the picture early in the formation of sociology and influenced sociological thought and practice. A distinct concentration such as a "sociology of moral values" may not exist (Durkheim 1993, p. 14), but morality has played a central role in the prevailing concepts that have shaped the discipline. This concern is evident in the works of Karl Marx (1818–1883), Max Weber (1864–1920), and Emile Durkheim (1858–1917). These classical sociologists shared an interest in industrial capitalism and in how values and morals hold a society together, but they differed in their views of the function these elements serve and how they change over time.

Although Marx is credited with playing a key role in establishing the field, it is Weber who is often considered the father of sociology. Marx's challenging social criticism was replaced by Weber's value-neutral sociology, which nevertheless stressed, as in The Protestant Ethic and the Spirit of Capitalism (1904), the ethical foundations of social orders. Marx was intrigued by the interaction between science and society, whereas Weber examined social structure and focused more on the notion of value-free science. Weber believed people acted of their own accord and emphasized the importance of the individual rather than the role of society as a collective whole. He also held that people should not expect science to tell them how to live their lives.

Durkheim's theories are considered by some sociologists to be even more applicable today than they were when he formulated them (Turner 1993). His primary contributions to sociology were his accounts of social solidarity, social roles, and the division of labor. Morality and the connection between science and society also shaped Durkheim's work on professional ethics. Durkheim stressed the importance of moral education in everyday life and argued for its inclusion in the study of sociology. Although Marx, Weber, and Durkheim developed their theories in a different academic era, they continue to influence the field of sociology today.

Works by Weber and Durkheim were the precursors to those of Robert Merton, the first sociologist to win the National Medal of Science and the founder of the sociology of science. Merton focused on the functional analysis of social structures and discounted subjective dispositions such as motives and aims. He is best known for coining the terms "self-fulfilling prophecy" and "deviant behavior" and for introducing the focus group concept into research settings.


The Ethics of Sociology

The first attempt to promote international cooperation and professionalize the field can be seen in the formation of the Institut International de Sociologie by René Worms in 1893. In 1905 a number of well-known sociologists in the United States met to create an entity to promote the professionalization of the field. This organization, the American Sociological Society, later evolved into what is known today as the American Sociological Association (ASA). The ASA is now the largest organization of sociologists; its membership consists not only of students and faculty but also, for roughly 20 percent, of individuals from government, business, and nonprofit groups. In the spring of 1997 the ASA membership approved the current version of its Code of Ethics, which includes an introduction, a preamble, five general principles, and specific ethical standards, along with rules and procedures for handling and investigating complaints.

As time went on, more organizations, such as the International Sociological Association (ISA), were formed to support sociologists and advance knowledge of the field. Like the ASA, these entities have developed codes of ethics for their members to follow. The ISA, founded in 1949, drafted its own code of ethics, the current version of which was approved by its Executive Committee in the fall of 2001. Other groups, such as the North Central Sociological Association, have preferred to base their codes on the one outlined by the ASA.

New research opportunities often bring unforeseen ethical scenarios, many of which revolve around the sociologist's relationship with subjects. Dilemmas involving the applicability of informed consent, the use of deception, and the protection of privacy and confidentiality are common in social science research. A conflict between the desire to protect human subjects and the goal of obtaining data may not be easy to resolve even when guidelines are followed.

Research misconduct and authorship violations are also concerns facing social scientists. Abuses vary in severity and may encompass plagiarism, data fabrication, and falsification of data and results. The ethical dilemmas encountered in sociology are not unique, and as science and technology become further intertwined with society, these questions will grow even more complex.


Sociological Issues Related to Science and Technology

Problems that occurred during the 1960s and 1970s, such as the thalidomide drug tests (1962) and the Tuskegee syphilis study (1932–1972), emphasized the fallibility and injustices of scientific research and added momentum to appeals for more regulations and guidelines. Scientific investigations, especially those in biomedicine, often are considered high-risk and life-threatening, but the social sciences also have encountered less obvious but not necessarily less dangerous situations. One case discussed frequently in social science circles is Stanley Milgram's 1963 work on obedience to authority. Milgram found that a majority of the individuals participating in this series of studies were willing to administer what they believed to be harmful electrical shocks to their victims. Laud Humphreys's tearoom trade study (1970) also sparked controversy: Humphreys studied homosexual encounters in a St. Louis park restroom without revealing the true nature and intention of his research. Philip Zimbardo's Stanford prison experiment, reported in 1973, is another example of an infraction that raised red flags for those involved in protecting human subjects (Sieber 1982). Zimbardo's study, which ended early because of concerns about its effects on the subjects, used role playing to examine what happens when good people are placed in an environment that fosters evil.

Informed consent is a key component of human subjects research, but it can be controversial in disciplines such as sociology. Regulations require that in most cases informed consent be obtained before research can commence, yet consent often is seen as an unrealistic obstacle in the social sciences. Research conducted by social scientists often involves ethnographic methods, the collection of oral histories, and survey procedures, which do not readily lend themselves to the written informed consent process. Obtaining written consent may also be problematic for researchers working in settings where language and cultural differences pose a barrier, for example, where the individuals are illiterate or speak a different language. Some cultures consider signing a document taboo, or an act reserved for occasions such as the execution of legal documents. Evidence also indicates that subjects who sign consent forms, like those who participated in Milgram's study, do not always comprehend the full extent of the project (Mitchell 1993). Many social science initiatives involve individuals engaged in illegal activities, for whom anonymity is essential. In these situations the informed consent document may compromise confidentiality by being the only link to the subject.


Steps taken to protect the privacy of subjects and ensure the confidentiality of data may instill a false sense of security in the researcher and the subjects. A researcher may code identifiers, destroy data after project completion, use pseudonyms to mask identity, and avoid gathering personal information altogether in an attempt to provide protection. These measures are not infallible, however, and violations are evident in numerous cases. The use of thinly disguised pseudonyms provoked the "Springdale" controversy surrounding Small Town in Mass Society (Vidich and Bensman 2000), in which sociologist Arthur Vidich and anthropologist Joseph Bensman assigned the pseudonym "Springdale" to the upstate New York community where they studied small-town life. It did not take long for the community's true identity to be revealed, and Vidich and Bensman's research practices were called into question. Other infractions have involved the subpoena of data, as in the case of Rik Scarce, who spent 159 days in jail for refusing to release his field notes (Scarce 1995). Even with protections in place, a subject's privacy and confidentiality may be at risk.


All researchers wrestle with similar issues of research misconduct. A survey published in American Scientist (November–December 1993), which measured perceived rather than actual misconduct, examined some of those concerns. Doctoral candidates and faculty members in chemistry, civil engineering, microbiology, and sociology were asked about scientific misconduct, questionable research practices, and other types of wrongdoing. Among the conclusions drawn from the data were reports that scientific transgressions occurred "less frequently than other types of ethically wrong or questionable behavior by faculty and graduate students in the four disciplines" surveyed (Swazey, Anderson, and Lewis 1993, p. 552). Other observers, such as the media, chose to concentrate on practices that painted a dire picture of academic integrity.

Funding and sponsor involvement constitute other factors that can create serious ethical dilemmas for researchers. Departments such as sociology often struggle for financial support and rely heavily on government and corporate sponsorship. Project Camelot, a U.S. Army-sponsored study of the 1960s that some have regarded as "intellectual prostitution," was designed to "predict and influence politically significant aspects of social change in developing nations of the world, especially Latin America" (Homan 1991, p. 27). Critics such as Derek Bok, the former president of Harvard University and author of Universities in the Marketplace: The Commercialization of Higher Education (Princeton University Press, 2003), warn that the pressure on academia to attract industry involvement is a precarious undertaking that can lead to the "commercialization of higher education" (Lee 2003, p. A13). These relationships also may result in pressure on researchers to skew results to favor the sponsor. In the end, stiff competition for research funding and the pressure to attract industry involvement may compromise ethical and professional standards (Homan 1991).


Changes in Science and Technology That Affect Sociology

Regulations and guidelines based on a biomedical model have had a dramatic impact on sociology. After the atrocities of World War II, a series of codes was implemented to protect human subjects in research. Some of the most noted are the Nuremberg Code, the Declaration of Helsinki, and the 1971 guidelines published by the U.S. Department of Health, Education, and Welfare (DHEW).

The Nuremberg Code, a set of ten principles designed to protect human subjects in research, emerged from the 1947 ruling of the war crimes court against Nazi doctors who had conducted experiments on prisoners. The Declaration of Helsinki, approved by the Eighteenth World Medical Assembly in 1964, was designed to guide physicians in biomedical research involving human subjects. The continuation of ethical infractions prompted calls for additional regulations. Guidelines published in 1971 by the DHEW were one response to those demands and would prove to be the inspiration for the development of institutional review boards (IRBs) for federally funded research initiatives.


Another instrumental document resulted from the formation of the National Commission for the Protection of Human Subjects of Biomedical and Behavioral Research. Its Belmont Report elaborated on the ten points outlined in the Nuremberg Code and placed the emphasis on respect for persons, beneficence, and justice. The federal regulations were revised in 1981, and Title 45, Code of Federal Regulations, Part 46, became known as the Common Rule.


Professional codes of ethics are a relatively recent phenomenon. The codes that existed before World War II were found primarily in the major professions of that time, such as medicine and law. Most modern organizations have developed codes based on those in the sciences, but the codes used in the social sciences often lack the power to impose sanctions for noncompliance. Unlike the case in some professional associations, membership in an organization such as the American Sociological Association is not necessary for a person to be a sociologist or to conduct social science research. The lack of an enforcement mechanism for ethical violations also weakens the power of codes such as that of the ASA. Indeed, the perception that professional codes of ethics are merely symbolic has been cited as one reason for the government's decision to implement its own regulations (Dalglish 1976).


Contributions to Science, Technology, and Ethics Discussions by Sociology

A debate has been brewing among scientists and social scientists who submit research protocols for approval. The DHEW declared on July 12, 1974, that to obtain federal funding for a project involving human subjects in biomedical or behavioral research, an IRB had to be in place to review the project. Today IRBs apply one set of rules, based on a biomedical format, to all project submissions. Those requirements have proved inapplicable to numerous social science proposals and are next to impossible to carry out in some research settings. Sociologists and other social scientists have joined forces to form alliances, such as the Social and Behavioral Sciences Working Group, to improve the IRB process for social science researchers. In some cases, however, IRBs continue to interpret "the requirements of the Common Rule in a manner more appropriate to high risk biomedical research, ignoring the flexibility available to them in the Common Rule" (Sieber, Plattner, and Rubin 2002, p. 2).

Sociologists also have collaborated with researchers in science and technology on a number of ethics initiatives. Joint facilities and centers have helped those efforts by encouraging cross-disciplinary dialogue and research. The Hastings Center was founded in 1969 to "examine the different array of moral problems engendered by advances in the biomedical, behavioral, and social sciences" (Abbott 1983, p. 877). The Center for Applied Ethics at the University of Virginia, also founded in 1969, has worked on integrity issues that span various fields and subject matters. Another interdisciplinary effort is the Ethical, Legal and Social Implications (ELSI) Research Program. Founded in 1990, ELSI has focused on issues including informed consent, public and professional education, and discrimination, bringing together experts from diverse disciplines and organizing workshops and policy conferences to address them.


Education is imperative to promote academic integrity, and students in all disciplines should be instructed on matters that may have an adverse effect on their research. Acceptable academic behavior can be conveyed through formal methods such as workshops and symposia or through the use of informal techniques such as discussions with advisers, mentors, and classmates. Conversations that introduce possible solutions to the ethical predicaments encountered in research also can be beneficial. Teaching new researchers how to act in an ethical manner will help reduce the number of violations and will create research professionals dedicated to upholding the morals that are valued in society.

The Future

Ethical dilemmas will continue to plague researchers, whether they are in the sciences or the social sciences. A state of risk-free research is not foreseeable, and steps will continue to be taken to minimize the severity and frequency of these problems. Changes in the regulations will be felt most heavily in the biomedical and science fields, but the social sciences will not be spared from increased scrutiny. Some efforts may prove worthy and circumvent or minimize ethical quandaries, whereas others may violate personal rights and academic freedom in the process. Cooperation among disciplines is essential to communicate the importance of ethics and to create researchers who conduct their work with integrity. In the words of Johann Wolfgang von Goethe, "Knowing is not enough; we must apply. Willing is not enough; we must do."


SHARON STOERGER

SEE ALSO Codes of Ethics; Durkheim, Émile; Human Subjects Research; Informed Consent; Institutional Review Boards; Merton, Robert; Misconduct in Science: Social Science Cases; Privacy; Research Ethics; Sociobiology; Tuskegee Experiment; Weber, Max.

BIBLIOGRAPHY

Abbott, Andrew. (1983). "Professional Ethics." American Journal of Sociology 88(5): 855–885.

Dalglish, Thomas Killin. (1976). Protecting Human Subjects in Social and Behavioral Research: Ethics, Law, and the DHEW Rules: A Critique. Berkeley: Center for Research in Management Science, University of California, Berkeley.

Durkheim, Emile. (1993). Ethics and the Sociology of Morals. Translated by Robert T. Hall. Buffalo, NY: Prometheus Books.

Federman, Daniel; Kathi E. Hanna; and Laura Lyman Rodriguez, eds. (2002). Responsible Research: A Systems Approach to Protecting Research Participants. Washington, DC: National Academies Press.

Gouldner, Alvin. (1971). The Coming Crisis of Western Sociology. New York: Avon Books.

Homan, Roger. (1991). The Ethics of Social Research. New York: Longman.

Lee, Felicia R. (2003). "The Academic Industrial Complex." New York Times, September 6.

Mitchell, Richard G., Jr. (1993). Secrecy and Fieldwork. Newbury Park, CA: Sage.

Report and Recommendations of the National Commission for the Protection of Human Subjects of Biomedical and Behavioral Research. (1978). "The Belmont Report." Washington, DC: U.S. Government Printing Office.

Scaff, Lawrence A. (1984). "Weber before Weberian Sociology." British Journal of Sociology 35(2): 190–215.

Scarce, Rik. (1995). "Scholarly Ethics and Courtroom Antics: Where Researchers Stand in the Eyes of the Law." American Sociologist 26: 87–112.

Sieber, Joan E., ed. (1982). The Ethics of Social Research: Surveys and Experiments. New York: Springer-Verlag.

Sieber, Joan E.; Stuart Plattner; and Philip Rubin. (2002). "How (Not) to Regulate Social and Behavioral Research." Professional Ethics Report 15(2): 1–4.

Sieber, Joan E. (1992). Planning Ethically Responsible Research: A Guide for Students and Internal Review Boards. Newbury Park, CA: Sage.

Sieber, Joan E. (2001). Summary of Human Subjects Protection Issues Related to Large Sample Surveys. Washington, DC: U.S. Department of Justice, Bureau of Justice Statistics. Available at http://purl.access.gpo.gov/GPO/LPS16482.

Stanley, Barbara H.; Joan E. Sieber; and Gary B. Melton, eds. (1996). Research Ethics: A Psychological Approach. Lincoln: University of Nebraska Press.

Swazey, Judith P.; Melissa S. Anderson; and Karen Seashore Lewis. (1993). "Ethical Problems in Academic Research." American Scientist 81: 542–553.

Turner, Stephen P., ed. (1993). Emile Durkheim: Sociologist and Moralist. London: Routledge.

Vidich, Arthur J., and Joseph Bensman. (2000). Small Town in Mass Society: Class, Power and Religion in a Rural Community. Urbana: University of Illinois Press.
