Medical Education


Pain, suffering, and premature death from disease have ravaged human beings from the beginning of recorded time. This harsh fact was as true for colonial America, where life expectancy as late as 1800 was only twenty-five years, as for every other known society. Yet the aspiration for health and the desire to explain the mysteries of disease have also characterized all known human societies. Thus, inhabitants of colonial America sought medical care from a variety of practitioners of folk, herbal, and Indian medicine. In addition, some members of the clergy, such as the Protestant New England clergyman Cotton Mather (1663–1728), incorporated the healing art in their services to the colonists.

In the eighteenth century, practitioners of "regular," or "allopathic," medicine began to become more commonplace. A small number of elite practitioners obtained medical degrees, primarily by studying medicine in Edinburgh, Leiden, or London. This mode of study was not economically feasible for more than a few. Of necessity, the apprenticeship tradition became the dominant system of medical training, with the typical preceptorial period lasting three years. Apprentice physicians would study medicine with a practicing physician, who would allow the apprentice the opportunity to participate in his practice in exchange for a fee and the performance of various menial chores.

In the early nineteenth century, the "proprietary" medical school became the dominant vehicle of medical instruction in America. In 1800, only four medical schools existed: the University of Pennsylvania (founded in 1765), King's College (1767), Harvard (1782), and Dartmouth (1797). Between 1810 and 1840, twenty-six new schools were established, and between 1840 and 1876, forty-seven more. In the late nineteenth century, dozens of additional schools sprouted. Originally, these schools were intended to be a supplement to the apprenticeship system. However, because they could more readily provide systematic teaching, by the middle of the nineteenth century they had superseded the apprenticeship as the principal pathway of medical education.

Though the first schools were created with lofty ambitions, the quality of instruction at the proprietary schools rapidly deteriorated, even by the standards of the day. Entrance requirements were nonexistent other than the ability to pay the fees. Disciplinary problems arising from outrageous student behavior were commonplace. The standard course of instruction in the mid-nineteenth century consisted of two four-month terms of lectures during the winter, with the second term identical to the first. The curriculum generally consisted of seven courses: anatomy; physiology and pathology; materia medica, therapeutics, and pharmacy; chemistry and medical jurisprudence; theory and practice of medicine; principles and practice of surgery; and obstetrics and the diseases of women and children. Instruction was wholly didactic: seven or eight hours of lectures a day, supplemented by textbook reading. Laboratory work was sparse, and even in the clinical subjects, no opportunity to work with patients was provided. Examinations were brief and superficial; virtually the only requirement for graduation was the ability to pay the fees. Students who wished for a rigorous medical education had to supplement what they learned in medical school in other ways, such as through enrollment at non-degree-granting extramural private schools, study in Europe, or work in hospitals as "house pupils."

The mid-nineteenth-century proprietary schools, such as Bennett Medical College and Jenner Medical College in Chicago, were independent institutions. University or hospital affiliations, in the few cases in which they existed, were nominal. The faculties were small, typically consisting of six or eight professors. The professors owned the schools and operated them for profit. A commercial spirit thus pervaded the schools, for the faculty shared the spoils of what was left of student fees after expenses. The mark of a good medical school, like that of any business, was considered its profitability. Since an amphitheater was virtually the only requirement to operate a medical school, physical facilities were meager. The second floor above the corner drugstore would suffice; a school that had a building of its own was considered amply endowed.

The Creation of the Modern Medical School

While American medical education was floundering in the mid-1800s, the reform of the system was already beginning. At the root of the transformation was a series of underlying events: the revolution in experimental medicine that was proceeding in Europe; the existence of a cadre of American doctors traveling to Europe (particularly Germany) to learn laboratory methods; the emergence of the modern university in America; the development of a system of mass public education to provide qualified students for the university; and the cultivation of a habit of philanthropy among some very rich industrialists. Together, these developments provided the infrastructure for a new system of medical education soon to appear.

The creation of America's current system of medical education occurred in two overlapping stages. In the first stage, which began in the middle of the nineteenth century, a revolution in ideas occurred concerning the purpose and methods of medical education. After the Civil War, medical educators began rejecting traditional notions that medical education should inculcate facts through rote memorization. Rather, the new objective of medical education was to produce problem-solvers and critical thinkers who knew how to find out and evaluate information for themselves. To do so, medical educators deemphasized the traditional didactic teaching methods of lectures and textbooks and began speaking of the importance of self-education and learning by doing. Through laboratory work and clinical clerkships, students were to be active participants in their learning, not passive observers as before. A generation before John Dewey, medical educators were espousing the ideas of what later came to be called "progressive education."

At the same time, a revolution occurred in the institutional mission of medical schools. The view emerged that the modern medical school should not only engage in the highest level of teaching but also should be committed to the discovery of new knowledge through research. This meant that medical schools could no longer remain freestanding institutions. Rather, they had to become integral parts of universities and hire scientifically trained, full-time faculty who, like all university professors, were researchers as well as teachers.

In the early 1870s, the first lasting reforms occurred, as Harvard, Pennsylvania, and Michigan extended their course of study to three years, added new scientific subjects to the curriculum, required laboratory work of each student, and began hiring full-time medical scientists to the faculty. In the late 1870s, the plans for the new Johns Hopkins Medical School were announced, though for financial reasons the opening was delayed until 1893. When the school finally did open, it immediately became the model by which all other medical schools were measured, much as the Johns Hopkins University in 1876 had become the model for the modern American research university. A college degree was required for admission, a four-year curriculum with nine-month terms was adopted, classes were small, students were frequently examined, the laboratory and clinical clerkship were the primary teaching devices, and a brilliant full-time faculty made medical research as well as medical education part of its mission. In the 1880s and 1890s, schools across the country started to emulate the pioneering schools, and a campaign to reform American medical education began. By the turn of the century, the university medical school had become the acknowledged ideal, and proprietary schools were already closing for lack of students.

Nevertheless, ideas alone were insufficient to create the modern medical school. The new teaching methods were extremely costly to implement, and hospitals had to be persuaded to join medical schools in the work of medical education. Thus, an institutional as well as an intellectual revolution was needed. Between 1885 and 1925 this revolution occurred. Large sums of money were raised, new laboratories were constructed, an army of full-time faculty was assembled, and clinical facilities were acquired. Medical schools, which had existed autonomously during the proprietary era, became closely affiliated with universities and teaching hospitals.

No individual contributed more dramatically to the institution-building process than Abraham Flexner (1866–1959), an educator from Louisville who had joined the staff of the Carnegie Foundation for the Advancement of Teaching. In 1910, Flexner published a muckraking report, Medical Education in the United States and Canada. In this book, he described the ideal conditions of medical education, as exemplified by the Johns Hopkins Medical School, and the deficient conditions that still existed at most medical schools. Flexner made no intellectual contribution to the discussion of how physicians should be taught, for he adopted the ideas that had developed within the medical faculties during the 1870s and 1880s. However, his report made the reform of medical education a cause célèbre, transforming what previously had been a private matter within the profession into a broad social movement similar to other reform movements in Progressive Era America. The public responded by opening its pocketbook: state and municipal taxes were used to fund medical education, and private philanthropists (George Eastman and Robert Brookings among them) and philanthropic organizations contributed significant sums to its support. In the two decades that followed, the public provided the money and clinical facilities that had long eluded medical schools. In addition, an outraged public, scandalized by Flexner's acerbic depiction of the proprietary schools still in existence, brought a sudden end to the proprietary era through the enactment of state licensing laws, which mandated that medical schools operated for profit would not be accredited.

Graduate Medical Education

Through World War I, medical education focused almost exclusively on "undergraduate" medical education—the years of study at medical school leading to the M.D. degree. At a time when the great majority of medical school graduates entered general practice, the four years of medical school were considered an adequate preparation for the practice of medicine. Abraham Flexner's 1910 report did not even mention internship or other hospital training for medical graduates.

By World War I, however, medical knowledge, techniques, and practices had grown enormously. There was too much to teach, even in a four-year course. Accordingly, a period of hospital education following graduation—the "internship"—became standard for every physician. By the mid-1920s the internship had become required of all U.S. medical graduates.

The modern internship had its origins in the informal system of hospital appointments that dated to the early nineteenth century. Until the end of the century, such positions were scarce, available only to a tiny handful of graduates. Though such positions allowed the opportunity to live and work in a hospital for a year or two, they were saddled with considerable educational deficiencies. Interns had limited clinical responsibilities, and the positions involved a considerable amount of nonmedical chores, like maintaining the hospital laboratories. During the first two decades of the twentieth century, the internship was transformed into a true educational experience. The internship now provided a full schedule of conferences, seminars, rounds, and lectures as well as the opportunity to participate actively in patient management.

Internships came in three forms. The most popular was the so-called "rotating" internship, in which interns rotated among all the clinical areas. Some hospitals, particularly those associated with medical schools, offered "straight" internships in medicine or surgery, in which interns spent the entire time in that field. The third type was the "mixed" internship, a cross between the rotating and straight internship. Mixed internships provided more time in medicine and surgery and less in the various specialties than rotating internships. Typically, internships lasted one year, though some were as long as three years. All forms of internship provided a rounding-out clinical experience that proved invaluable as a preparation for general medical practice.

Medical education in the early twentieth century faced another challenge: meeting the needs of individuals who desired to practice a clinical specialty (such as ophthalmology, pediatrics, or surgery) or to pursue a career in medical research. To this end the "residency"—a several-year hospital experience following internship—became the accepted vehicle.

The modern residency was introduced to America at the opening of the Johns Hopkins Hospital in 1889. Based upon the system of "house assistants" in the medical clinics of German universities, the Hopkins residency was designed to be an academic experience for mature scholars. During World War I, the Hopkins residency system began to spread to other institutions, much as the Hopkins system of undergraduate medical education had spread to other medical schools the generation before. By the 1930s, the residency had become the sole route to specialty training. In doing so, it displaced a variety of informal, educationally unsound paths to specialty practice that had preceded it, such as taking a short course in a medical specialty at a for-profit graduate medical school or apprenticing oneself to a senior physician already recognized as a specialist.

Residency training before World War II had three essential characteristics. First, unlike internship, which was required of all medical school graduates before they could receive a license to practice medicine, residency positions were reserved for the elite. Only one-third of graduates were permitted to enter residency programs following the completion of an internship, and only about one-quarter of first-year residents ultimately completed the entire program. Second, the defining educational feature of residency was the assumption of responsibility by residents for patient management. Residents evaluated patients themselves, made their own decisions about diagnosis and therapy, and performed their own procedures and treatments. They were supervised by—and accountable to—attending physicians, but they were allowed considerable clinical independence. This was felt to be the best way for learners to be transformed into mature physicians. Lastly, the residency experience at this time emphasized scholarship and inquiry as much as clinical training. The residency system assumed many characteristics of a graduate school within the hospital, and residents were carefully trained in clinical research. Residency came to be recognized as the breeding ground for the next generation of clinical investigators and medical scholars.

Evolution and Growth

Scientific knowledge is continually growing. In addition, the diseases facing a population are constantly changing, as are medical practices, cultural mores, and the health care delivery system. Thus, of necessity, medical education is always evolving to reflect changing scientific and social circumstances.

After World War II, medical educators continued to emphasize the importance of "active learning" and the cultivation of problem-solving skills. However, the postwar period witnessed several important curricular innovations: the development of an organ-based curriculum by Western Reserve (1950s); the invention of "problem-based" learning by McMaster (1970s); the introduction of a primary care curriculum by New Mexico (1980s); and the establishment of the "New Pathway" program at Harvard Medical School (1980s). In addition, all medical schools reduced the amount of required course work, increased the opportunity for electives, and began to provide early clinical experiences during the first and second years of medical school.

Reflecting changes in the broader society, medical schools also became more representative of the diverse population they served. Religious quotas against Jewish and Catholic students, established at many medical schools in the early 1920s, disappeared in the 1950s following the revelation of Nazi atrocities and changes in civil rights laws. The admission of African American students, though still short of target levels, roughly tripled from 2.7 percent of enrolled medical students in the 1960s to around 8 percent in the 1990s. Greater success was achieved in the enrollment of women, whose numbers increased from roughly 7 percent of students in the 1960s to about 50 percent in the 1990s.

Graduate medical education also changed significantly following World War II. In the late 1940s and 1950s, residency training became "democratized"—that is, it became available to all medical graduates, not merely the academic elite as before. Between 1940 and 1970, the number of residency positions at U.S. hospitals increased from 5,796 to 46,258. Thus, the number of residents seeking specialty training soared. At the same time, the academic component of residency training diminished. Residency became an exclusively clinical training ground rather than a preparation point for clinical research as before. Most physicians desiring research training now had to acquire that through Ph.D. programs or postgraduate research fellowships.

In addition, the stresses of residency training increased substantially after World War II. In the 1960s, intensive care units were introduced, as were new, life-sustaining technologies like ventilators and dialysis machines. Hospitalized patients tended to be much sicker than before, resulting in much more work. In the 1980s, following the death of nineteen-year-old Libby Zion at the New York Hospital, the public began to demand shorter hours and greater supervision of house officers. Ironically, after extensive investigation, Libby Zion's death appeared not to be the result of poor care provided by fatigued or unsupervised house officers. Nevertheless, the movement to regulate house staff hours gained strength.

As medical education was changing, it was also growing larger. In the 1960s and 1970s, in response to the public's demand for more doctors, existing medical schools expanded their class size, and forty new medical schools were established. Between 1960 and 1980, the number of students entering U.S. medical schools increased from 8,298 to 17,320. Following World War II, the research mission of medical schools expanded enormously, mainly because of the infusion of huge amounts of research funding from the National Institutes of Health. The number of full-time faculty at U.S. medical schools grew from 3,500 in 1945 to 17,000 in 1965. After 1965, medical schools grew larger still, primarily because of the passage of Medicare and Medicaid legislation that year and the resultant explosion in demands on the schools to provide clinical care. By 1990, the number of full-time faculty at U.S. medical schools had grown to around 85,000, with most of the increase occurring in the clinical departments. By that time, one-half of a typical medical school's income came from the practice of medicine by the full-time faculty. By the 1990s, the "academic health center"—the amalgam of a medical school with its teaching hospitals—had become an extremely large and complex organization with many responsibilities besides education and research. By the late 1990s, a typical academic health center could easily have a budget of $1.5 billion or more and be the largest employer in its community.

The Challenge of Managed Care

Though medical schools prospered and served the public well during the twentieth century, a cautionary note appeared at the end of the century. Academic health centers had grown strong and wealthy, but they had become dependent for their income on the policies of the third-party payers (insurance companies and government agencies) that paid the bills. During the managed care era of the 1990s, the parsimonious payments of many third-party payers began causing academic health centers considerable financial distress. For instance, in 2000 the University of Pennsylvania Health System suffered a $200 million operating loss. (All hospitals were threatened financially by managed care, but teaching centers, because of their higher costs, were particularly vulnerable.) In addition, the emphasis of managed care organizations on increasing the "throughput" of patients—seeing as many patients as possible, as quickly as possible—eroded the quality of educational programs. Students and residents no longer had as much time to learn by doing or to study their patients in depth. It is to be hoped that the desire of the profession and the public to maintain quality in education and patient care will allow these difficulties to be surmounted in the years ahead.

BIBLIOGRAPHY

Bonner, Thomas N. American Doctors and German Universities: A Chapter in International Intellectual Relations, 1870–1914. Lincoln: University of Nebraska Press, 1963.

Fleming, Donald. William H. Welch and the Rise of Modern Medicine. Boston: Little, Brown, 1954.

Ludmerer, Kenneth M. Learning to Heal: The Development of American Medical Education. New York: Basic Books, 1985.

———. Time to Heal: American Medical Education from the Turn of the Century to the Era of Managed Care. New York: Oxford University Press, 1999.

Norwood, William F. Medical Education in the United States before the Civil War. New York: Arno, 1971.

Kenneth M. Ludmerer

Medical Education


When this subject was addressed in the first edition of this encyclopedia, the paucity of systematic analyses of the ethical issues peculiar to medical education was underscored (Pellegrino, 1978). In recent years, this deficiency has gradually been redressed, so that today, a considerable body of literature is available. This entry is therefore a substantial revision of the first. The emphasis has shifted from underlying values to more specific, normative issues, particularly in clinical education.

Ethical issues arise in medical education because of the special societal role of medical schools, the necessary intermingling of patient care with education, and the conflicts that may arise because of the obligations among students, patients, faculty members, and society. Similar ethical issues are present in the education of nurses, dentists, and the allied health professions.

The Social Mandate of Medical Schools

Medical schools occupy a unique moral position in society. They are mandated to meet society's need for a continuous supply of competent practitioners who can care for the sick and promote the public's health. For this reason, medical schools are supported as loci for the advancement and transmission of medical knowledge and are granted authority to select who shall study medicine, what shall be studied, and what standards of performance shall be established.

To achieve these goals, medical schools require certain special privileges, for example, to dissect human bodies, to provide "hands on" practical experience for students in the care of sick people, and to conduct human experimentation. These practices would be criminal were they not socially mandated for a good purpose. When medical schools, students, and faculty avail themselves of these privileges, they enter an implicit covenant with society to use them for the purposes for which they are granted.

To fulfill this social covenant, medical schools and their faculties must perform a tripartite function with respect to medical knowledge: 1) they must preserve, validate, and expand it by research; 2) transmit it to the next generation by teaching; and 3) apply it by practice in the care of the sick. However, these three functions have different aims. The aim of research is truth, which requires dedication to objectivity, freedom of inquiry, and rigorous design, as well as peer review and publication. The aim of teaching is learning, which requires dedication to student welfare, competent pedagogy, and opportunities for students to practice their skills. The aim of practice is the welfare of the patient, which requires dedication to compassion, competence, and ethical concern for the vulnerability, dignity, and autonomy of the sick person.

In the past, these three functions were less often in conflict with each other than they are today. This conflict is the result of several factors in the evolution of medical education since the late nineteenth century. The first factor is the realization of the power of the physical and biological sciences to advance medical knowledge and their integration into medical education. Second is the incorporation of teaching hospitals into medical schools for the clinical education of medical students (Flexner). Third is the increasing reliance on practice income to support salaries of medical teachers. Previously, teachers had been self-supporting practitioners from the community, while only a few were university-funded full-time teachers. Today's "tenure track" clinical faculty member is expected to excel in research, to support himself or herself financially through practice and overhead cost recovery from grants, and to teach at the bedside. Each function has its own legitimacy, but taken together, these functions conflict with each other.

Ethical Obligations of Medical Schools

The ethical obligations of medical schools as societal entities are defined in terms of the constituencies they serve: society, faculty, student body, and patients (Pellegrino, 1976).

Medical schools have been granted a virtual monopoly over the number of students they admit and the number of training places in the various specialties in teaching hospitals. Medical schools are the sole portal into the practice of the profession and, as a result, they incur a responsibility to match the kind and number of physicians they produce with the needs of society. This requires a socially responsive appraisal by medical schools of the way resources are used and curricula are designed, as well as how faculty rewards are distributed. Societal aims sometimes can, and do, conflict with a medical school's pursuit of esteem among its peers, which usually comes not through renown in teaching or the quality of the practitioners it produces, but through excellence in research and the production of academic leaders.

Another important obligation of medical schools is to ensure that graduates are competent to enter postgraduate training and are free of obvious traits of character that would make them dangerous practitioners. Today, most of those admitted to medical school graduate and obtain licenses. Few fail, particularly in the clinical years. This places an obligation on medical schools to evaluate not only a student's knowledge and skill, but some facets of his or her character as well. Close supervision by clinical teachers is mandatory if dubious character traits are to be detected. Educators must balance fairness in their evaluations of students against their obligations to protect future patients from unsafe or dishonest practitioners.

Another societal responsibility of medical schools is to ensure equal opportunity for admission to all qualified students. Despite early progress, there is recent evidence of retrenchment in the support, financial and otherwise, available for minority student recruitment in the United States and in Great Britain (Hanft and White; Esmail and Everington). Subtle forms of discrimination probably still exist in the interview process where it is difficult to detect and prove (Connolly). Gender discrimination and sexism are no longer legally tolerable, but remain a persistent social problem (Hostler and Gressard). Academic administrators and faculty members are morally obliged to ensure equitable treatment of all applicants and must assume collective responsibility for inequities and injustice. In doing so, medical schools must thread their way carefully through an ethical maze of competing claims for preferential treatment and reverse discrimination.

Ethical obligations exist in the relationship between medical schools and faculty members. Faculties are owed freedom of inquiry in research and teaching, justice in hiring, tenure, promotion, compensation, and redress for injury or grievances. Faculty members in turn are morally responsible for the quality of their instruction, for fairness in the evaluation of students, and for properly apportioning their time and effort between teaching and personally remunerative activities such as clinical practice and consultation. Imbalance among these activities compromises the societal responsibilities of a medical faculty.

Faculty and administration are therefore obligated to detect inadequate teachers and to rehabilitate and reassign them or terminate their appointments when necessary. Tenure is among the most privileged benefits of academic life. The obligation to use it responsibly rests squarely on faculty members and administrators.

Incidents of scientific fraud, abuse of consulting and travel privileges, and conflicts of interest are cause for legitimate public concern. While the number is small, such abuses by faculty members invite external limitations and regulation of privileges that can interfere with the educational mission. The ethics of medical academia cannot be a private matter since the moral behavior of academics affects students, patients, the use of public funds, and the quality of fulfillment of the medical school's covenant with society.

Some Ethical Issues Peculiar to Clinical Education

The ethical issues outlined thus far are particular only in part to medical education. What is unique is the medical school's engagement in clinical education, i.e., in providing "hands on" experience for students in the actual care of patients. It is here that serious conflict may arise between patient care and student learning.

Physicians since Hippocrates have taught their students from actual cases. Usually, this was accomplished by preceptorship with a practicing physician or by case demonstrations to entire classes of students. In the mid-nineteenth century, it was a rare school that incorporated more intimate involvement in the care of patients in its teaching (Ludmerer). Toward the end of the same century, William Osler involved students more directly as clinical clerks at the Johns Hopkins Hospital, where they "… lived and worked … as part of its machinery, as an essential part of the work of the wards" (Osler, p. 389). This practice lagged in other schools until the reform of education in 1910 (Ludmerer). Since then, however, it has become standard pedagogic practice.

Today, clinical education centers on practical experience under supervision at every level, from medical school through postgraduate specialty training to lifelong continuing education. Until recently, the merits of this training have been so much taken for granted that the ethical conflicts inherent in the process have been neglected (Fry; Pilowski).

Clinical education by its nature unavoidably puts the aims of caring for patients into potential conflict with the aims of teaching and learning. The involvement of medical students, interns, and residents in patient care slows the process of care, increases its discomforts and fragmentation, and, at times, poses significant danger to the patient. With close supervision by experienced clinical teachers, these potential conflicts are tolerable. The clinical teacher therefore carries a double responsibility for balancing the quality of his or her pedagogy with the quality of patient care.

The moral status of medical students is ambiguous. They are physicians in utero, that is, in a developmental state of competence to provide care. When they enter medical school they are laypersons. When they graduate they are physicians, still in need of further training before they can become safe and competent practitioners. During this process, they take on progressive degrees of responsibility associated with the privilege of caring for patients, although their capacity to fulfill that responsibility is limited.

Patients come to university hospitals primarily to receive optimal treatment, not to be subjects of teaching. They may understand in a general way what being in a teaching hospital means. This in no way suggests, as some assume, that patients give implicit consent to become "teaching material." Patients in teaching hospitals preserve their moral right to know the relative degrees of competence of those caring for them. They have a right to give informed consent to any procedures and to know whether an untrained or partially trained person will perform that procedure. When unskilled students participate in procedures, patients are owed appropriate supervision by someone of significantly greater competence who can protect their safety.

Medical students, therefore, should disclose the fact that they are students to avoid the attributions of knowledge and trust patients still associate with anyone bearing the title "doctor" (Greer; Ganos; Brody; Liepman). They should be introduced as students by their supervisors before procedures like spinal taps and chest taps are performed. For their part, students as well as their supervisors must thoroughly acquaint themselves with the procedures in question and must observe a sufficient number performed by experienced clinicians. Students are under an obligation to refrain from conducting a procedure until these requirements are met and to resist the "see one, do one" philosophy of some clinical teachers. They must also receive instruction on how to obtain a morally and legally valid consent (Johnson et al.).

Students must also be sensitive enough to discontinue even the simplest procedures, such as a venipuncture, if their efforts cause discomfort (Williams and Fost). These injunctions are particularly important in highly personal and sensitive situations, such as learning to do vaginal or rectal examinations (Bewley; Lawton et al.).

Medical students also face problems of personal ethical integrity with respect to abortion, treating patients with acquired immunodeficiency syndrome (AIDS), and attitudes toward the poor (Christakis and Feudtner; Dans; Crandall et al.; Currey et al.; Holleman). They may observe unethical or unacceptable behavior of teachers or colleagues (Morris). The extent of their responsibility, and the real possibility of punitive treatment if they "blow the whistle," pose a difficult, unresolved, but genuine ethical issue. Students may cheat on exams or see others do so (Rozance; Stimmel). By virtue of their presence at the bedside as members of the "team," they may be drawn prematurely into advising about the ethics of other colleagues. Helping students to deal with these moral dilemmas poses a new challenge to students and to their clinical teachers. This is a crucial part of the ethical maturation of the student (Drew; Andre; Wiesemann).

Two final examples of recently debated ethical dilemmas center on the moral status of dead human bodies and of animals of other species similar to humans. To what extent may recently dead human bodies be used to teach intubation, resuscitation, and tracheostomy? Who can, or should, give permission? May it be presumed? Is it necessary at all (Benfield et al.; Iserson)? Are the moral rights of other animal species to be considered so that they never or rarely should be used in teaching or research? Do computer models or tissue and cell preparations adequately replace animal experimentation?

Conclusion

Despite the sanction society gives to clinical education, there are important ethical obligations that limit this privilege. In no sense can learning by practice be a "right" of medical students or medical schools no matter how high the tuition or the degree of social utility. The privileges of clinical education cannot be bought at any price by the student, or granted even for good purpose by the medical school. Only a social mandate can legitimize the invasions of privacy a medical education entails.

The ethical issues of clinical education have just begun to receive the ethical scrutiny they deserve. Fundamental conceptual issues like the moral status of medical students, dead bodies, and animals are coupled with very practical issues regarding student–faculty and student–patient relationships. Clearer guidelines are needed to deal with the ethical issues characteristic of clinical education. We can expect the literature on this topic to expand in size, sophistication, and importance in the immediate future.

Edmund D. Pellegrino (1995)

SEE ALSO: Clinical Ethics; Competence; Conflict of Interest; Dentistry; Ethics; Family and Family Medicine; Informed Consent; Nursing Ethics; Profession and Professional Ethics; Race and Racism; Sexism; Virtue and Character; Whistleblowing

BIBLIOGRAPHY

Andre, Judith. 1992. "Learning to See: Moral Growth during Medical Training." Journal of Medical Ethics 18(3): 148–152.

Benfield, D. Gary; Flaksman, Richard J.; Lin, Tsun-Asin; Kantaki, Anand D.; Kokomoor, Franklin W.; and Vallman, John H. 1991. "Teaching Intubation Skills Using Newly Deceased Infants." Journal of the American Medical Association 265(18): 2360–2363.

Bewley, Susan. 1992. "Teaching Vaginal Examination." British Medical Journal 305(6849): 369.

Brody, Howard. 1983. "Deception in the Teaching Hospital." In Difficult Decisions in Medical Ethics: The Fourth Volume in a Series on Ethics and Humanism in Medicine, pp. 81–86, ed. Doreen L. Ganos, Rachel E. Lipson, Gwynedd Warren, and Barbara J. Weil. New York: Alan R. Liss.

Christakis, Dmitri A., and Feudtner, Chris. 1993. "Ethics in a Short White Coat: The Ethical Dilemmas that Medical Students Confront." Academic Medicine 68(4): 249–254.

Connolly, Paul H. 1979. "What Are the Medical Schools Up To? Abortions and Admissions." Commonweal 105(17): 551–552.

Crandall, Sonia J.; Volk, Robert J.; and Loemker, Vicki. 1993. "Medical Students' Attitudes Toward Providing Care for the Underserved: Are We Training Socially Responsible Physicians?" Journal of the American Medical Association 269(19): 2519–2523.

Currey, Charles J.; Johnson, Michael; and Ogden, Barbara. 1990. "Willingness of Health-Professions Students to Treat Patients with AIDS." Academic Medicine 65(7): 472–474.

Dans, Peter E. 1992. "Medical Students and Abortion: Reconciling Personal Beliefs and Professional Roles at One Medical School." Academic Medicine 67(3): 207–211.

Drew, Barbara L. 1992. "What If the Whistle Blower Is a Student? The Advisory Role of the Instructor." In Ethical Dilemmas in Contemporary Nursing Practice, pp. 117–127, ed. Gladys B. White. Washington, D.C.: American Nurses Publishing.

Esmail, A., and Everington, S. 1993. "Racial Discrimination against Doctors from Ethnic Minorities." British Medical Journal 306(6879): 691–692.

Flexner, Abraham. 1910. Medical Education in the United States and Canada: A Report to the Carnegie Foundation for the Advancement of Teaching. Birmingham, AL: Classics of Medicine Library.

Fry, Sara T. 1991. "Is Health Care Delivery by Partially Trained Professionals Ever Justified?" Journal of Clinical Ethics 2(1): 42–44.

Ganos, Doreen L. 1983. "Introduction: Deception in the Teaching Hospital." In Difficult Decisions in Medical Ethics: The Fourth Volume in a Series on Ethics and Humanism in Medicine, pp. 77–78, ed. Doreen L. Ganos, Rachel E. Lipson, Gwynedd Warren, and Barbara J. Weil. New York: Alan R. Liss.

Greer, David S. 1987. "To Inform or Not to Inform Patients About Students." Journal of Medical Education 62(10): 861–862.

Hanft, Ruth S., and White, Catherine C. 1987. "Constraining the Supply of Physicians: Effects on Black Physicians." Milbank Quarterly 65 (suppl. 2): 249–269.

Holleman, Warren Lee. 1992. "Challenges Facing Student Clinicians." Human Medicine 8(30): 205–211.

Hostler, Sharon L., and Gressard, Risa P. 1993. "Perceptions of the Gender Fairness of the Medical School Environment." Journal of the American Medical Women's Association 48(2): 51–54.

Iserson, Kenneth V. 1993. "Postmortem Procedures in the Emergency Department: Using the Recently Dead to Practice and Teach." Journal of Medical Ethics 19(2): 92–98.

Johnson, Shirley M.; Kurtz, Margot E.; Tomlinson, Tom; and Fleck, Leonard. 1992. "Teaching the Process of Obtaining Informed Consent to Medical Students." Academic Medicine 67(9): 598–600.

Lawton, Frank G.; Redman, Charles W. E.; and Luesley, David M. 1990. "Patient Consent for Gynaecological Examination." British Journal of Hospital Medicine 44(5): 326, 329.

Liepman, Marcia K. 1983. "Deception in the Teaching Hospital: Where We Are and Where We've Been." In Difficult Decisions in Medical Ethics: The Fourth Volume in a Series on Ethics and Humanism in Medicine, pp. 87–94, ed. Doreen L. Ganos, Rachel E. Lipson, Gwynedd Warren, and Barbara J. Weil. New York: Alan R. Liss.

Ludmerer, Kenneth M. 1985. Learning to Heal: The Development of American Medical Education. New York: Basic Books.

Morris, Mark. 1992. "When Loyalties Are Divided Between Teachers and Patients." Journal of Medical Ethics 18(3): 153–155.

Osler, William. 1943. Aequanimitas: With Other Addresses to Medical Students, Nurses, and Practitioners of Medicine, 3rd edition. Philadelphia: Blakiston.

Pellegrino, Edmund D. 1976. "Medical Schools as Moral Agents." Transactions of the American Clinical and Climatological Association 88: 54–67.

Pellegrino, Edmund D. 1978. "Medical Education." In vol. 2 of Encyclopedia of Bioethics, pp. 863–870, ed. Warren T. Reich. New York: Macmillan.

Pilowski, I. 1973. "The Student, the Patient and His Illness: The Ethics of Clinical Teaching." Medical Journal of Australia 1(17): 858–859.

Rozance, Christine P. 1991. "Cheating in Medical Schools: Implications for Students and Patients." Journal of the American Medical Association 266(17): 2453, 2456.

Stimmel, Barry. 1990. "Cheating in Medical Schools: A Problem or an Annoyance?" Rhode Island Medical Journal 73(9): 413–416.

Wiesemann, Claudia. 1993. "Eine Ethik des Nichtwissens" [An Ethics of Not-Knowing]. Ethik Med 5: 3–12.

Williams, Charles T., and Fost, Norman. 1992. "Ethical Considerations Surrounding First Time Procedures: A Study and Analysis of Patient Attitudes Toward Spinal Taps by Students." Kennedy Institute of Ethics Journal 2(3): 217–231.

Medical Education


Schools. Prior to the Revolution, there were very few trained physicians in the United States, and what few there were had been educated in Europe, typically in Edinburgh, Scotland. In the United States only Philadelphia and New York had medical schools, both established in the 1760s. The post-Revolutionary War period saw a significant increase in the number of medical schools affiliated with prestigious institutions. Harvard Medical School was established in 1782, Dartmouth in 1797, the University of Maryland at Baltimore in 1807, and Yale and Columbia in 1813. At the same time state licensing boards established standards for medical practice. The medical schools and state boards fought over the right to license physicians; Harvard, for example, claimed that its medical-school graduates did not have to take the state exams, and eventually the state agreed. By about 1820 most states allowed medical schools to license physicians, and Americans began to take pride in their medical schools, reasoning that there was no need to train doctors in Europe. Unfortunately, it would be many decades before that belief would be justified.

Quality of Education. Despite the increase in the number of medical schools, nine out of ten doctors still had no formal training. Even the formal training was meager: medical schools would grant a license after only one year of study beyond the bachelor's degree. Doctors mainly learned on the job. One particular problem in medical education was the difficulty in finding human cadavers, which would help prospective doctors learn human anatomy. The public had a deep revulsion to human dissection, and rarely would permission be given to dissect a body, or even to perform an autopsy. The only legal way to get human bodies was for physicians and doctors to wait for the execution of a criminal. Some medical students and doctors would rob graves to get corpses. There was even a club at Harvard called the Spunkers, who procured corpses of derelicts and criminals.

New York Doctors' Riot. The most dramatic example of public disapproval of dissection was the Doctors' Riot in 1788. A group of boys climbed a ladder to look into a window at New York Hospital. They saw four men dissecting a corpse. The public was already furious over a series of grave robberies, and when news of the dissections got out, an angry mob formed to punish these suspected grave robbers. The rioters entered and ransacked the hospital, and would have lynched the doctors had they not escaped. The militia was called in to disperse the mob, and the riot ended, but the basic conflict between the medical profession's research needs and the individual's right to a safe burial remained unsettled.

Midwifery. Since medieval times, midwives, rather than male physicians, had helped women deliver babies. Midwives tended to be unschooled, yet many of them had years of experience, and some medical historians claim that women were better off with a midwife than with some theorizing doctor. A few doctors believed that midwifery and professional medicine could be brought closer together. William Shippen, who had been trained in obstetrics in London, gave private lectures in midwifery in Philadelphia in 1782. He admitted midwives to these lectures, an unusual practice at the time. Like Samuel Bard in New York, he believed that midwives could handle most births, while physicians could be called in for emergencies. With the fading of the moral taboo over men's involvement in childbirth, physicians began to displace the midwives, at least in the cities, and for those who could afford it. But in the rural areas midwifery continued to flourish, as seen in the case of Martha Ballard, who from 1785 to 1815 kept a detailed diary of her life in rural Maine. She bore nine children herself, and delivered 816 others. Midwives were often the equivalent of country doctors, using their practical knowledge, herbal medicines, and empathy to meet the health-care needs of their neighbors. Ballard knew how to manufacture salves, syrups, pills, teas, and ointments; how to prepare an oil emulsion (she called it an "oil a mulge"); how to poultice wounds, dress burns, and treat dysentery, sore throat, frostbite, measles, colic, "hooping Cough," "Chin cough," "St. Vitas dance," "flying pains," "the salt rhume," and "the itch"; how to cut an infant's tongue, administer a clister (enema), lance an abscessed breast, apply a blister or a back plaster, induce vomiting, assuage bleeding, reduce swelling, and relieve a toothache; as well as how to deliver babies. By any standard Ballard was a remarkable woman, as capable as any country doctor, although women as yet had never attained the status of physicians. It was not until 1849 that Elizabeth Blackwell became the first woman to obtain an M.D. degree.

MEDICAL TRAINING AT HARVARD

Harvard Medical School got its start in 1783 when John Warren, a surgeon in the American Revolution, began giving lectures for the Boston Medical Society. These lectures were so popular that Harvard College (from which Warren had graduated in 1771) saw the opportunity to catch up with its rivals, the College of Philadelphia and King's College (Columbia University), both of which had established medical schools.

When it was first established in 1783, the Medical Institution of Harvard College had little money, a few pieces of laboratory equipment, and a dingy classroom in the basement of Harvard Hall. Warren was appointed professor of anatomy and surgery, Benjamin Waterhouse (unpopular and pretentious, but the best-educated doctor in New England) professor of theory and practice of physic, and Aaron Dexter professor of chemistry. Students who could pay their fees were admitted; there were no entrance exams. Since some of the students could not write, the final qualifying exam was oral. To receive a bachelor of medicine degree, a student would attend college for two four-month terms and serve a six-month apprenticeship. Classroom lectures were delivered in the pompous, droning style favored by Waterhouse; students were given no opportunity to ask bothersome questions. Laboratory training was limited, and clinical experience could only be had several miles away at the pest house (the hospital where people with infectious diseases were quarantined) in Boston.

In 1810 the Harvard Medical School moved to Boston, and soon joined up with Massachusetts General Hospital, where students could get the clinical training they desperately needed. Yet even by 1860 there were still no entrance requirements, exams continued to be oral, and a student had to pass only five of the nine subject areas to graduate. After the Civil War, reforms instituted by Harvard president Charles William Eliot tightened entrance requirements and placed new emphasis on scientific discipline and academic rigor.

Source: John Langone, Harvard Med (New York: Crown, 1995).

Sources

Richard Harrison Shryock, Medicine and Society in America, 1660–1860 (New York: New York University Press, 1960);

Laurel Thatcher Ulrich, A Midwife's Tale (New York: Knopf, 1990).

Medical Education



The path to a career in medicine in the United States is well defined. Aspiring physicians must earn an undergraduate degree, complete four years of medical school, participate in a minimum of three years of graduate medical training, and pass three national examinations for licensure. Becoming a physician also demands a desire to work with people; intellectual, emotional, and physical stamina; and an ability to think critically to solve complex problems.

Preparation for one of the world's most highly respected careers often starts in high school by taking courses in biology, chemistry, and physics. Preparation continues during college, with particular attention to the courses needed for admission to medical school. Although the specific number of credits required for admission to medical school varies, the minimum college course requirements include one year of biology, two years of chemistry (one year of general/inorganic chemistry and one year of organic chemistry), and one year of physics, all with adequate laboratory experiences. Medical schools may require or strongly recommend taking mathematics and computer science courses in college, though only a small number demand a specific sequence of mathematics courses. Candidates for admission to medical schools are also expected to have a solid background in English, the humanities, and the social sciences.

There is an expectation that aspiring physicians will participate in health-oriented research and in volunteer activities to demonstrate their commitment to the profession. These types of extracurricular activities provide opportunities to explore one's motivations, specific interests, and aptitude for a career in medicine.

Typically, the process of applying to medical school begins during the junior year of undergraduate study. One of the first steps is to take the Medical College Admission Test (MCAT) in the spring of the junior year. The MCAT is a standardized test designed to measure knowledge in the biological and physical sciences, the ability to read and interpret information, and communication skills. Students indicate which medical schools they want to receive their MCAT scores.

The American Medical College Application Service (AMCAS) facilitates applying to medical school by centralizing the submission of information and supporting materials. Of the 125 medical schools in the United States, 114 participate in AMCAS. Students submit one set of application materials and one official transcript to AMCAS, which in turn distributes the information to participating institutions as designated by the applicant. Deadlines for receiving applications are determined by the individual medical schools. Applications to non-AMCAS medical schools are submitted directly to those institutions in accordance with their individual requirements and deadlines.

Admission committees, composed of faculty members from the basic and clinical sciences departments, screen and prioritize the applications. Academic ability and personal qualities are used to discern applicants' qualifications for medical school. Academic ability is measured in terms of grades on undergraduate courses (with emphasis on the required science courses) and MCAT scores. College grades and MCAT scores are considered the most important predictors of medical school performance during the first two years. Most students admitted to medical school have above average (3.0 and higher) undergraduate grade point averages. An undergraduate major in the sciences is not a mandatory requirement for admission to medical school. Most admission committees look for well-rounded individuals and strive to admit a diversified class. The importance of MCAT scores to admission decisions varies by institution.

Admission committees also look for evidence of maturity, self-discipline, commitment to helping others, and leadership qualities. Candidates' personal statements, letters of evaluation, and the breadth and variety of extracurricular activities in health-related settings are used as indicators of personal attributes. Many medical schools have specific programs for recruiting and enrolling minority students to help increase the number of underrepresented minorities who practice medicine. Interviews with faculty members also provide information about the applicant's personal background and motivation to become a doctor.

Each medical school decides the number of students that will be admitted each year. Some medical schools accept high school graduates into combined bachelor's and medical degree programs, or combined medical and graduate degree programs.

Medical school applicants are urged to submit applications for financial assistance in conjunction with applications for admission. Loans, primarily sponsored by the federal government, are the major source of financial aid for medical school. Some schools offer academic scholarships.

For the 1998–1999 academic year, the Association of American Medical Colleges (AAMC) reported that 41,004 individuals applied to medical school. AMCAS participants applied to an average of 11.5 AMCAS-participating schools. Among first-time applicants, 45.9 percent (27,525) were accepted to a medical school. AAMC data further indicates that 6,353 candidates were accepted to two or more medical schools in 1998. Medical schools start issuing acceptances to the entering class by March 15 each year.

Medical schools typically provide four years of medical education, with the goal of preparing students to enter three- to seven-year programs of graduate medical training, which are referred to as residency programs. Medical school programs leading to the medical degree (M.D.) generally consist of two years of study in the basic sciences and two years in the clinical sciences. The basic sciences include anatomy, biochemistry, physiology, microbiology, pharmacology, pathology, and behavioral sciences. Clinical education begins in the third year with required clinical clerkships in internal medicine, pediatrics, family medicine, obstetrics and gynecology, surgery, and psychiatry. During six- to twelve-week rotations, students learn how to take a medical history, conduct a physical examination, and recognize familiar disease patterns. Students are allowed to shape their own course of study during the fourth year with elective courses in the clinical specialties or research. Most medical schools strive to integrate basic science and clinical science instruction throughout the four-year curriculum.

In addition to written examinations and direct observations of performance, Step 1 and Step 2 of the United States Medical Licensing Examination (USMLE) are also used to measure the acquisition of medical knowledge. Medical students take Step 1, which measures understanding and ability to apply key concepts in the basic sciences, after completion of the second year of medical school. Passing Step 1 is a requirement for graduation at the majority of medical schools. Step 2, which is taken at the beginning of the senior year, evaluates medical knowledge and understanding of the clinical sciences. More than half of all American medical schools require passing Step 2 as a condition for graduation.

The Liaison Committee on Medical Education (LCME) monitors the quality of education that is provided by American medical schools that award the medical degree. Similar accrediting bodies exist for schools of osteopathic medicine and schools of podiatry.

Students apply to graduate medical programs through the Electronic Residency Application Service (ERAS), a centralized computer-based service that transmits applications, personal statements, medical school transcripts, and Dean's Letters to residency program directors. Students register their first, second, and third choices for residency placements through the National Resident Matching Program (NRMP). The NRMP provides an impartial venue for matching applicants and programs. The "match" facilitates placements by establishing a uniform procedure for communication between students and residency directors, and for announcing residency selections. Matches are usually announced in March of the senior year of medical school.
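Behind the scenes, the NRMP computes its match with a stable-matching procedure: the Roth–Peranson algorithm, an extension of applicant-proposing "deferred acceptance" that also accommodates couples and program quotas. The sketch below shows only the core deferred-acceptance step, as a minimal illustration in Python; the function and the sample data are hypothetical and are not the NRMP's actual implementation.

```python
# A minimal sketch of applicant-proposing deferred acceptance with program
# capacities. Illustrative only: the real NRMP algorithm (Roth-Peranson)
# additionally handles couples and other constraints; these names and data
# are hypothetical.

def match(applicant_prefs, program_prefs, capacities):
    """Return {program: [tentatively matched applicants]}."""
    # Lower index = more preferred by the program.
    rank = {p: {a: i for i, a in enumerate(prefs)}
            for p, prefs in program_prefs.items()}
    holds = {p: [] for p in program_prefs}      # offers each program is holding
    next_choice = {a: 0 for a in applicant_prefs}
    free = list(applicant_prefs)                # applicants without a hold

    while free:
        a = free.pop()
        prefs = applicant_prefs[a]
        if next_choice[a] >= len(prefs):
            continue                            # list exhausted: a stays unmatched
        p = prefs[next_choice[a]]
        next_choice[a] += 1
        if a not in rank[p]:                    # program did not rank this applicant
            free.append(a)
            continue
        holds[p].append(a)
        holds[p].sort(key=lambda x: rank[p][x])
        if len(holds[p]) > capacities[p]:       # over capacity: release the worst hold
            free.append(holds[p].pop())

    return holds

# Example: two programs with one slot each, three applicants.
result = match(
    {"ann": ["county", "univ"], "bo": ["univ", "county"], "cy": ["univ"]},
    {"univ": ["ann", "bo", "cy"], "county": ["bo", "ann"]},
    {"univ": 1, "county": 1},
)
print(result)  # {'univ': ['bo'], 'county': ['ann']} -- 'cy' goes unmatched
```

In the simple applicant-proposing model sketched here, an applicant is never hurt by ranking programs in true order of preference, which is consistent with the NRMP's advice that students rank their genuine first, second, and third choices honestly.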

Graduate medical education programs (residencies) provide extensive, direct patient-care experiences in recognized medical specialties. Three-year residencies in family practice, emergency medicine, pediatrics, and internal medicine are typical. Several other specialties require one year of general practice followed by three to five years of advanced training. Participation in an accredited residency program and passing the USMLE Step 3 are requirements for licensure in most states.

See also: Medical College Admission Test.

bibliography

Association of American Medical Colleges. 1999. Medical School Admission Requirements: United States and Canada, 2000–2001, 50th edition. Washington, DC: Association of American Medical Colleges.

Crawford, Jane D. 1994. The Premedical Planning Guide, 3rd edition. Baltimore, MD: Williams and Wilkins.

internet resources

Association of American Medical Colleges. 2000. "AAMC: Medical College Admission Test (MCAT)." <www.aamc.org/students/mcat/>.

Association of American Medical Colleges. 2000. "Getting into Medical School." <www.aamc.org/students/considering/gettingin.htm>.

National Resident Matching Program. 2000. "About the NRMP." <www.nrmp.org/about_nrmp>.

National Resident Matching Program. 2000. "About Residency." <www.nrmp.org/res_match/about_res>.

Juanita F. Buford
