Medical Education


MEDICAL EDUCATION. Pain, suffering, and premature death from disease have ravaged human beings from the beginning of recorded time. This harsh fact was as true for colonial America, where life expectancy as late as 1800 was only twenty-five years, as for every other known society. Yet the aspiration for health and the desire to explain the mysteries of disease have also characterized all known human societies. Thus, inhabitants of colonial America sought medical care from a variety of practitioners of folk, herbal, and Indian medicine. In addition, some members of the clergy, such as the Protestant New England clergyman Cotton Mather (1663–1728), incorporated the healing art in their services to the colonists.

In the eighteenth century, practitioners of "regular," or "allopathic," medicine became more common. A small number of elite practitioners obtained medical degrees, primarily by studying medicine in Edinburgh, Leiden, or London. This mode of study was not economically feasible for more than a few. Of necessity, the apprenticeship tradition became the dominant system of medical training, with the typical preceptorial period lasting three years. Apprentice physicians would study medicine with a practicing physician, who would allow the apprentice the opportunity to participate in his practice in exchange for a fee and the performance of various menial chores.

In the early nineteenth century, the "proprietary" medical school became the dominant vehicle of medical instruction in America. In 1800, only four medical schools existed: the University of Pennsylvania (founded in 1765), King's College (1767), Harvard (1782), and Dartmouth (1797). Between 1810 and 1840, twenty-six new schools were established, and between 1840 and 1876, forty-seven more. In the late nineteenth century, dozens of additional schools sprouted. Originally, these schools were intended to be a supplement to the apprenticeship system. However, because they could more readily provide systematic teaching, by the middle of the nineteenth century they had superseded the apprenticeship as the principal pathway of medical education.

Though the first schools were created with lofty ambitions, the quality of instruction at the proprietary schools rapidly deteriorated, even based on the standards of the day. Entrance requirements were nonexistent other than the ability to pay the fees. Disciplinary problems arising from outrageous student behavior were commonplace. The standard course of instruction in the mid-nineteenth century consisted of two four-month terms of lectures during the winter, with the second term identical to the first. The curriculum generally consisted of seven courses: anatomy; physiology and pathology; materia medica, therapeutics, and pharmacy; chemistry and medical jurisprudence; theory and practice of medicine; principles and practice of surgery; and obstetrics and the diseases of women and children. Instruction was wholly didactic: seven or eight hours of lectures a day, supplemented by textbook reading. Laboratory work was sparse, and even in the clinical subjects, no opportunity to work with patients was provided. Examinations were brief and superficial; virtually the only requirement for graduation was the ability to pay the fees. Students who wished a rigorous medical education had to supplement what they learned in medical school in other ways, such as through enrollment at non-degree-granting extramural private schools, study in Europe, or work in hospitals as "house pupils."

The mid-nineteenth-century proprietary schools, such as Bennett Medical College and Jenner Medical College in Chicago, were independent institutions. University or hospital affiliations, in the few cases in which they existed, were nominal. The faculties were small, typically consisting of six or eight professors. The professors owned the schools and operated them for profit. A commercial spirit thus pervaded the schools, for the faculty shared the spoils of what was left of student fees after expenses. The mark of a good medical school, like that of any business, was considered its profitability. Since an amphitheater was virtually the only requirement to operate a medical school, physical facilities were meager. The second floor above the corner drugstore would suffice; a school that had a building of its own was considered amply endowed.

The Creation of the Modern Medical School

While American medical education was floundering in the mid-1800s, the reform of the system was already beginning. At the root of the transformation was a series of underlying events: the revolution in experimental medicine that was proceeding in Europe; the existence of a cadre of American doctors traveling to Europe (particularly Germany) to learn laboratory methods; the emergence of the modern university in America; the development of a system of mass public education to provide qualified students for the university; and the cultivation of a habit of philanthropy among some very rich industrialists. Together, these developments provided the infrastructure for a new system of medical education soon to appear.

The creation of America's current system of medical education occurred in two overlapping stages. In the first stage, which began in the middle of the nineteenth century, a revolution in ideas occurred concerning the purpose and methods of medical education. After the Civil War, medical educators began rejecting traditional notions that medical education should inculcate facts through rote memorization. Rather, the new objective of medical education was to produce problem-solvers and critical thinkers who knew how to find out and evaluate information for themselves. To do so, medical educators deemphasized the traditional didactic teaching methods of lectures and textbooks and began speaking of the importance of self-education and learning by doing. Through laboratory work and clinical clerkships, students were to be active participants in their learning, not passive observers as before. A generation before John Dewey, medical educators were espousing the ideas of what later came to be called "progressive education."

At the same time, a revolution occurred in the institutional mission of medical schools. The view emerged that the modern medical school should not only engage in the highest level of teaching but also should be committed to the discovery of new knowledge through research. This meant that medical schools could no longer remain freestanding institutions. Rather, they had to become integral parts of universities and hire scientifically trained, full-time faculty who, like all university professors, were researchers as well as teachers.

In the early 1870s, the first lasting reforms occurred, as Harvard, Pennsylvania, and Michigan extended their course of study to three years, added new scientific subjects to the curriculum, required laboratory work of each student, and began hiring full-time medical scientists to the faculty. In the late 1870s, the plans for the new Johns Hopkins Medical School were announced, though for financial reasons the opening was delayed until 1893. When the school finally did open, it immediately became the model by which all other medical schools were measured, much as the Johns Hopkins University in 1876 had become the model for the modern American research university. A college degree was required for admission, a four-year curriculum with nine-month terms was adopted, classes were small, students were frequently examined, the laboratory and clinical clerkship were the primary teaching devices, and a brilliant full-time faculty made medical research as well as medical education part of its mission. In the 1880s and 1890s, schools across the country started to emulate the pioneering schools, and a campaign to reform American medical education began. By the turn of the century, the university medical school had become the acknowledged ideal, and proprietary schools were already closing for lack of students.

Nevertheless, ideas alone were insufficient to create the modern medical school. The new teaching methods were extremely costly to implement, and hospitals had to be persuaded to join medical schools in the work of medical education. Thus, an institutional as well as an intellectual revolution was needed. Between 1885 and 1925 this revolution occurred. Large sums of money were raised, new laboratories were constructed, an army of full-time faculty was assembled, and clinical facilities were acquired. Medical schools, which had existed autonomously during the proprietary era, became closely affiliated with universities and teaching hospitals.

No individual contributed more dramatically to the institution-building process than Abraham Flexner (1866–1959), an educator from Louisville who had joined the staff of the Carnegie Foundation for the Advancement of Teaching. In 1910, Flexner published a muckraking report, Medical Education in the United States and Canada. In this book, he described the ideal conditions of medical education, as exemplified by the Johns Hopkins Medical School, and the deficient conditions that still existed at most medical schools. Flexner made no intellectual contribution to the discussion of how physicians should be taught, for he adopted the ideas that had developed within the medical faculties during the 1870s and 1880s. However, his report made the reform of medical education a cause célèbre, transforming what previously had been a private matter within the profession into a broad social movement similar to other reform movements in Progressive Era America. The public responded by opening its pocketbook: state and municipal taxes were used to fund medical education, and private philanthropists, George Eastman and Robert Brookings among them, joined philanthropic organizations in contributing significant sums to support medical education. In the two decades that followed, the public provided the money and clinical facilities that had long eluded medical schools. In addition, an outraged public, scandalized by Flexner's acerbic depiction of the proprietary schools still in existence, brought a sudden end to the proprietary era through the enactment of state licensing laws, which mandated that medical schools operated for profit would not be accredited.

Graduate Medical Education

Through World War I, medical education focused almost exclusively on "undergraduate" medical education—the years of study at medical school leading to the M.D. degree. At a time when the great majority of medical school graduates entered general practice, the four years of medical school were considered an adequate preparation for the practice of medicine. Abraham Flexner's 1910 report did not even mention internship or other hospital training for medical graduates.

By World War I, however, medical knowledge, techniques, and practices had grown enormously. There was too much to teach, even in a four-year course. Accordingly, a period of hospital education following graduation—the "internship"—became standard for every physician. By the mid-1920s the internship had become required of all U.S. medical graduates.

The modern internship had its origins in the informal system of hospital appointments that dated to the early nineteenth century. Until the end of the century, such positions were scarce, available only to a tiny handful of graduates. Though such positions allowed the opportunity to live and work in a hospital for a year or two, they were saddled with considerable educational deficiencies. Interns had limited clinical responsibilities, and the positions involved many nonmedical chores, such as maintaining the hospital laboratories. During the first two decades of the twentieth century, the internship was transformed into a true educational experience. Internship now provided a full schedule of conferences, seminars, rounds, and lectures as well as the opportunity to participate actively in patient management.

Internships came in three forms. The most popular was the so-called "rotating" internship, in which interns rotated among all the clinical areas. Some hospitals, particularly those associated with medical schools, offered "straight" internships in medicine or surgery, in which interns spent the entire time in that field. The third type was the "mixed" internship, a cross between the rotating and straight internship. Mixed internships provided more time in medicine and surgery and less in the various specialties than rotating internships. Typically, internships lasted one year, though some were as long as three years. All forms of internship provided a rounding-out clinical experience that proved invaluable as a preparation for general medical practice.

Medical education in the early twentieth century faced another challenge: meeting the needs of individuals who desired to practice a clinical specialty (such as ophthalmology, pediatrics, or surgery) or to pursue a career in medical research. To this end the "residency"—a several-year hospital experience following internship—became the accepted vehicle.

The modern residency was introduced to America at the opening of the Johns Hopkins Hospital in 1889. Based upon the system of "house assistants" in the medical clinics of German universities, the Hopkins residency was designed to be an academic experience for mature scholars. During World War I, the Hopkins residency system began to spread to other institutions, much as the Hopkins system of undergraduate medical education had spread to other medical schools the generation before. By the 1930s, the residency had become the sole route to specialty training. In doing so, it displaced a variety of informal, educationally unsound paths to specialty practice that had preceded it, such as taking a short course in a medical specialty at a for-profit graduate medical school or apprenticing oneself to a senior physician already recognized as a specialist.

Residency training before World War II had three essential characteristics. First, unlike internship, which was required of all medical school graduates before they could receive a license to practice medicine, residency positions were reserved for the elite. Only one-third of graduates were permitted to enter residency programs following the completion of an internship, and only about one-quarter of first-year residents ultimately completed the entire program. Second, the defining educational feature of residency was the assumption of responsibility by residents for patient management. Residents evaluated patients themselves, made their own decisions about diagnosis and therapy, and performed their own procedures and treatments. They were supervised by—and accountable to—attending physicians, but they were allowed considerable clinical independence. This was felt to be the best way for learners to be transformed into mature physicians. Lastly, the residency experience at this time emphasized scholarship and inquiry as much as clinical training. The residency system assumed many characteristics of a graduate school within the hospital, and residents were carefully trained in clinical research. Residency came to be recognized as the breeding ground for the next generation of clinical investigators and medical scholars.

Evolution and Growth

Scientific knowledge is continually growing. In addition, the diseases facing a population are constantly changing, as are medical practices, cultural mores, and the health care delivery system. Thus, of necessity, medical education is always evolving to reflect changing scientific and social circumstances.

After World War II, medical educators continued to emphasize the importance of "active learning" and the cultivation of problem-solving skills. However, the postwar period witnessed several important curricular innovations: the development of an organ-based curriculum by Western Reserve (1950s); the invention of "problem-based" learning by McMaster (1970s); the introduction of a primary care curriculum by New Mexico (1980s); and the establishment of the "New Pathway" program at Harvard Medical School (1980s). In addition, all medical schools reduced the amount of required course work, increased the opportunity for electives, and began to provide early clinical experiences during the first and second years of medical school.

Reflecting changes in the broader society, medical schools also became more representative of the diverse population they served. Religious quotas against Jewish and Catholic students, established at many medical schools in the early 1920s, disappeared in the 1950s following the revelation of Nazi atrocities and changes in civil rights laws. The admission of African American students, though still short of target levels, roughly tripled from 2.7 percent of enrolled medical students in the 1960s to around 8 percent in the 1990s. Greater success was achieved in the enrollment of women, whose numbers increased from roughly 7 percent of students in the 1960s to about 50 percent in the 1990s.

Graduate medical education also changed significantly following World War II. In the late 1940s and 1950s, residency training became "democratized"—that is, it became available to all medical graduates, not merely the academic elite as before. Between 1940 and 1970, the number of residency positions at U.S. hospitals increased from 5,796 to 46,258. Thus, the number of residents seeking specialty training soared. At the same time, the academic component of residency training diminished. Residency became an exclusively clinical training ground rather than a preparation point for clinical research as before. Most physicians desiring research training now had to acquire that through Ph.D. programs or postgraduate research fellowships.

In addition, the stresses of residency training increased substantially after World War II. In the 1960s, intensive care units were introduced, as were new, life-sustaining technologies like ventilators and dialysis machines. Hospitalized patients tended to be much sicker than before, resulting in much more work. In the 1980s, following the death of eighteen-year-old Libby Zion at the New York Hospital, the public began to demand shorter hours and greater supervision of house officers. Ironically, after extensive investigation, Libby Zion's death appeared not to be the result of poor care provided by fatigued or unsupervised house officers. Nevertheless, the movement to regulate house staff hours gained strength.

As medical education was changing, the enterprise was also growing much larger. In the 1960s and 1970s, in response to the public's demand for more doctors, existing medical schools expanded their class size, and forty new medical schools were established. Between 1960 and 1980, the number of students entering U.S. medical schools increased from 8,298 to 17,320. Following World War II, the research mission of medical schools expanded enormously, mainly because of the infusion of huge amounts of research funding from the National Institutes of Health. The number of full-time faculty at U.S. medical schools grew from 3,500 in 1945 to 17,000 in 1965. After 1965, medical schools grew larger still, primarily because of the passage of Medicare and Medicaid legislation that year and the resultant explosion in demands on the schools to provide clinical care. By 1990, the number of full-time faculty at U.S. medical schools had grown to around 85,000, with most of the increase occurring in the clinical departments. By that time one-half of a typical medical school's income came from the practice of medicine by the full-time faculty. By the 1990s, the "academic health center"—the amalgam of a medical school with its teaching hospitals—had become an extremely large and complex organization with many responsibilities besides education and research. By the late 1990s, a typical academic health center could easily have a budget of $1.5 billion or more and be the largest employer in its community.

The Challenge of Managed Care

Though medical schools prospered and served the public well during the twentieth century, a cautionary note appeared at the end of the century. Academic health centers had grown strong and wealthy, but they had become dependent for their income on the policies of the third-party payers (insurance companies and government agencies) that paid the bills. During the managed care era of the 1990s, the parsimonious payments of many third-party payers began causing academic health centers considerable financial distress. For instance, in 2000 the University of Pennsylvania Health System suffered a $200 million operating loss. (All hospitals were threatened financially by managed care, but teaching centers, because of their higher costs, were particularly vulnerable.) In addition, the emphasis of managed care organizations on increasing the "throughput" of patients—seeing as many patients as possible, as quickly as possible—eroded the quality of educational programs. Students and residents no longer had as much time to learn by doing or to study their patients in depth. One hopes that the desire of the profession and the public to maintain quality in education and patient care will allow these difficulties to be surmounted in the years ahead.

BIBLIOGRAPHY

Bonner, Thomas N. American Doctors and German Universities: A Chapter in International Intellectual Relations, 1870–1914. Lincoln: University of Nebraska Press, 1963.

Fleming, Donald. William H. Welch and the Rise of Modern Medicine. Boston: Little, Brown, 1954.

Ludmerer, Kenneth M. Learning to Heal: The Development of American Medical Education. New York: Basic Books, 1985.

———. Time to Heal: American Medical Education from the Turn of the Century to the Era of Managed Care. New York: Oxford University Press, 1999.

Norwood, William F. Medical Education in the United States before the Civil War. New York: Arno, 1971.

Kenneth M. Ludmerer


"Medical Education." Dictionary of American History. 2003. Encyclopedia.com. (August 24, 2016). http://www.encyclopedia.com/doc/1G2-3401802583.html

"Medical Education." Dictionary of American History. 2003. Retrieved August 24, 2016 from Encyclopedia.com: http://www.encyclopedia.com/doc/1G2-3401802583.html

Medical Education



The path to a career in medicine in the United States is well defined. Aspiring physicians must earn an undergraduate degree, complete four years of medical school, participate in a minimum of three years of graduate medical training, and pass three national examinations for licensure. Becoming a physician also demands a desire to work with people; intellectual, emotional, and physical stamina; and an ability to think critically to solve complex problems.

Preparation for one of the world's most highly respected careers often starts in high school with courses in biology, chemistry, and physics. It continues during college, with particular attention to the courses needed for admission to medical school. Although the specific number of credits required for admission to medical school varies, the minimum college course requirements include one year of biology, two years of chemistry (one year of general/inorganic chemistry and one year of organic chemistry), and one year of physics, all with adequate laboratory experience. Medical schools may require or strongly recommend taking mathematics and computer science courses in college, though only a small number demand a specific sequence of mathematics courses. Candidates for admission to medical school are also expected to have a solid background in English, the humanities, and the social sciences.

There is an expectation that aspiring physicians will participate in health-oriented research and in volunteer activities to demonstrate their commitment to the profession. These types of extracurricular activities provide opportunities to explore one's motivations, specific interests, and aptitude for a career in medicine.

Typically, the process of applying to medical school begins during the junior year of undergraduate study. One of the first steps is to take the Medical College Admission Test (MCAT) in the spring of the junior year. The MCAT is a standardized test designed to measure knowledge in the biological and physical sciences, the ability to read and interpret information, and communication skills. Students indicate which medical schools should receive their MCAT scores.

The American Medical College Application Service (AMCAS) facilitates applying to medical school by centralizing the submission of information and supporting materials. Of the 125 medical schools in the United States, 114 participate in AMCAS. Students submit one set of application materials and one official transcript to AMCAS, which in turn distributes the information to participating institutions as designated by the applicant. Deadlines for receiving applications are determined by the individual medical schools. Applications to non-AMCAS medical schools are submitted directly to those institutions in accordance with their individual requirements and deadlines.

Admission committees, composed of faculty members from the basic and clinical sciences departments, screen and prioritize the applications. Academic ability and personal qualities are used to discern applicants' qualifications for medical school. Academic ability is measured in terms of grades in undergraduate courses (with emphasis on the required science courses) and MCAT scores. College grades and MCAT scores are considered the most important predictors of medical school performance during the first two years. Most students admitted to medical school have above-average (3.0 and higher) undergraduate grade point averages. An undergraduate major in the sciences is not required for admission to medical school. Most admission committees look for well-rounded individuals and strive to admit a diverse class. The importance of MCAT scores to admission decisions varies by institution.

Admission committees also look for evidence of maturity, self-discipline, commitment to helping others, and leadership qualities. Candidates' personal statements, letters of evaluation, and the breadth and variety of extracurricular activities in health-related settings are used as indicators of personal attributes. Many medical schools have specific programs for recruiting and enrolling minority students to help increase the number of underrepresented minorities who practice medicine. Interviews with faculty members also provide information about the applicant's personal background and motivation to become a doctor.

Each medical school decides the number of students that will be admitted each year. Some medical schools accept high school graduates into combined bachelor's and medical degree programs, or combined medical and graduate degree programs.

Medical school applicants are urged to submit applications for financial assistance in conjunction with applications for admission. Loans, primarily sponsored by the federal government, are the major source of financial aid for medical school. Some schools offer academic scholarships.

For the 1998–1999 academic year, the Association of American Medical Colleges (AAMC) reported that 41,004 individuals applied to medical school. AMCAS participants applied to an average of 11.5 AMCAS-participating schools. Of the 27,525 first-time applicants, 45.9 percent were accepted to a medical school. AAMC data further indicate that 6,353 candidates were accepted to two or more medical schools in 1998. Medical schools start issuing acceptances to the entering class by March 15 each year.

Medical schools typically provide four years of medical education, with the goal of preparing students to enter three- to seven-year programs of graduate medical training, which are referred to as residency programs. Medical school programs leading to the medical degree (M.D.) generally consist of two years of study in the basic sciences and two years in the clinical sciences. The basic sciences include anatomy, biochemistry, physiology, microbiology, pharmacology, pathology, and behavioral sciences. Clinical education begins in the third year with required clinical clerkships in internal medicine, pediatrics, family medicine, obstetrics and gynecology, surgery, and psychiatry. During six- to twelve-week rotations, students learn how to take a medical history, conduct a physical examination, and recognize familiar disease patterns. Students are allowed to shape their own course of study during the fourth year with elective courses in the clinical specialties or research. Most medical schools strive to integrate basic science and clinical science instruction throughout the four-year curriculum.

In addition to written examinations and direct observations of performance, Step 1 and Step 2 of the United States Medical Licensing Examination (USMLE) are also used to measure the acquisition of medical knowledge. Medical students take Step 1, which measures understanding and ability to apply key concepts in the basic sciences, after completion of the second year of medical school. Passing Step 1 is a requirement for graduation at the majority of medical schools. Step 2, which is taken at the beginning of the senior year, evaluates medical knowledge and understanding of the clinical sciences. More than half of all American medical schools require passing Step 2 as a condition for graduation.

The Liaison Committee on Medical Education (LCME) monitors the quality of education that is provided by American medical schools that award the medical degree. Similar accrediting bodies exist for schools of osteopathic medicine and schools of podiatry.

Students apply to graduate medical programs through the Electronic Residency Application Service (ERAS), a centralized computer-based service that transmits applications, personal statements, medical school transcripts, and Dean's Letters to residency program directors. Students register their first, second, and third choices for residency placements through the National Resident Matching Program (NRMP). The NRMP provides an impartial venue for matching applicants and programs. The "match" facilitates placements by establishing a uniform procedure for communication between students and residency directors, and for announcing residency selections. Matches are usually announced in March of the senior year of medical school.
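The entry describes the match as a uniform procedure but does not spell out its mechanics. As an illustration only, the sketch below implements a simplified applicant-proposing deferred-acceptance ("stable matching") procedure of the kind the NRMP's algorithm is based on; the production algorithm (the Roth-Peranson variant) also handles couples, partial rank lists, and other constraints not modeled here, and every applicant, program, and function name in the example is invented.

# Illustrative sketch only: a simplified applicant-proposing deferred-acceptance
# ("stable matching") procedure. Not the NRMP's production algorithm; all data invented.

def match(applicant_prefs, program_prefs, quotas):
    """applicant_prefs: {applicant: [programs in preference order]}
       program_prefs:   {program: [applicants in preference order]}
       quotas:          {program: number of available positions}"""
    # Precompute each program's ranking of applicants for quick comparison.
    rank = {p: {a: i for i, a in enumerate(lst)} for p, lst in program_prefs.items()}
    free = list(applicant_prefs)                   # applicants with no tentative match
    next_choice = {a: 0 for a in applicant_prefs}  # index of next program to propose to
    tentative = {p: [] for p in program_prefs}     # program -> tentatively held applicants

    while free:
        a = free.pop()
        if next_choice[a] >= len(applicant_prefs[a]):
            continue                               # applicant has exhausted the rank list
        p = applicant_prefs[a][next_choice[a]]
        next_choice[a] += 1
        if a not in rank[p]:
            free.append(a)                         # program did not rank this applicant
            continue
        tentative[p].append(a)                     # program tentatively holds its favorites
        tentative[p].sort(key=lambda x: rank[p][x])
        if len(tentative[p]) > quotas[p]:
            free.append(tentative[p].pop())        # least-preferred applicant is bumped

    return tentative

# Hypothetical example (applicant and program names are invented):
applicants = {"Ann": ["Mercy", "City"], "Ben": ["City", "Mercy"], "Cal": ["City"]}
programs = {"Mercy": ["Ben", "Ann"], "City": ["Ann", "Cal", "Ben"]}
print(match(applicants, programs, {"Mercy": 1, "City": 1}))
# -> {'Mercy': ['Ben'], 'City': ['Ann']}  (Cal is left unmatched in this toy example)

The defining property of such a match is stability: no applicant and program that ranked each other would both prefer to abandon their assigned result in favor of each other.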

Graduate medical education programs (residencies) provide extensive, direct patient-care experiences in recognized medical specialties. Three-year residencies in family practice, emergency medicine, pediatrics, and internal medicine are typical. Several other specialties require one year of general practice followed by three to five years of advanced training. Participation in an accredited residency program and passing the USMLE Step 3 are requirements for licensure in most states.

See also: Medical College Admission Test.

BIBLIOGRAPHY

Association of American Medical Colleges. 1999. Medical School Admission Requirements: United States and Canada, 2000–2001, 50th edition. Washington, DC: Association of American Medical Colleges.

Crawford, Jane D. 1994. The Premedical Planning Guide, 3rd edition. Baltimore, MD: Williams and Wilkins.

INTERNET RESOURCES

Association of American Medical Colleges. 2000. "AAMC: Medical College Admission Test (MCAT)." <www.aamc.org/students/mcat/>.

Association of American Medical Colleges. 2000. "Getting into Medical School." <www.aamc.org/students/considering/gettingin.htm>.

National Resident Matching Program. 2000. "About the NRMP." <www.nrmp.org/about_nrmp>.

National Resident Matching Program. 2000. "About Residency." <www.nrmp.org/res_match/about_res>.

Juanita F. Buford


Buford, Juanita F. "Medical Education." Encyclopedia of Education. 2002. Encyclopedia.com. (August 24, 2016). http://www.encyclopedia.com/doc/1G2-3403200399.html


Medical Education


Schools. Prior to the Revolution, there were very few trained physicians in the American colonies, and what few there were had been educated in Europe, typically in Edinburgh, Scotland. Only Philadelphia and New York had medical schools, both established in the 1760s. The post-Revolutionary War period saw a significant increase in the number of medical schools affiliated with prestigious institutions. Harvard Medical School was established in 1782, Dartmouth in 1797, the University of Maryland at Baltimore in 1807, and Yale and Columbia in 1813. At the same time state licensing boards established standards for medical practice. The medical schools and state boards fought over the right to license physicians; Harvard, for example, claimed that its medical-school graduates did not have to take the state exams, and eventually the state agreed. By about 1820 most states allowed medical schools to license physicians, and Americans began to take pride in their medical schools, reasoning that there was no need to train doctors in Europe. Unfortunately, it would be many decades before that belief would be justified.

Quality of Education. Despite the increase in the number of medical schools, nine out of ten doctors still had no formal training. Even the formal training was meager: medical schools would grant a license after only one year of study beyond the bachelor's degree. Doctors mainly learned on the job. One particular problem in medical education was the difficulty of finding human cadavers, which would help prospective doctors learn human anatomy. The public had a deep revulsion to human dissection, and rarely would permission be given to dissect a body, or even to perform an autopsy. The only legal way to get human bodies was to wait for the execution of a criminal. Some medical students and doctors would rob graves to get corpses. There was even a club at Harvard called the Spunkers, whose members procured the corpses of derelicts and criminals.

New York Doctors' Riot. The most dramatic example of public disapproval of dissection was the Doctors' Riot of 1788. A group of boys climbed a ladder to look into a window at New York Hospital. They saw four men dissecting a corpse. The public was already furious over a series of grave robberies, and when news of the dissections got out, an angry mob formed to punish these suspected grave robbers. The rioters entered and ransacked the hospital, and would have lynched the doctors had they not escaped. The militia was called in to disperse the mob, and the riot ended, but the basic conflict between the medical profession's research needs and the individual's right to a safe burial remained unsettled.

Midwifery. Since medieval times, midwives, rather than male physicians, had helped women deliver babies. Midwives tended to be unschooled, yet many of them had years of experience, and some medical historians claim that women were better off with a midwife than with some theorizing doctor. A few doctors believed that midwifery and professional medicine could be brought closer together. William Shippen, who had been trained in obstetrics in London, gave private lectures in midwifery in Philadelphia in 1782. He admitted midwives to these lectures, an unusual practice at the time. Like Samuel Bard in New York, he believed that midwives could handle most births, while physicians could be called in for emergencies. With the fading of the moral taboo over men's involvement in childbirth, physicians began to displace the midwives, at least in the cities and for those who could afford it. But in the rural areas midwifery continued to flourish, as seen in the case of Martha Ballard, who from 1785 to 1815 kept a detailed diary of her life in rural Maine. She bore nine children herself and delivered 816 others. Midwives were often the equivalent of country doctors, using their practical knowledge, herbal medicines, and empathy to meet the health-care needs of their neighbors. Ballard knew how to manufacture salves, syrups, pills, teas, and ointments; how to prepare an oil emulsion (she called it an "oil a mulge"); how to poultice wounds, dress burns, and treat dysentery, sore throat, frostbite, measles, colic, hooping Cough, Chin cough, St. Vitas dance, flying pains, the salt rhume, and the itch; and how to cut an infant's tongue, administer a clister (enema), lance an abscessed breast, apply a blister or a back plaster, induce vomiting, assuage bleeding, reduce swelling, and relieve a toothache, as well as deliver babies. By any standard Ballard was a remarkable woman, as capable as any country doctor, although women as yet had never attained the status of physicians. It was not until 1849 that Elizabeth Blackwell became the first American woman to obtain an M.D. degree.

MEDICAL TRAINING AT HARVARD

Harvard Medical School got its start in 1783 when John Warren, a surgeon in the American Revolution, began giving lectures for the Boston Medical Society. These lectures were so popular that Harvard College (from which Warren had graduated in 1771) saw the opportunity to catch up with its rivals, the College of Philadelphia and King's College (now Columbia University), both of which had established medical schools.

When it was first established in 1783, the Medical Institution of Harvard College had little money, a few pieces of laboratory equipment, and a dingy classroom in the basement of Harvard Hall. Warren was appointed professor of anatomy and surgery, Benjamin Waterhouse (unpopular and pretentious, but the best-educated doctor in New England) professor of theory and practice of physic, and Aaron Dexter professor of chemistry. Students who could pay their fees were admitted; there were no entrance exams. Since some of the students could not write, the final qualifying exam was oral. To receive a bachelor of medicine degree, a student would attend college for two four-month terms and serve a six-month apprenticeship. Classroom lectures were delivered in the pompous, droning style favored by Waterhouse; students were given no opportunity to ask bothersome questions. Laboratory training was limited, and clinical experience could only be had several miles away at the pest house (the hospital where people with infectious diseases were quarantined) in Boston.

In 1810 the Harvard Medical School moved to Boston, and soon joined up with Massachusetts General Hospital, where students could get the clinical training they desperately needed. Yet even by 1860 there were still no entrance requirements, exams continued to be oral, and a student had to pass only five of the nine subject areas to graduate. After the Civil War, reforms instituted by Harvard president Charles William Eliot tightened entrance requirements and placed new emphasis on scientific discipline and academic rigor.

Source: John Langone, Harvard Med (New York: Crown, 1995).

Sources

Richard Harrison Shryock, Medicine and Society in America, 1660–1860 (New York: New York University Press, 1960);

Laurel Thatcher Ulrich, A Midwife's Tale (New York: Knopf, 1990).


"Medical Education." American Eras. 1997. Encyclopedia.com. (August 24, 2016). http://www.encyclopedia.com/doc/1G2-2536600873.html

"Medical Education." American Eras. 1997. Retrieved August 24, 2016 from Encyclopedia.com: http://www.encyclopedia.com/doc/1G2-2536600873.html
