Health care reform
HEALTH CARE. The term "health care system" refers to a country's system of delivering services for the prevention and treatment of disease and for the promotion of physical and mental well-being. Of particular interest in examining a health care system are how medical care is organized, financed, and delivered. The organization of care refers to such issues as who gives care (for example, primary care physicians, specialist physicians, nurses, and alternative practitioners) and whether they are practicing as individuals, in small groups, in large groups, or in massive corporate organizations. The financing of care involves who pays for medical services (for example, self-pay, private insurance, Medicare, or Medicaid) and how much money is spent on medical care. The delivery of care refers to how and where medical services are provided (for example, in hospitals, doctors' offices, or various types of outpatient clinics; and in rural, urban, or suburban locations).
Health care systems, like medical knowledge and medical practice, are not fixed but are continually evolving. In part, health care systems reflect the changing scientific and technologic nature of medical practice. For instance, the rise of modern surgery in the late nineteenth and early twentieth centuries helped create the modern hospital in the United States and helped lead to the concentration of so many medical and surgical services in hospital settings. However, the rise of "minimally invasive" surgery a century later contributed to the movement of many surgical procedures out of hospitals and into doctors' offices and other outpatient locations. A country's health care system also reflects in part the culture and values of that society. Thus, physicians in the United States, Canada, France, Germany, and Great Britain follow similar medical practices, but the health care systems of these nations vary considerably, reflecting the different cultural values and mores of those societies.
Traditional Medical Practice in America
For the first century of the republic, almost all physicians engaged in "general practice"—the provision of medical and surgical care for all diseases and for all patients, regardless of sex and age. Typically, doctors engaged in "solo practice," whereby they practiced by themselves without partners. Doctors' offices were typically at their homes or farms. Reflecting the rural makeup of the country, most physicians resided in rural settings. House calls were common. Payment was on a "fee-for-service" basis: doctors would give patients a bill, and patients would pay out of pocket.
Medicine at this time was not an easy way for an individual to earn a living. Many physicians could not be kept busy practicing medicine, and it was common for doctors to have a second business like a farm, general store, or pharmacy. Physician income, on average, was not high, and doctors often received payment in kind—a chicken or box of fruit rather than money. Doctors also experienced vigorous competition for patients from a variety of alternative or lay healers like Thomsonians, homeopaths, and faith healers.
In the last quarter of the nineteenth century and first quarter of the twentieth century, fueled by the revolution in medical science (particularly the rise of bacteriology and modern surgery), the technologic capacity and cultural authority of physicians in the United States began to escalate. Competition for patients from alternative healers diminished, and most Americans came to think first of consulting a doctor when they needed medical services. The location of care moved to doctors' offices for routine illnesses and to hospitals for surgery, childbirth, and major medical problems. Indeed, the hospital came to be considered the "doctor's workshop." In 1875, there were 661 hospitals in the United States containing in aggregate about 30,000 beds. By 1930, the number of acute care hospitals had increased to around 7,000, and together they contained about one million beds. Since most hospitals were concentrated in cities and large towns, where larger pools of patients could be found, doctors too increasingly settled in metropolitan areas. In the 1920s, the U.S. population was still 50 percent rural, but already 80 percent of physicians resided in cities or large towns.
Before World War II (1939–1945), about 75 to 80 percent of doctors continued to engage in general practice. However, specialty medicine was already becoming prominent. Residency programs in the clinical specialties had been created, and by 1940 formal certifying boards in the major clinical specialties had been established. Decade by decade, fueled by the growing results of scientific research and the resultant transformation of medical practice—antibiotics, hormones, vitamins, antiseizure medications, safer childbirth, and many effective new drugs and operations—the cultural authority of doctors continued to grow. By 1940, competition to "regular medicine" from alternative healers had markedly slackened, and the average U.S. physician earned 2½ times the income of the average worker. (Some medical specialists earned much more.) Most physicians continued in solo, fee-for-service practice, and health care was not yet considered a fundamental right. As one manifestation of this phenomenon, a "two-tiered" system of health care officially existed—private rooms in hospitals for paying patients, and large wards for indigent patients where as many as thirty or forty "charity" patients would be housed together in one wide open room. In many hospitals and clinics, particularly in the South, hospital wards were segregated by race.

Specialization in Medicine

Certifying Board                              Year Established
American Board of Ophthalmology               1916
American Board of Pediatrics                  1933
American Board of Radiology                   1934
American Board of Psychiatry and Neurology    1934
American Board of Orthopedic Surgery          1934
American Board of Colon and Rectal Surgery    1934
American Board of Urology                     1935
American Board of Pathology                   1936
American Board of Internal Medicine           1936
American Board of Anesthesiology              1937
American Board of Plastic Surgery             1937
American Board of Surgery                     1937
American Board of Neurological Surgery        1940
The Transformation of Health Care, 1945–1985
The four decades following World War II witnessed even more extraordinary advances in the ability of medical care to prevent and relieve suffering. Powerful diagnostic tools were developed, such as automated chemistry analyzers, radioimmunoassays, computerized tomography, and nuclear magnetic resonance imaging. New vaccines, most notably the polio vaccine, were developed. Equally impressive therapeutic procedures came into use, such as newer and more powerful antibiotics, antihypertensive drugs, corticosteroids, immunosuppressants, kidney dialysis machines, mechanical ventilators, hip replacements, open-heart surgery, and a variety of organ transplantations. In 1900, average life expectancy in the United States was forty-seven years, and the major causes of death each year were various infections. By midcentury, chronic diseases such as cancer, stroke, and heart attacks had replaced infections as the major causes of death, and by the end of the century life expectancy in the United States had increased about 30 years from that of 1900. Most Americans now faced the problem of helping their parents or grandparents cope with Alzheimer's disease or cancer rather than that of standing by helplessly watching their children suffocate to death from diphtheria.
These exceptional scientific accomplishments, together with the development of the civil rights movement after World War II, resulted in profound changes in the country's health care delivery system. Before the war, most American physicians were still general practitioners; by 1960, 85 to 90 percent of medical graduates were choosing careers in specialty or subspecialty medicine. Fewer and fewer doctors were engaged in solo practice; instead, physicians increasingly began to practice in groups with other physicians. The egalitarian spirit of post–World War II society resulted in the new view that health care was a fundamental right of all citizens, not merely a privilege. This new view was given practical force by the rise of "third-party payers," which brought more and more Americans into the health care system. In the 1940s, 1950s, and 1960s, private medical insurance companies like Blue Cross/Blue Shield began providing health care insurance to millions of middle-class citizens. In 1965, the enactment of the landmark Medicare (a federal program for individuals over 65) and Medicaid (joint federal and state programs for the poor) legislation extended health care coverage to millions of additional Americans. Medicare and Medicaid also brought to an end the era of segregation at U.S. hospitals, for institutions with segregated wards were ineligible to receive federal payments. Third-party payers of this era continued to reimburse physicians and hospitals on a fee-for-service basis. For providers of medical care, this meant unprecedented financial prosperity and minimal interference by payers in medical decision-making.
Despite these accomplishments, however, the health care system was under increasing stress. Tens of millions of Americans still did not have access to health care. (When President Bill Clinton assumed office in 1993, the number of uninsured Americans was estimated at 40 million. When he left office in 2001, that number had climbed to around 48 million.) Many patients and health policy experts complained of the fragmentation of services that resulted from increasing specialization; others argued that there was an overemphasis on disease treatment and a relative neglect of disease prevention and health promotion. The increasingly complicated U.S. health care system became inundated with paperwork and "red tape," which was estimated to be two to four times as much as in other Western industrialized nations. And the scientific and technological advances of medicine created a host of unprecedented ethical issues: the meaning of life and death; when and how to turn off an artificial life-support device; how to preserve patient autonomy and to obtain proper informed consent for clinical care or research trials.
To most observers, however, the most critical problem of the health care system was soaring costs. In the fifteen years following the passage of Medicare and Medicaid, expenditures on health care in dollars increased nearly sixfold, and health care costs rose from 6 percent to 9 percent of the country's gross domestic product (GDP).

U.S. Health Care Costs

Year    Dollars               Percentage of GDP
1950    $12.7 billion         4.5 percent
1965    $40 billion (est.)    6 percent
1980    $230 billion          9 percent
2000    $1.2 trillion         14 percent

Lee Iacocca, while president of Chrysler in the late 1970s, stunned many Americans by pointing out that U.S. automobile companies were spending more per car on health premiums for workers than for the steel that went into the automobiles. Public opinion polls of the early 1980s revealed that 60 percent of the population worried about health care costs, compared with only 10 percent who worried about the quality of care. Millions of Americans became unwillingly tied to their employers, unable to switch to a better job because of the loss of health care benefits if they did so. Employers found their competitiveness in the global market to be compromised, for they were competing with foreign companies that paid far less for employee health insurance than they did. In the era of the soaring federal budget deficits of the Reagan administration, these problems seemed even more insurmountable.
The Managed Care Era, 1985–Present
In the mid-1980s, soaring medical care costs, coupled with the inability of federal regulations and the medical profession on its own to achieve any meaningful cost control, led to the business-imposed approach of "managed care." "Managed care" is a generic term that refers to a large variety of reimbursement plans in which third-party payers attempt to control costs by limiting the utilization of medical services, in contrast to the "hands off" style of traditional fee-for-service payment. Examples of such cost-saving strategies include the requirement that physicians prescribe drugs only from a plan's approved formulary, mandated preauthorizations before hospitalization or surgery, severe restrictions on the length of time a patient may remain in the hospital, and the requirement that patients see specialists only on referral from a "gatekeeper." Ironically, the first health maintenance organization, Kaiser Permanente, had been organized in the 1930s to achieve better coordination and continuity of care and to emphasize preventive medical services. Any cost savings that were achieved were considered a secondary benefit. By the 1980s, however, the attempt to control costs had become the dominant force underlying the managed care movement.
Unquestionably, the managed care movement has brought much good. It has forced the medical profession for the first time to think seriously about costs; it has encouraged greater attention to patients as consumers (for example, better parking and more palatable hospital food); and it has stimulated the use of modern information technologies and business practices in the U.S. health care system. In addition, the managed care movement has encouraged physicians to move many treatments and procedures from hospitals to less costly ambulatory settings, when that can be done safely.
However, there have been serious drawbacks to managed care that in the view of many observers have outweighed its accomplishments. Managed care has not kept its promise of controlling health care costs, and in the early years of President George W. Bush's administration, the country once again faced double-digit health care inflation. In the view of many, the emphasis on cost containment has come at the price of eroding the quality of care, and the dollar-dominated medical marketplace has been highly injurious to medical education, medical schools, and teaching hospitals. Managed care has also resulted in a serious loss of trust in doctors and the health care system—creating a widespread fear that doctors might be acting as "double agents," allegedly serving patients but in fact refusing them needed tests and procedures in order to save money for the employing organization or insurance company. As a result, the twenty-first century has opened with a significant public backlash against managed care and a vociferous "patients' rights movement."
Ironically, many of the perceived abuses of managed care have less to do with the principles of managed care than with the presence of the profit motive in investor-owned managed care organizations. Nonprofit managed care organizations, such as Kaiser Permanente, retain about 5 percent of the health premiums they receive for administrative and capital expenses and use the remaining 95 percent to provide health care for enrollees. For-profit managed care companies, in contrast, seek to minimize what they call the "medical loss"—the portion of the health care premium that is actually used for health care. Instead of spending 95 percent of their premiums on health care (a "medical loss" of 95 percent), they spend only 80, 70, or even 60 percent of the premiums on health services, retaining the rest for the financial benefit of executives and investors. Some astute observers of the U.S. health care system consider the for-profit motive in the delivery of medical services—rather than managed care per se—the more serious problem. However, since 90 percent of managed care organizations are investor-owned companies, the for-profit problem is highly significant.
The U.S. health care system has three primary goals: the provision of high-quality care, ready access to the system, and affordable costs. The practical problem in health care policy is that the pursuit of any two of these goals aggravates the third. Thus, a more accessible system of high-quality care will tend to lead to higher costs, while a low-cost system available to everyone is likely to be achieved at the price of diminishing quality.
Certain causes of health care inflation are desirable and inevitable: an aging population and the development of new drugs and technologies. However, other causes of soaring health care costs are clearly less defensible. These include the high administrative costs of the U.S. health care system, a litigious culture that results in the high price of "defensive medicine," a profligate American practice style in which many doctors often perform unnecessary tests and procedures, the inflationary consequences of having a "third party" pay the bill (thereby removing incentives from both doctors and patients to conserve dollars), and the existence of for-profit managed care organizations and hospital chains that each year divert billions of dollars of health care premiums away from medical care and into private wealth. Clearly, there is much room to operate a more efficient, responsible health care delivery system in the United States at a more affordable price.
Yet the wiser and more efficient use of resources is only one challenge to our country's health care system. In the twenty-first century, the country will still face the problem of limited resources and seemingly limitless demand. At some point hard decisions will have to be made about what services will and will not be paid for. Any efforts at cost containment must continue to be appropriately balanced with efforts to maintain high quality and patient advocacy in medical care. Better access to the system must also be provided. Medical insurance alone will not solve the health problems of a poor urban community where there are no hospitals, doctors, clinics, or pharmacies. Lastly, the American public must be wise and courageous enough to maintain realistic expectations of medicine. This can be done by recognizing the broad determinants of health like good education and meaningful employment opportunities, avoiding the "medicalization" of social ills like crime and drug addiction, and recognizing that individuals must assume responsibility for their own health by choosing a healthy lifestyle. Only when all these issues are satisfactorily taken into account will the United States have a health care delivery system that matches the promise of what medical science and practice have to offer.
"Health Care." Dictionary of American History. 2003. Encyclopedia.com. (August 29, 2016). http://www.encyclopedia.com/doc/1G2-3401801871.html
A specialist in geriatrics, Dr. Risa Lavizzo-Mourey has made significant contributions to health care policy in the United States. In addition to maintaining a clinical practice and a teaching career, she has served on numerous committees and has advised the federal government on health care reform. In 2003 she assumed leadership of the country's largest health care philanthropy, the Robert Wood Johnson Foundation, which distributes more than $500 million annually to agencies and programs that focus on public health issues. In 2003 Modern Physician named Lavizzo-Mourey one of 25 "visionary doctors…who rattle the status quo by flexing their experience and reputations in a variety of disciplines—politics, quality, information technology, public health, philanthropy and business."
Parents Sparked Interest in Medical Career
Lavizzo-Mourey, who grew up in Seattle, credits her parents with nurturing her interest in becoming a doctor. "I was blessed to have two parents who were physicians," she told Chronicle of Philanthropy writer Domenica Marchetti, "so I grew up seeing what an incredible opportunity it is to be a physician." She also learned that there is a "tremendous need" for health care among underserved populations, including uninsured and low-income groups. "I saw that very clearly in my parents' practice," she recalled to Marchetti, "when they were practicing in fairly poor neighborhoods in the time before Medicare."
After one year at the University of Washington, Lavizzo-Mourey attended the State University of New York at Stony Brook. She was admitted to Harvard Medical School after completing her junior year of college, and received her M.D. in 1979. After completing her internship and residency in internal medicine at Brigham and Women's Hospital in Boston, she did additional training in geriatrics at the University of Pennsylvania School of Medicine. There she also completed postgraduate research as a Robert Wood Johnson Clinical Scholar. She joined the university's medical school faculty in 1986. While pursuing an ambitious academic career, Lavizzo-Mourey also earned an M.B.A. in health care administration from the University of Pennsylvania's Wharton School.
Lavizzo-Mourey held several distinguished positions at the University of Pennsylvania. She began as an assistant professor, rose to associate professor, and was later named the Sylvan Eisman Professor of Medicine and Health Care Systems. She also served as director of the university's Institute on Aging, and was chief of geriatric medicine at the medical school.
Advised Government on Health Care Policies
In 1992 Lavizzo-Mourey took a leave of absence from the University of Pennsylvania to join the federal Agency for Health Care Policy and Research (now the Agency for Healthcare Research and Quality), where she served as deputy administrator until 1994. She also served on other federal advisory committees, including the Task Force on Aging Research, the National Committee for Vital and Health Statistics, and the Institute of Medicine's Panel on Disease and Disability Prevention among Older Adults. As a member of the White House Task Force on Health Care Reform, Lavizzo-Mourey chaired the working group on Quality of Care.
Among her particular concerns is the delivery of quality medical care to minority populations. In 2002 she coauthored an Institute of Medicine report, "Unequal Treatment: Confronting Racial and Ethnic Disparities in Health Care," that found that minorities are likely to receive lower-quality medical care than whites, regardless of income. In fact, evidence showed that this disparity persisted even when medical insurance, age, and the extent of disease were comparable across racial and ethnic groups. According to Harvard Public Health Now, the report showed that "evidence suggests that bias, prejudice, and stereotyping by health care providers may contribute to differences in care."
Indeed, Lavizzo-Mourey herself had been treated dismissively when she brought her daughter, then aged two, to a hospital emergency room in Philadelphia. Though the child had symptoms that Lavizzo-Mourey knew to be serious, the resident physician who examined her said that the child was not ill. Only after Lavizzo-Mourey asserted her own medical training and demanded more tests did it become evident that the child had pneumonia. Such treatment, Lavizzo-Mourey stated in remarks quoted in Health Leaders, "is really very troubling" and must be addressed.
Became Head of Nation's Largest Health Philanthropy
In 2001 Lavizzo-Mourey became senior vice president of the Robert Wood Johnson Foundation. This organization had funded her earlier postgraduate research and, according to a Health Leaders article, had watched her career with increasing respect. As vice president, Lavizzo-Mourey took charge of the foundation's grant making in health care, much of which focused on treating and preventing substance abuse and improving care for patients with chronic illnesses. Upon the retirement of foundation head Steven Schroeder, Lavizzo-Mourey became president and chief executive officer of the foundation in January of 2003. She is the first woman to hold this position.
In an interview in the Journal of the American Medical Association (JAMA), Lavizzo-Mourey noted that she has been interested in the "interface between public policy, clinical medicine, and business" since her days as a medical student. The expertise in this area that she gained while earning her Wharton School M.B.A. in health care administration, she observed, positioned her to become an effective consultant on public health policy. As an advisor to both the first Bush administration and the Clinton administration, she learned how to create effective cooperation across different agencies and how to set realistic expectations.
Lavizzo-Mourey told JAMA that the Robert Wood Johnson Foundation would continue to focus on its four "bedrock goals—ensuring access to quality care; improving the quality of care and support of people with chronic health conditions; reducing the harm caused by substance abuse; and promoting healthy communities and lifestyles." She outlined a plan to distribute funds through a portfolio system that would allow the foundation to evaluate programs more effectively, and also noted that the foundation's efforts to improve public health systems would help to strengthen their ability to respond to possible biological or chemical weapons attacks.
At a Glance …
Born on September 25, 1954, in Seattle, WA; daughter of Philip V. Lavizzo and Blanche Sellers Lavizzo, both physicians; married Robert J. Mourey, a physician, June 21, 1975; children: Rel, Max. Education: University of Washington, 1972-73; State University of New York-Stony Brook, 1973-75; Harvard Medical School, MD, 1979; Wharton School, University of Pennsylvania, MBA, 1986.
Career: Brigham and Women's Hospital, Boston, MA, medical resident, 1979-82; Temple University Medical School, Philadelphia, PA, clinical instructor, 1982-84; University of Pennsylvania School of Medicine, assistant professor of medicine, 1986-92, associate professor, 1992-97, Sylvan Eisman Professor of Medicine, 1997-2001, director of the Institute of Aging, chief of Division of Geriatric Medicine, 1984-1992, 1994-2001; Philadelphia Veterans Administration Medical Center, associate chief of staff; Agency for Health Care Policy and Research, Rockville, MD, deputy administrator, 1992-94; Robert Wood Johnson Foundation, Princeton, NJ, senior vice president and director, Health Care Group, 2001-2002, president and chief executive officer, 2003–.
Memberships: Association of Academic Minority Physicians; National Medical Association.
Awards: University of Pennsylvania, Class of 1970 Term Professor, 1992; American College of Physicians, fellowship; American Geriatric Society, fellowship; Alonzo Smythe Yerby Award, 2002.
Addresses: Office— P.O. Box 2316, College Road East and Route 1, Princeton, NJ 08543-2316.
While continuing with the foundation's basic work, Lavizzo-Mourey also hopes to expand efforts in several areas, including programs to meet needs of elderly patients; programs to address obesity; and measures to eliminate unequal treatment due to race or ethnicity. Her vision, she told JAMA, is that "everyone in this country has access to safe, effective, equitable health care when they need it, and that everyone gets a good start in life with a nurturing relationship that protects them from harm, including things like tobacco, alcohol, and drugs. That everyone has an opportunity for lifelong vitality, an opportunity for treatment if they are addicted, and that we promote a caring society and we keep our attention focused on the possible."
Chronicle of Philanthropy, August 8, 2002.
Journal of the American Medical Association (JAMA), April 16, 2003, pp. 1909-1911.
Modern Healthcare, August 5, 2002, p. 33.
Modern Physician, May 1, 2003, p. 26.
"Profile: Giving Spirit," Health Leaders, www.healthleaders.com (September 14, 2004).
"Risa Lavizzo-Mourey Named Head of the Robert Wood Johnson Foundation," Wharton Health Care Management Alumni, www.whartonhealthcare.org (September 14, 2004).
"Risa Lavizzo-Mourey: President and CEO," Robert Wood Johnson Foundation, www.rwjf.org (September 14, 2004).
"Lavizzo-Mourey Receives 2002 Yerby Award," Harvard Public Health Now, www.hsph.harvard.edu (September 14, 2004).
Shostak, E.. "Lavizzo-Mourey, Risa." Contemporary Black Biography. 2005. Encyclopedia.com. (August 29, 2016). http://www.encyclopedia.com/doc/1G2-3431400044.html