Higher Education in the United States
John R. Thelin
Jason R. Edwards
Joseph B. Berger
Maria Vita Calkins
At the start of the twenty-first century, higher education in the United States stands as a formidable enterprise. As an established "knowledge industry," it represents about 3 percent of the gross national product. Virtually every governor and legislature across the nation invokes colleges and universities as critical to a state's economic and cultural development. The enterprise's profile includes more than 4,000 accredited institutions that enroll over fifteen million students and confer in excess of two million degrees annually. Colleges and universities spend about $26 billion per year on research and development, of which $16 billion comes from federal agencies. The research universities' ability to attract expertise is recognized internationally.
This success story of growth and expansion began more than 300 years ago, before the United States existed. Beginning in the seventeenth century, the idea of an American higher education grew to fruition over the ensuing centuries. Differences developed with each new era of collegiate growth, but the story has remained one of expanding access.
Imperial governments usually invested little in colonial colleges. The typical mercantile approach emphasized the exportation of agricultural products and raw materials from the provinces to the homeland. The British Empire, for instance, responded to Virginia's request for a seminary to save colonists' souls with the retort, "Souls?!? Damn your souls! Make tobacco." Despite such hostility, the American colonies generally enjoyed greater independence than the typical British territory. While a succession of kings and queens encouraged the cultivation and exportation of tobacco, rice, indigo, and cotton, colleges also flourished as an unlikely crop in America.
The colonists created institutions for higher education for several reasons. New England settlers included many alumni of the royally chartered British universities, Cambridge and Oxford, and therefore believed education was essential. In addition, the Puritans emphasized a learned clergy and an educated civil leadership. Their outlook generated Harvard College in 1636. Between Harvard's founding and the start of the American Revolution, the colonists chartered nine colleges and seminaries, although only one in the South.
Religion provided an impetus for the creation of colonial colleges. As the First Great Awakening of the 1730s to 1770s initiated growth in a wider variety of Protestant churches, each denomination often desired its own seminary. Furthermore, each colony tended to favor a particular denomination, and so the new colleges took on an importance for regional development as well. Presbyterians in New Jersey founded the College of New Jersey (later renamed Princeton). The College of William and Mary in Virginia maintained a strong Anglican orientation, reflecting that colony's settlement by landed gentry from England. The Baptists, who had been expelled from Massachusetts Bay Colony and settled in Rhode Island, established their own college but in an unusual move did not require religious tests for admission. Other dissenting religious groups, such as the Methodists and Quakers, became enthusiastic college builders after facing hostility in many colonies.
Small in size and limited in scope, colonial colleges rarely enrolled more than one hundred students, few of whom completed their degrees. Yet the young men who attended these colonial colleges made historic and extraordinary contributions to both political thought and action. Also, colleges represented one of the few institutional ventures to receive royal and/or colonial government support and regulation during the eighteenth century. The college's multipurpose buildings were typically among the largest construction projects in the colony, matched only by a major church or a capitol.
Though colonial colleges were frontier institutions that expanded access to higher education, by contemporary standards the colonial period remained elite and exclusionary. Only white Christian males were allowed to matriculate. Women and African-Americans were denied participation by statute and custom, but colleges did serve Native Americans in a missionary capacity. The evangelism of the Protestant groups attracted donors, although in time the colonial colleges' devotion to such educational plans waned. To keep receiving financial support, however, the colleges argued that educating young Christian men would produce missionaries to preach Christianity to Native Americans.
Despite their limitations, the colonial colleges effectively educated a literate, articulate, and responsible American elite. Even though college education was not crucial for the professional and career advancement of sons of prosperous merchants and wealthy planters, the college alumni were disproportionately influential in politics and national affairs. Not only did they lead by action in revolutionary proclamations, but they followed through as military and political leaders. Due in part to a collegiate curriculum that drew from the advanced writings of Scottish Enlightenment thinkers in political economy, the colonial college alumni designed a system of government destined to serve as a model for the world. The colonial colleges' legacy, then, was producing a generation of American leaders and thinkers whose combination of decisiveness and thoughtfulness literally turned the world "upside down."
Higher Education in the New United States
With the founding of the United States of America, governmental policies toward English-chartered colleges became unclear. Wary of centralized power, Americans maintained educational control close to home. Therefore, governance of colonial colleges became almost exclusively the jurisdiction of local and state governments. In actuality, the schools enjoyed independence, as the Supreme Court's famous Dartmouth College decision of 1819 demonstrated that the new federal government would protect colleges from state intervention.
With the reputation of colleges remaining high, most state legislatures, particularly in the newer states west of the Allegheny and Appalachian mountain ranges, looked favorably on chartering colleges as long as the state did not have to provide financial support. Between 1800 and 1850, the United States experienced a "college building boom" in which more than two hundred degree-granting institutions were created. However, since most of these new colleges depended on student tuition payments and local donors, there was also a high closure rate and the schools that did survive typically struggled from year to year.
Although the classical languages and liberal studies of the bachelor of arts degree remained central to the character of American higher education in this era, several new fields gained a foothold in formal study. Engineering and science acquired a presence on the campus. Professional education for law and medicine also took place, though usually in separate institutions. Nevertheless, few if any learned professions in the early nineteenth century required academic degrees or certification. Most states reserved the right to set requirements for professional practice, and these were for the most part meager.
Going to college early in the nineteenth century was not particularly expensive. The cost of potential lost opportunities presented a greater concern for students and parents. Employers seldom required college degrees; therefore, college presidents faced the perpetual challenge of persuading young adults to delay pursuing their life's enterprises by spending four years on campus. Modest-income families decided whether or not a young man's potential contribution to family labor could be spared while he pursued higher education. The college experience and the college degree did confer prestige and often some professional advantages, but their perceived benefits did not always outweigh the costs.
The United States, for all its deserved acclaim of being a truly "new nation," remained faithful to many of the tenets of English common law. For example, in New England states, the small farms and principle of primogeniture forbade the division of a father's land among numerous sons, so families had to find useful work for those sons not inheriting land. Going to college provided an attractive alternative, especially in subsistence-farming regions. Affordable colleges in rural New England provided an important route to respect and employment in schools and churches. In fact, with the onset of the Second Great Awakening in the early nineteenth century, new denominations once again pushed for clergy educated in institutions dedicated to their particulars of faith. Their "missionary zeal" led to the founding of new schools and an increase in college attendance. In short, college became useful not just for the elite, but also for sons who had fewer prospects in the new nation. This development resulted in a host of small liberal arts colleges in the Northeast and later in Ohio, Kentucky, and Tennessee that served as an important incubator for a growing middle class.
Educational opportunities for young women followed a comparable pattern. Families often wondered how a young single woman could be self-supporting or contribute to the family welfare. A growing national demand for trained teachers due to the "common-school movement" of the 1830s provided one answer. Women could achieve financial independence and respectability within a rather rigid social structure by attending a normal school or female seminary that provided them with an education for employment as teachers in the ever-expanding nation.
The mid-nineteenth century. Variety and growth characterized college building during the mid-nineteenth century. In addition to the conspicuous church-related liberal arts colleges, various groups founded a range of other special interest institutions for advanced study. These included agricultural colleges, proprietary medical schools, freestanding law schools, engineering schools, and scientific colleges. Private philanthropy indicated a growing American interest in founding new institutions concentrated on advanced scientific, technical, and engineering education. Illustrative of this realm was the generous support for such colleges as Rensselaer, Drexel, Cooper Union, and the Massachusetts Institute of Technology.
One of the biggest shifts was the federal government's direct involvement in higher education, which began during the Civil War, when the southern congressmen who had opposed such legislation were absent. The Morrill Act of 1862 set in motion an elaborate program whereby states received profits from the sale of an allotted portion of western lands if used to establish programs of agricultural, mechanical, and military sciences, along with liberal arts. The so-called land-grant act thereby stimulated numerous creative proposals and projects. In some cases, states attached their new engineering or agricultural programs to historic colleges. In others, they opted to create new state colleges. Between 1887 and 1914, the land-grant colleges gained support and collective political strength and expanded the definition and scope of university curricula. Legislation such as the Hatch Act and the "Second Morrill Act" of 1890 continued the expansion of federal involvement in education by bringing federal funding and projects to the new land-grant campuses.
Amidst this flurry of federal legislation, African Americans also received attention, though the treatment tended to have mixed results. On the one hand, the Morrill Act of 1890 provided funding for African-American education, which led to the creation of Negro colleges in seventeen southern states, a substantial gain in educational opportunities. On the other hand, the guidelines meant that the U.S. government accepted and endorsed state and local practices of racial segregation. By increasing its role in funding higher education, the federal government helped shift the focus of many American colleges.
Higher education's gilded age: 1870 to 1910. Between 1870 and 1910 nearly all institutions of higher education enjoyed a surge in appeal both to prospective students and to benefactors. Some historians have called this period the "Age of the University." Although accurate, the image remains incomplete. The university ideal certainly took root and blossomed during this period, but the historic undergraduate college also enjoyed growth, support, and popularity. Because of an unprecedented era of commercial and industrial expansion, a new period of philanthropy made possible the founding of well-endowed universities. One enduring sign of this growth came in 1900 when the presidents of fourteen institutions created the Association of American Universities. Its charter members included Johns Hopkins, Columbia, Harvard, Cornell, Yale, Clark, Catholic University, Princeton, Stanford, and the Universities of Chicago, Pennsylvania, California, Michigan, and Wisconsin. Gradually, over the next decades, relatively young state universities in the Midwest, along with private institutions such as Brown, Northwestern, Massachusetts Institute of Technology, and Vanderbilt, would also gain recognition and "university" status through their acceptance into the Association of American Universities.
The creation of the Association of American Universities reinvigorated an ongoing and intense debate over the proper definition and role of a modern American university. Nevertheless, without any official consensus, some general patterns of practice and aspiration stood out. The new modern university emphasized graduate programs, including the study for and conferral of the doctor of philosophy degree or Ph.D. In fact, the proliferation of varied degree programs connected with professions illustrated a new era in higher education. Many undergraduate programs in agriculture, engineering, business, education, and home economics, along with military training, challenged the old definition of collegiate studies. Medicine, law, and theology, three traditional professions, developed varying relationships with universities and academic standards.
A lack of national academic standards, especially among secondary schools, colleges, and universities, gave rise to the entrance of private agencies into the higher education arena. Such organizations as the Carnegie Foundation for the Advancement of Teaching and the Rockefeller General Education Board adjudicated ratings among American universities. The foundation directors used a combination of coercion and incentives to prompt universities, including professional schools, to adhere to reasonable criteria of admissions, instruction, and certification. On balance, the foundations probably acknowledged and promoted those universities that were already reasonably strong and sound, and raised the floor for others.
Much to the chagrin of "serious scholars," students shaped the undergraduate world according to their own preferences. It was in the elaborate extracurricular experiences of intercollegiate sports, campus newspapers, collegiate drama, literary societies, alumni groups, and fraternities that students reveled. Student (and public) enthusiasm for these activities grew as the popular media glamorized the social activities rather than scholarly pursuits.
Although the new structure and ethos of the "university" gained attention for its innovation, equally important was the support for and interest in smaller liberal arts colleges. This rising tide for colleges included an extended boom for the founding of women's colleges. Mount Holyoke Seminary in western Massachusetts transformed itself into a bachelor's degree-granting institution. Other prominent women's colleges founded in this era were Smith, Wellesley, Radcliffe, Pembroke, Barnard, and Bryn Mawr. Women also gained access via new coeducational institutions such as the University of Chicago, Stanford University, and many state colleges and universities in the Midwest and the West.
Higher education between the world wars. Between 1914 and 1918 the American campus displayed some flexibility to accommodate special programs for the domestic effort during World War I. It included special training programs for military personnel and sporadic but important instances of faculty research leading to direct inventions and innovations in warfare. Projects such as future Harvard president James B. Conant's efforts to develop mustard gas foreshadowed even greater cooperation between the universities and federal government during World War II.
College enrollments and public enthusiasm surged after World War I. One indicator of this popularity was the proliferation of huge football stadiums, most of which were named "Memorial Field." Crowds exceeding 50,000 at campus games became standard at many universities. Although popular since the 1890s, intercollegiate athletics soared in commercial appeal during the 1920s. The absence of any substantive national voluntary self-regulation led the Carnegie Foundation for the Advancement of Teaching to publish a highly visible exposé of college sports' excesses in 1929. Some university officials denied the report's findings, but the Carnegie Study was timely and accurate. The abuses in college sports underscored what Abraham Flexner of the Carnegie Foundation identified as the root source of problems in American higher education: a lack of consensus on clarity of mission and purpose. Unfortunately for Flexner and his colleagues, too many colleges and their constituencies were well served by the amorphous, unregulated nature of American higher education. What was intended as a marketplace of ideas became simply a marketplace, in which students were consumers and sports was the best-seller.
The onset of the Great Depression illustrated an interesting phenomenon: college enrollments increased during times of national financial hardship. While institutions reduced budgets, many worked to sustain American colleges in lean years. Some universities also demonstrated resourcefulness in seeking out business and industrial projects for their faculty in such fields as engineering and physics. These initiatives by such schools as Stanford, Massachusetts Institute of Technology, and California Institute of Technology laid the groundwork for external projects sponsored by both the private sector and the federal government that would come to fruition in the 1940s.
Higher education's golden age: 1945 to 1970. Between 1941 and 1945 American colleges and universities participated directly and effectively in a complex national war effort. This track record in times of duress brought long-term rewards and readjustments after the war. In 1947, the President's Commission on Higher Education in a Democracy concluded that federal funding of research should continue even in peacetime. In response to the "problem" of returning military personnel to the domestic economy and as a measure of gratitude, Congress passed the Servicemen's Readjustment Act (1944), popularly known as the "G.I. Bill." For at least a temporary period, this generous and flexible financial aid program enabled an unprecedented number of veterans to attend colleges, universities, and an array of "postsecondary" institutions. This legislation also gave energy to civil rights cases linked with educational access.
In addition to federal funding, growing states with enthusiastic governors and legislatures sought ways to work with their state's educational leaders to accommodate an impending enrollment boom. The rising birth rate and increased migration into selected states, along with a deliberate extension of college admissions, caused this dramatic growth. California led the way in statewide coordination with its Master Plan of 1960. This program aimed at accommodating mass access to affordable higher education by channeling students into tiered institutions.
Among the most conspicuous transformations was the emergence of a network of public junior colleges. Founded in the early 1900s, junior colleges experienced expansion in California during the 1930s. After World War II these institutions carried out two critical functions in mass postsecondary education. First, they developed a "transfer function" in which students could enter colleges or universities after two years of course work at the junior college. They also offered advanced, terminal degree instruction and certification in a range of professional and occupational fields. By the 1960s, the addition of a third function–readily accessible, low-priced continuing education for adults–led to a change in the name from junior college to community college.
The federal government participated in the expansion of sponsored research and development education during the 1950s and 1960s. Drawing from Vannevar Bush's 1945 report, Science: The Endless Frontier, Congress and a succession of U.S. presidents endorsed federal sponsorship of high-level, peer-reviewed national research projects. Federal agencies that became most involved were those requiring applied technical research, specifically defense and agriculture. The behavioral sciences gradually adopted this model for large-scale psychological testing, and then various health care programs also sought funding. Agencies such as the National Institutes of Health possessed a limited scope and a minuscule budget in the late 1940s, but acquired an increasing presence over the next four decades. In 1963, Clark Kerr's work The Uses of the University summarized this culmination of government patronage in research and development. According to Kerr, about fifty to one hundred institutions had positioned themselves to be "Federal Grant Universities": powerful incubators of advanced scholarship in the sciences possessing the ability to inspire confidence and funding in their research grant applications.
Both public and private universities benefited from governmental concerns about "cold war" defense and competition with the Soviet Union. Fears resulting from an extended definition of "national defense" led to funding for advanced studies in foreign languages, anthropology, and political science as well as the "hard" sciences of physics and chemistry. The transfer of these national programs to higher education institutions increased both the founding of new campuses and the construction of new buildings on older campuses. According to one study in 1986, about 75 percent of American campus buildings were constructed between 1960 and 1985, suggesting that the symbol for higher education during the cold war ought to be the building crane.
Enrollment also surged during the cold war era. Just prior to World War II, the state universities with the largest enrollments, namely the Ohio State University and the University of California at Berkeley, stood far ahead of other institutions with enrollments of around 19,000 each; most major state universities at that time enrolled between 3,000 and 6,000 students. By 1970, however, the Ohio State University's main campus at Columbus enrolled more than 50,000, comparable to the University of Minnesota, and the University of California had expanded its Berkeley campus enrollment to 26,000.
Some states responded to increasing enrollments with complex, multicampus systems. The University of California, for example, had ten campuses, with a total enrollment exceeding 150,000. At the same time, the network of California state colleges formed its own system, eventually enrolling about two hundred thousand students as well. In addition, California's community colleges further expanded the accessibility to higher education by forming more than one hundred campuses. In New York, education officials and legislators created an expanded system of more than sixty campuses, the State University of New York (SUNY). While individual states pursued some variation of this theme, public community college systems enjoyed the greatest gains in student enrollments and campus expansion. Especially in such populous states as California, Texas, and Florida, the community college systems served a larger and expanding portion of the state's population. Although relative enrollment in private (independent) colleges decreased from approximately 50 percent of college students in 1950 to about 30 percent, this change did not preclude substantial numerical growth. Rather, the construction of new institutions in the public sector was exceptionally brisk.
Prior to the 1970s, the federal government did not venture much into substantial student financial aid programs. Rather, state and local policies produced low tuition rates at public institutions. However, with the passage of the Education Amendments of 1972, the federal government increasingly promoted college access, affordability, and choice. The showcase of this government interest was a commitment to need-based, portable student financial aid. The Pell Grant, officially known as the Basic Educational Opportunity Grant (BEOG), provided entitlements to enrolled college students who demonstrated financial need. These initiatives fueled dramatic enrollment growth. Though initially popular, by 1978 the emphasis on student grants shifted increasingly to providing low-interest student loans.
Expanded access and growing national investment in the higher education infrastructure increased the need for administration and planning both inside and outside the campus. Hence, higher education in the United States underwent a "managerial revolution" in its decision-making and attempts at coordination. On one level, this led to the proliferation of an increasingly complex academic bureaucracy. On another, it gave rise to a reliance on a prodigious testing industry. Although the College Entrance Examination Board (CEEB) had been in existence since the turn of the century, it gained great influence after World War II with the development and diffusion of the Educational Testing Service's Scholastic Aptitude Test (SAT). The capacity of the SAT to make determinations about college admissions and projections of college academic performance coexisted with doubts and controversies about the equity and validity of such high-stakes tests. The SAT expanded the nationwide search for academic talent and enabled the historic institutions of New England and the Atlantic Coast to draw a large percentage of students from public high schools (rather than primarily from nearby private prep schools) while attracting students from a wide geographic base. Despite the promise of standardized exams, by the 1960s there would be intense debates over the ability of the SAT and other such tests to identify genuine aptitude without bias toward socioeconomic class or educational experiences.
Questions of social justice and the clash of national laws with local practices came to the fore in the decade following the Brown v. Board of Education of Topeka, Kansas (1954) decision. In numerous states where public universities were segregated by race, policies were challenged. The southern campus came to be a real and symbolic focus of civil rights in American life.
Tensions and transitions: 1970 to 1985. Although American campuses expanded in the late 1950s and 1960s, many students did not feel they were well served. Crowding, lack of dormitories, and reliance on large lecture halls created the "impersonality of the multiversity." This malaise over the relative lack of attention to undergraduate education, combined with political activism over free speech, the antiwar protests, and issues of civil rights and social justice, spawned unrest on many American campuses between 1968 and 1972. Whether at such conspicuous universities as Berkeley, Columbia, or Michigan, or at quieter campuses, a generation of campus presidents and deans were unprepared to deal with widespread student dissatisfaction. Furthermore, the nation was unprepared for the tragedies that occurred at Kent State and Jackson State in 1970.
What governors and state legislators perceived as administrative failure to keep a campus house in order ultimately led to a loss of public and government confidence in colleges and universities. This change in attitude, combined with a stressed national economy, signaled for the first time in decades a tapering in public support for higher education. Double-digit inflation and an energy crisis, combined with warnings of a decline in college matriculation, left most American colleges and universities in a troubled situation between 1975 and the early 1980s. Postsecondary institutions in the 1970s enrolled an increasingly diverse student body in areas of race, gender, and ethnicity. Less clear, however, was whether the educational experiences within those institutional structures were effective and equitable, as American higher education faced charges of tracking lower-income students into particular subsets of institutions and courses of study.
The end of the twentieth century. A fifteen-year period beginning in 1985 was a financial roller coaster for higher education in the United States despite the underlying growth of the enterprise. By the mid-1980s virtually every gubernatorial candidate ran as an "education governor," testimony to the hope that states placed in their colleges and universities to stimulate economic development. Ambitious presidents seized the opportunity to "buy the best," whether it pertained to recruiting faculty, bright students, intense doctoral candidates, or, regrettably, even athletes. This period of opportunity, however, mortgaged institutions' futures–a situation that became clear to accountants and boards soon after declines in the stock market and state revenues. Between 1990 and 1993 overextension and uncertainty loomed.
Illustrative of the partial gains in equity and meritocracy was the changing profile of females in higher education, especially among graduate and professional students. Whereas in 1970 relatively few women pursued doctorates or degrees in law or medicine, by 2000 women constituted close to half the students entering law school and about forty percent of first-year medical students. Women even constituted a majority of the Ph.D. recipients in biology, literature, and the humanities. At the same time, however, they were substantially underrepresented in such graduate fields as engineering and the physical sciences.
Connecting past to present. One theme that pervades higher education in the United States in the second half of the twentieth century is that of a "managerial revolution." In response to the expanding definition of higher education, the ability to navigate institutions became a preoccupation. It extended to include the development of professional expertise in fund-raising, as colleges and universities acquired voracious appetites for resources while extending their mission into new fields and even into new roles. Higher education in the United States has succeeded in running its own operations while also considering new roles and constituencies. Ironically, its strength has also been its major source of weakness. In other words, the aspiration and ability of American postsecondary institutions to accommodate some approximation of universal access has been their foremost characteristic. Institutions' shortfalls in completely achieving that aspiration have been the major source of criticism and debate within American higher education. It is the perpetual American dilemma of how to achieve both equality and excellence.
Grappling with the questions of educational equality and access has taken on increased urgency for two reasons. First, the widespread embrace of higher education as a means to legitimacy, literacy, and respectability strikes a deep chord in all sectors of American society. Second, since higher education has acquired the strength and stability of a "mature industry," it must compete with numerous other activities for a share of the public purse and private donations. Maintaining and bolstering widespread trust in postsecondary education will be the central determinant in present and future discussions about the ways in which Americans support higher education.
See also: Community Colleges; Hispanic-Serving Colleges and Universities; Historically Black Colleges and Universities; Land-Grant Colleges and Universities; Liberal Arts Colleges; Research Universities; Single-Sex Institutions; Tribal Colleges and Universities.
Goodchild, Lester F., and Wechsler, Harold S., eds. 1989. ASHE Reader on The History of Higher Education. Needham Heights, MA: Ginn Press for the Association for the Study of Higher Education.
Horowitz, Helen Lefkowitz. 1987. Campus Life: Undergraduate Cultures from the End of the 19th Century to the Present. New York: Knopf.
Jencks, Christopher, and Riesman, David. 1968. The Academic Revolution. Garden City, NY: Doubleday.
Kerr, Clark. 1963. The Uses of the University. Cambridge, MA: Harvard University Press.
Lemann, Nicholas. 1999. The Big Test: The Secret History of the American Meritocracy. New York: Farrar, Straus, and Giroux.
Veysey, Laurence. 1965. The Emergence of the American University. Chicago: University of Chicago Press.
John R. Thelin
Jason R. Edwards
The higher education system of the United States is not so much a formal system as it is an informal configuration of varied institutions. The development of the American system has been unique when compared with other national postsecondary educational systems around the world. Unlike most other countries, where higher education systems have largely developed outward from a central, government-supported university, the United States has never had such an institution. Instead, the evolution of the U.S. system has been shaped by many different influences, including state and local needs, demographics, religion, and changing social contexts. As a result, postsecondary institutions in the United States mirror the multifaceted complexities of the broader society in which they are embedded and the diversity of the people they serve. Moreover, American higher education is quite disorderly in structure and function in contrast to many national postsecondary systems and even in sharp contrast to the rationally organized American compulsory primary and secondary education system. Postsecondary institutions and the students they serve are diverse and not easily categorized. This disorder is characterized by a variety of individual institutional goals and missions, types of degrees offered, finance and governance structures, and even curricula, course contents, and instructional methodologies.
In order to understand how this informal and loosely structured "system" of diverse institutions serves the wide-ranging needs of American society, it is necessary to identify some of the main features that define the major types of institutions found in American higher education. In 1983 Robert Birnbaum noted that institutional diversity can be defined across several categories of institutional features. The most useful of these define differences along four dimensions of institutional diversity: systemic, structural, constituent, and reputational.
Systemic diversity refers to differences in types of institutions with regard to their size and scope of mission. Starting in the 1970s, there have been many attempts to develop classification systems for categorizing postsecondary institutions in this manner. The best-known and most well-established classification system was developed by the Carnegie Foundation for the Advancement of Teaching and has come to be known as the "Carnegie Classification." Originally developed by Clark Kerr in 1970, this classification system was designed to serve the research analysis needs of the Carnegie Commission on Higher Education. The commission "sought to identify categories of colleges and universities that would be relatively homogeneous with respect to the functions of the institutions as well as with respect to characteristics of students and faculty members" (Carnegie Commission on Higher Education, p. v). The Carnegie Classification was originally published in 1973 and has been updated several times, most recently in 2000. It is the framework most often used in describing institutional diversity in the United States and is relied upon by researchers and educational leaders to ensure appropriate comparisons between and among colleges and universities.
The current classification divides institutions into six main categories: doctoral/research institutions, master's colleges and universities, baccalaureate colleges, associate's colleges, specialized institutions, and tribal colleges. Within most categories are subcategories. Doctoral/research institutions can be either extensive or intensive and offer a wide range of undergraduate degrees as well as master's and doctoral-level graduate degrees. Extensive doctoral/research institutions award more doctorates in a wider range of fields than do intensive institutions. Master's colleges and universities fall into one of two categories (master's I or II) and typically offer a wide range of undergraduate programs as well as graduate education through the master's degree. Category I master's institutions award more master's degrees in a wider range of disciplines than do their category II peers. Baccalaureate colleges primarily focus on undergraduate education and are divided into three categories: baccalaureate colleges–liberal arts, baccalaureate colleges–general, and baccalaureate/associate's colleges. Liberal arts colleges award at least half of their degrees in liberal arts fields, whereas general colleges award less than half of their degrees in liberal arts fields. Baccalaureate/associate's colleges award both associate and baccalaureate degrees. Colleges and universities identified as specialized institutions in the Carnegie Classification may award degrees ranging from bachelor's to the doctorate, but they award the majority of those degrees in a single field. There are several subcategories of specialized institutions, including theological seminaries and other specialized faith-related institutions, medical schools and centers, other health profession schools, schools of engineering and technology, schools of business and management, fine arts schools, schools of law, teachers colleges, military institutes, and other types of specialized institutions. Tribal colleges are generally tribally controlled and located on reservations.
While the Carnegie Classification is often used in making qualitative distinctions among institutions, the commission denies that this is the classification's purpose. In his foreword to the 1987 edition of the classification, Ernest Boyer emphasized that the classification "is not intended to establish a hierarchy among learning institutions. Rather, the aim is to group institutions according to their shared characteristics, and we oppose the use of the classification as a way of making qualitative distinctions among the separate sectors" (Carnegie Foundation, p. 2). Nevertheless, the process of "institutional drift," in which colleges strive to climb the hierarchy, is well documented in the literature. For example, junior colleges become baccalaureate-granting institutions by grafting another two years onto their programs, while doctoral/research-intensive universities increase funded research activities as they aspire to doctoral/research-extensive status. In the early twenty-first century, the Carnegie Foundation was in the process of reassessing the classification system, rethinking how to characterize similarities and differences among institutions, and allowing multiple classifications of institutions. This work was expected to be concluded in 2005.
While the Carnegie Foundation's system is the most widely used typology in educational research, other classification schemes exist and are usually used for other purposes, such as providing information to prospective students and their families. For example, U.S. News and World Report classifies colleges and universities in several typologies. Institutions are divided into categories by whether they tend to serve a national or a regional population and then are rank-sorted into four "tiers." Schools are also ranked according to best departments for a particular major and best financial value.
Although such categorization schemes are useful in a system that includes tremendous institutional variety, such simplification hides the true complexity of the higher education system of the United States. For example, an institution categorized as a "research university" may also be single-sex or have historical roots that remain central to its identity–whether it began as a land-grant college, historically black college or university, Hispanic-serving college, tribal college, or religiously affiliated institution. Additionally, there are less apparent dimensions of institutional difference, such as ratios between part-time and full-time students or residential versus commuter students. Athletic division membership is an important facet of institutional identity, as is location (region, urban, rural, suburban). Hence, it is important to pay attention to other aspects of institutional diversity in order to truly understand the nature of the diverse system of American higher education.
Structural diversity focuses on the ways in which institutions are organized and controlled. Structural diversity is most often defined in terms of type of institutional control–public or private. Publicly controlled institutions are funded primarily by the government (usually by state governments) and are typically part of a larger state system. Private institutions are primarily funded by nongovernment sources and tend to be independent with their own private governing boards. There are many more private institutions in the United States than there are public colleges and universities, although public higher education has grown significantly since the 1960s.
While there is no national system of higher education, all states have developed some type of public postsecondary educational system. There are a number of ways in which these systems are structured and organized. Public colleges and universities differ both in the ways in which they are governed and in the ways in which they are coordinated as part of a larger state system. All states assign responsibility for operating public colleges and universities to governing boards, and there are three main types of governing board structures: consolidated governance systems, segmental systems, and single-institution boards. Consolidated boards are responsible for all public postsecondary institutions in a particular state, although in some states this may apply only to the four-year institutions. Segmental systems have different governing boards for different types of campuses; in some states this may mean that public research universities are governed by one board, comprehensive state colleges by another board, and community colleges by yet another board. States that use single-institution boards grant governance autonomy to each public campus by allowing each to have its own board. Public boards vary in the degree to which they have formal governance authority and the extent to which they merely coordinate activities across the state's public postsecondary educational sector without any substantive decision-making powers.
Public institutions within these systems tend to fall into one of three major categories: universities, state colleges, and community colleges. Public universities typically grant a full range of graduate degrees (master's and doctoral), tend to have a strong research emphasis, and typically have large student enrollments. State colleges are typically smaller, may serve a particular region of a state, and usually offer both bachelor's and master's degrees. Community colleges are two-year colleges that provide associate degrees, preparation for transfer to four-year institutions, vocational and technical education and training, and large numbers of continuing education offerings. Some public institutions have been identified as land-grant institutions. Land-grant institutions were first established by the Morrill Act of 1862, which provided federal funds for establishing universities that (1) were open to all types of students (including women, minorities, and low-income students), (2) offered degrees in practical and applied fields such as engineering and agriculture, and (3) shared knowledge with citizens throughout their state.
Private institutions are less easily characterized than are their public counterparts. Private institutions cover the full range of missions and structures found in American higher education. The most prestigious and highly selective institutions, whether they be Ivy League research universities or smaller liberal arts colleges, are private; but so too are the least well-known institutions. In fact, Alexander Astin and Calvin Lee noted in 1972 that there are literally hundreds of small colleges scattered across the United States that can be thought of as "the invisible colleges." These are small, private institutions with limited resources. Some are affiliated with a particular religion; others began life as private junior colleges. One of the key distinctions among private colleges is whether they are religiously affiliated or not. Religious affiliation occurs in many forms. A religious denomination or order directly controls some institutions, whereas others have only nominal relationships with religious bodies or sponsors. There are also increasing numbers of proprietary institutions that tend to award specialized degrees or that engage in alternative modes of educational delivery, such as distance learning.
Institutions also vary by the core constituencies they serve, particularly with regard to the types of students served. This constituent institutional diversity is manifested in many forms, but the most prominent examples are colleges and universities that provide education primarily for student groups traditionally underserved by the majority of postsecondary institutions. These institutions include historically black colleges and universities (HBCUs), Hispanic-serving institutions (HSIs), tribal colleges, and women's institutions.
HBCUs exist primarily, although not exclusively, to provide postsecondary education for African-American students. There are currently 109 HBCUs, almost half of which are public. They are concentrated in the southern region of the nation, with a few institutions located in the Northeast and Midwest. HBCUs enroll fewer than 20 percent of African-American undergraduates, yet produce one-third of all African-American bachelor's degrees. HSIs are institutions in which at least one-quarter of the undergraduates are Hispanic. Rapidly growing as a group, there are well more than 100 such institutions in the early twenty-first century. Tribal colleges tend to be controlled by Native American tribes. There are currently twenty of these institutions in the United States. Women's colleges are primarily private and provide postsecondary educational environments that cater specifically to female students. Although there were hundreds of these institutions at one time, that number has dwindled to approximately seventy-five. There are also a handful of male-only institutions scattered across the country. All of these institutions reflect the diversity found in American society and provide the informal system of American higher education with a means of better serving the diverse groups of individuals that constitute a multicultural society. The existence of such diverse institutions has been noted as a particular strength of the American higher education system.
Another key feature of American higher education is reputational diversity. It has been noted that higher education institutions in the United States are extremely stratified. In 1956 David Riesman offered the classic characterization of the importance of hierarchy and stratification in American higher education when he described the system of higher education as a "snakelike" procession in which the tail (composed of institutions lower in the hierarchy) and the body (representing institutions in the middle of the hierarchy) of the snake continually try to move up and catch the head (those institutions at the top of the hierarchy that serve as a model for other institutions to follow). Reputation appears to depend on a complex set of factors, including undergraduate selectivity and peer evaluations of graduate programs.
Advantages of the U.S. System
While the lack of systemwide structure creates a somewhat incoherent system of higher education in the United States where widespread coordination is virtually impossible, there are many advantages to this noncentralized approach to a national higher education system. The large degree of institutional diversity that has arisen from the decentralized nature of American higher education has generated benefits on three levels: institutional, societal, and systemic. At the institutional level, arguments center on serving students' needs. Diversity in this sense would include variety of student body, institutional size, programs offered, and academic standards. Higher education does not exist in isolation, however. Birnbaum stated that "higher education is intimately connected to, and therefore interacts with, other societal systems" (p. 116). Aside from education and research, institutions of higher education have also long served various political, economic, and social functions. Societal arguments for diversity thus center on issues of social mobility and political interests. From a systems theory perspective, higher education is viewed as an "open system," characterized by diverse inputs and outputs. For example, if colleges and universities in the United States admit students with high levels of racial diversity (input), then the impact on society (output) will be very different from what it would be if the U.S. college student population were more homogeneous. Additionally, diversity in higher education is important because "differentiation of component units … leads to stability that protects the system itself" (Birnbaum, p. 121). Such systems are able to sense and respond to environmental pressures more quickly and effectively simply because they encompass such extensive variety. In sum, the diverse system of postsecondary institutions in America reflects the diverse composition and needs of the society it serves.
See also: Hispanic-Serving Colleges; Historically Black Colleges and Universities; Land-Grant Colleges and Universities; Liberal Arts Colleges; Military Professional Education System; Research Universities; Single-Sex Institutions; Tribal Colleges and Universities.
Astin, Alexander W., and Lee, Calvin B. T. 1972. The Invisible Colleges: A Profile of Small, Private Colleges with Limited Resources. New York: McGraw-Hill.
Birnbaum, Robert. 1983. Maintaining Diversity in Higher Education. San Francisco: Jossey-Bass.
Brazzell, Johnetta C. 1996. "Diversification of Postsecondary Institutions." In Student Services: A Handbook for the Profession, 3rd edition, ed. Ursula Delworth and Gary R. Hanson. San Francisco: Jossey-Bass.
Carnegie Commission on Higher Education. 1973. A Classification of Institutions of Higher Education. Berkeley: Carnegie Commission on Higher Education.
Carnegie Foundation for the Advancement of Teaching. 1987. A Classification of Institutions of Higher Education. Princeton, NJ: Carnegie Foundation for the Advancement of Teaching.
Cohen, Arthur M. 1996. "Orderly Thinking about a Chaotic System." In Transfer and Articulation: Improving Policies to Meet New Needs, ed. Tronie Rifkin. San Francisco: Jossey-Bass.
McGuinness, Aims C. 1997. "The Changing Structure of State Higher Education Leadership." In State Postsecondary Education Structures Handbook: State Coordinating and Governing Boards. Washington, DC: Education Commission of the States.
Riesman, David. 1956. The Academic Procession: Constraint and Variety in American Higher Education. Lincoln: University of Nebraska Press.
Joseph B. Berger
Maria Vita Calkins