The Postwar Era (1945–1970)

How They Were Governed

The Central Intelligence Agency (CIA)

Created by the National Security Act of 1947, the Central Intelligence Agency (CIA) is an independent government agency that collects and analyzes information, often covertly, about adversaries of, and potential threats to, the United States.

The United States has spied on its enemies ever since it came into existence. Over time, American intelligence capabilities evolved into an uncoordinated effort scattered over many different offices of the federal government. For example, the Army and Navy each had their own code-breaking operations, and those units did not share information with each other. This situation persisted until World War II.

Office of Strategic Services (OSS)

After U.S. intelligence services failed to warn of the impending Japanese attack on Pearl Harbor in December 1941, President Franklin D. Roosevelt (1882–1945) asked New York lawyer and decorated World War I hero William J. Donovan to craft a plan for a consolidated national intelligence service. Out of that plan, the Office of Strategic Services (OSS) was created in June 1942, with Donovan as its first leader. The OSS was charged with gathering and analyzing strategic information for the Joint Chiefs of Staff and the leaders of all branches of the military, and with conducting additional covert operations not assigned to other agencies. Throughout the rest of World War II, the OSS provided the military with important intelligence pertaining to enemy troop strength and weapons capabilities.

When the war was over, questions arose as to who should carry out intelligence activities during peacetime and who should be in charge of coordinating those efforts. These questions became more important as concerns began to arise about the potential threat posed by the Soviet Union at the dawn of the Cold War. In October 1945, President Harry S. Truman (1884–1972) disbanded the OSS and divided its former responsibilities between the Department of War and the Department of State. About the same time, Donovan suggested that an independent, civilian agency be created to coordinate all of the nation’s intelligence gathering operations. He proposed that the new agency be responsible for overseas espionage operations, but have no role in domestic law enforcement. Initially, both the military and the Federal Bureau of Investigation (FBI), which was responsible for domestic spying, opposed Donovan’s idea, fearing that it would reduce their own influence with the federal government.

Truman opted for an approach halfway between Donovan’s proposal and the position of its critics. In January 1946 he created the Central Intelligence Group (CIG), which had the authority to coordinate the intelligence efforts of the departments and agencies already involved in spying. The CIG was led by a newly created position, Director of Central Intelligence. The CIG was short-lived. In 1947 Congress passed the National Security Act, which created an entirely new intelligence structure. It established the National Security Council (NSC), which oversaw the activity of the newly created Central Intelligence Agency (CIA). The position of Director of Central Intelligence was retained as head of the CIA. A law passed two years later, the Central Intelligence Agency Act, established means for the CIA’s budget to remain secret and allowed the agency to skirt some of the limitations of federal budget appropriations. It also exempted the CIA from having to disclose information about its employees.

Cold War Role

The CIA’s role expanded dramatically during the Korean War. Truman appointed a new director, Lt. General Walter Bedell Smith, who succeeded in putting many more efficient systems in place at the agency. During this period the CIA absorbed the Office of Policy Coordination, which the NSC had created in 1948 to engage in secret anticommunist operations abroad. Allen Dulles served as Director of Central Intelligence from 1953 to 1961, during the presidency of Dwight D. Eisenhower (1890–1969). Under Dulles’s leadership, the CIA grew into a formidable force on the world scene, mounting massive propaganda campaigns, influencing the outcomes of foreign elections, and in some cases directly sponsoring political coups when it would benefit U.S. interests. The CIA was largely responsible for restoring the pro-Western Mohammad Reza Pahlavi, Shah of Iran, to power in that country in 1953, and it orchestrated the 1954 overthrow of Guatemalan president Jacobo Arbenz.

In 1961 the CIA was the driving force behind the disastrous Bay of Pigs Invasion, a failed attempt to topple Cuban leader Fidel Castro from power. The following year, CIA surveillance was instrumental in detecting the buildup of Soviet missiles in Cuba, which led to the Cuban Missile Crisis of 1962. During the rest of the 1960s, the Far East occupied much of the CIA’s attention. As part of its effort during the Vietnam War, the CIA undertook a secret arm of the war in neighboring Laos under director Richard Helms. As opposition to the war increased, the CIA became a target of criticism for its role not only in Vietnam, but also in supporting brutal dictatorships in other parts of the Third World.

Controversies

In the 1970s the CIA came under increased scrutiny by Congress. When Helms refused the request of President Richard M. Nixon (1913–1994) to use the CIA to help him cover up the Watergate scandal, Nixon replaced him with James Schlesinger in early 1973. By 1975 the press had uncovered information about past CIA involvement in an assortment of assassination attempts and incidents of domestic spying, leading to investigations by a special presidential commission and select committees in each house of Congress. Permanent intelligence oversight committees were created over the next two years.

Cold War tensions were renewed during the first half of the 1980s, and President Ronald Reagan (1911–2004) expanded the CIA’s role accordingly with additional funding and personnel. The agency was embroiled in controversy in the mid-1980s for its role in the Iran-Contra Affair, in which members of the Reagan administration and their operatives orchestrated the illegal sale of weapons to Iran and funneled the proceeds, also illegally, to the right-wing Contra rebels in Nicaragua.

With the demise of the Soviet Union in 1991, the CIA’s involvement in fighting communism gave way to a new emphasis on fighting terrorism. Since the terrorist attack of September 11, 2001, the CIA has once again received massive financial and logistical support in the federal budget for its clandestine activities.

The Department of Defense

The Department of Defense (DoD) is a department of the executive branch of the U.S. government with responsibility for all defense and military activities. Under the direction of the president and the secretary of defense, the DoD is responsible for the military departments that contain the three major military branches: the U.S. Army, U.S. Navy, and U.S. Air Force. The DoD determines troop deployment, obtains and allocates defense funding, invests in the development of military technology, and functions as a crucial link between military leaders and the executive branch. The DoD is headquartered in the Pentagon office complex, located in Arlington, Virginia.

Establishment of the Department of Defense

Before the establishment of the DoD, the responsibility for national defense was divided between the Department of War and the Department of the Navy. During World War II, persistent interdepartmental conflicts interfered with military operations as the departmental leaders disagreed about engagement strategies and troop deployment. In 1945 President Harry S. Truman (1884–1972) spearheaded a proposal to unify the defense departments under a single executive office. The primary issue in the debate was whether the military would be more functional under centralized control.

In September 1947 Truman’s proposal was accepted and the departments were combined into the National Military Establishment (NME). Truman appointed former Secretary of the Navy James Forrestal (1892–1949) as the nation’s first secretary of defense. Under the original framework, the heads of the three military divisions—army, navy, and air force—were designated members of the executive cabinet.

Over the next two years, as tensions grew over a possible military conflict on the Korean Peninsula, the structure of the NME was refined and amended. In 1949 the NME was renamed the Department of Defense, and the secretaries of the army, navy, and air force were placed under the supervision of the secretary of defense, who was thereafter the only department member to serve in the cabinet.

The Korean War

The first major task for the Department of Defense was to mobilize the unified military for the Korean War (1950–1953). During the first months of the conflict, the DoD was disorganized and unable to meet the president’s guidelines for progress. In 1950 Truman called for the resignation of Secretary Louis Johnson (1891–1966) and asked former Secretary of State George C. Marshall (1880–1959) to come out of retirement to take control of the DoD.

Known for his skill as a diplomat, in 1947 Marshall had developed a comprehensive foreign aid initiative, known as the “Marshall Plan,” to stimulate the recovery of Europe after the devastation of World War II. For his innovative strategies and focus on humanitarian aid, Marshall received the 1953 Nobel Peace Prize. Marshall’s tenure as secretary of defense lasted for only one year, and his main contributions were in helping to establish an administrative structure that allowed for command decisions to be quickly implemented in the field. Marshall and his deputy secretary, Robert A. Lovett (1895–1986), were also responsible for defining the relationship between the DoD and the Central Intelligence Agency.

Throughout the Korean conflict, public opinion of the war effort remained low, and Marshall was subject to extensive criticism for his role. In 1951 Senator Joseph McCarthy (1908–1957) attacked Marshall as the individual most directly responsible for military failures in Korea. In the wake of McCarthy’s character assaults, Marshall resigned and left the post to Lovett. Though operations in Korea were largely unsuccessful, Lovett’s administration succeeded in creating a more efficient system for mobilization, securing funding needed to encourage military growth, and creating an ongoing military preparedness that served as a model for peacetime operation of the DoD.

The Vietnam War

The Vietnam War (1959–1975) began during the administration of President Dwight D. Eisenhower (1890–1969), when the DoD was under the leadership of Thomas S. Gates Jr. (1906–1983). In 1961, after John F. Kennedy (1917–1963) succeeded Eisenhower as president, he replaced Gates with Robert S. McNamara (1916–), a business executive who had been recommended by former secretary Robert Lovett.

Kennedy and McNamara fundamentally altered the focus of the military by reducing the concentration on preparing for major engagements in favor of training in flexible strategies for use in localized conflicts. The primary objective was to contain enemy forces without allowing a major escalation that might lead to nuclear conflict. McNamara further centralized the DoD administration and encouraged greater civilian involvement and leadership. Under McNamara, civilian strategists and analysts took part in military planning in an effort to foster innovation and to make the DoD more adaptable in responding to the unique challenges of the Vietnam War.

McNamara and Kennedy were responsible for approving the Bay of Pigs operation, in which the United States armed and supported a group of exiled Cuban nationalists in attempting to overthrow Cuban dictator Fidel Castro (1927?–). The insurgency failed, and the United States narrowly avoided military conflict with Cuba. In the wake of the incident, both the United States and the Soviet Union (the world power that was Cuba’s patron) began stockpiling weapons in preparation for conflict. The ensuing Cuban Missile Crisis is often regarded as the closest the world has come to nuclear war. After his retirement, McNamara cited the Bay of Pigs as his primary failure.

After the assassination of Kennedy in 1963, McNamara remained secretary under President Lyndon B. Johnson (1908–1973). By 1966, after U.S. public opinion had turned against the American involvement in Vietnam, McNamara began to break with administration policy and disagreed with Johnson’s recommendations for deploying additional troops. McNamara recommended reducing troop levels and abandoning bombing operations in North Vietnam. When McNamara announced his decision to resign in 1968, many opponents of the war saw the resignation as an admission that U.S. policy in Vietnam had failed.

In 1971 a former Defense Department analyst leaked crucial DoD policy documents, which became known as the Pentagon Papers, to the New York Times. The documents were part of a comprehensive analysis of the Vietnam conflict commissioned in 1967 by McNamara. Among other things, the Pentagon Papers revealed covert military activities including secret aerial attacks authorized by the president and executive efforts to mislead Congress regarding the status of the war effort. The lasting effect of the Pentagon Papers was an increase in popular dissent against the administration of President Richard M. Nixon (1913–1994), who continued the war effort after Johnson.

Ongoing until 1975, the Vietnam War resulted in more than 58,000 American combat deaths and the loss of more than one million Vietnamese lives (many of them civilians), and saw the development of the largest antiwar movement in American history. When the war ended, the DoD shifted its focus toward domestic defense and, more specifically, toward developing contingencies for nuclear war.

The End of the Cold War

In 1975 President Gerald R. Ford (1913–2006), who had succeeded Nixon the previous year, appointed Donald Rumsfeld (1932–) secretary of defense. During Rumsfeld’s administration the military was dealing with the aftereffects of the Vietnam crisis, including transitioning to an all-volunteer system after the abandonment of the draft in 1973. Rumsfeld’s tenure also brought an increased emphasis on research and development.

During the period from 1985 to 1990, the annual DoD budget exceeded $400 billion, largely focused on research, development, and strategies for defense against nuclear attack in response to the Cold War—a period of tension between the United States and the Soviet Union that had been ongoing since the 1940s. The DoD budget continued to increase until the collapse of the Soviet Union in the early 1990s, when the Cold War drew to a close.

Dick Cheney (1941–), who had served as White House chief of staff in the Ford administration, was named secretary of defense under President George H. W. Bush (1924–). Cheney supervised the DoD during the 1991 Operation Desert Storm, in which the United States sent a military force to rebuff an Iraqi invasion of Kuwait. After Bill Clinton (1946–) became president in 1993, military spending and the influence of the DoD gradually diminished. During Clinton’s two terms as president, the military budget reached its lowest levels since before the Vietnam War.

The War on Terror

The next president, George W. Bush (1946–), asked Donald Rumsfeld to serve a second term as secretary of defense, beginning a new period of prominence for the U.S. military. The terrorist attacks of September 2001 initiated a rapid military escalation that Bush referred to as the “Global War on Terror,” which included the 2003 American invasion of Iraq despite no evidence that Iraq had ties to terrorist organizations. By 2007 the annual military budget, at more than $600 billion, had reached its highest levels since World War II, even when adjusted for inflation. Nearly 75 percent of the budget was allocated to the ongoing American occupation of Iraq, while the remainder funded development projects and peacekeeping efforts in other parts of the world. The DoD faced intense criticism at home and abroad for both the invasion of Iraq and the implementation of the war effort, which was widely viewed as bungled. Under extreme pressure from his Republican constituency, Bush accepted Rumsfeld’s resignation, replacing him with Robert M. Gates (1943–) in December 2006.

The Department of Transportation

The U.S. Department of Transportation (DOT) is a cabinet-level department of the federal government that coordinates national transportation policy. Established by the Department of Transportation Act of 1966 and activated in 1967, DOT consolidated transportation-related functions previously scattered among at least fifty different federal agencies. This allowed for the first time the development of a cohesive nationwide approach to highway system development, aviation and automobile safety, and many other kinds of transportation planning. Today, DOT contains eleven individual agencies, including the Federal Aviation Administration, the Federal Highway Administration, and the Federal Transit Administration.

Unsafe at Any Speed

For years before the department was established, proposals had been put forward for the creation of a single agency that would oversee all federal transportation functions. However, the coordination of transportation functions was not an issue that was considered a high priority by legislative leaders, and independent agencies continued to operate with limited oversight and authority. By the mid-1960s developments in transportation technology and growing public concern about an escalating number of traffic deaths brought the issue to the forefront of national politics. This public awareness of the need for a national approach to automobile safety was reinforced by the publication in 1965 of consumer advocate Ralph Nader’s (1934–) influential book Unsafe at Any Speed, which was highly critical of auto manufacturers for prioritizing power and styling over vehicle safety.

On March 2, 1966, President Lyndon B. Johnson (1908–1973) proposed in a message to the U.S. Congress the formation of a new department that would bring together a host of federal functions involved in transportation planning and regulation. After some negotiation over exactly what duties would be included in the new department, the Department of Transportation Act was signed into law on October 15, 1966. The DOT was officially activated on April 1 of the following year. Incorporated into the department were the Federal Aviation Administration, Bureau of Public Roads, Coast Guard, Alaska Railroad, St. Lawrence Seaway Development Corporation, Great Lakes Pilotage Administration, high-speed surface transportation program, and various new regulatory and safety functions mandated by the act.

Upon its birth, DOT instantly became the fourth-largest cabinet-level department. Johnson appointed Alan S. Boyd, a former chairman of the Civil Aeronautics Board and undersecretary of commerce for transportation, as DOT’s first secretary, charged with organizing it according to instructions from Congress and bringing the act to life.

New Responsibilities

The department expanded its authority soon after it was established when oversight of urban mass transit systems was brought under DOT administration in 1968. Initially, urban transit was under the control of the U.S. Department of Housing and Urban Development (HUD), with the idea that it would be easier for one agency to coordinate cities’ transportation needs with housing patterns. However, President Johnson believed these functions could more efficiently be managed by the DOT.

Several important developments took place at DOT during the first term of President Richard Nixon (1913–1994), which began in 1969. With the nation’s passenger rail system in crisis in the late 1960s, a new consolidated system was proposed, leading to the creation in 1971 of the quasi-public National Railroad Passenger Corporation, better known as Amtrak. By law, the secretary of transportation sits on Amtrak’s board of directors. In 1970 the Highway Safety Act separated highway safety from administration of the nation’s highway design, construction, and maintenance programs, creating the National Highway Traffic Safety Administration to assume safety administration duties.

Several structural changes have taken place at DOT since its early days. In the early 1980s, President Ronald Reagan (1911–2004) shifted the Maritime Administration from the U.S. Department of Commerce to DOT. In 1984 the Office of Commercial Space Transportation was created within the department. The 1991 passage of the Intermodal Surface Transportation Efficiency Act created two new entities: the Bureau of Transportation Statistics and the Office of Intermodalism, both aimed at promoting a more coordinated, data-driven approach to the nation’s intermodal (using more than one type of transportation in a single trip) transportation system. And finally, in 2002 the Transportation Security Administration was transferred out of DOT and into the newly created U.S. Department of Homeland Security.

Birth of the Federal Aviation Administration

The federal government has been regulating civil aviation since 1926, when the Air Commerce Act was passed. That law put the secretary of commerce in charge of issuing and enforcing air traffic rules, pilot licensing, aircraft certification, and other key aviation functions. In 1938 responsibility for regulating civil aviation was moved from the U.S. Department of Commerce to the newly created Civil Aeronautics Authority, which was given the power to regulate airline fares and determine commercial airline routes. Two years later, the agency was split into the Civil Aeronautics Administration (CAA), which was in charge of certification, safety enforcement, and airway development; and the Civil Aeronautics Board (CAB), which took care of making safety rules, investigating accidents, and regulating the airlines’ economics.

As commercial flight grew—along with the number of midair collisions and near misses—and with jet airliners getting ready to make their appearance, Congress passed the Federal Aviation Act of 1958. The act created the independent Federal Aviation Agency and transferred all of CAA’s functions to the new entity. The act also shifted CAB’s role as maker of safety rules to the new agency and gave it sole responsibility for developing and maintaining an air traffic control system to be used for both civil and military flight. When the DOT went into operation in 1967, the Federal Aviation Agency—renamed the Federal Aviation Administration (FAA)—became one of its core components. The FAA continued to gain new powers over the years that followed. In 1968 Congress gave it the authority to implement aircraft noise standards, and the Airport and Airway Development Act of 1970 put the FAA in charge of safety certification of airports served by commercial carriers.

See also Ralph Nader

Important Figures of the Day

Harry S. Truman

As the thirty-third president of the United States, Harry S. Truman (1884–1972) led the country through the critical transition from global conflict to peacetime prosperity during the late 1940s. He also ushered in the Cold War era in America, positioning the United States as the chief impediment to the worldwide spread of communism. Truman was the last of a breed of “ordinary citizens” to rise to the highest levels in politics, reinforcing the traditional American notion that anybody could become president of the United States through hard work and common sense.

Harry Truman was born on May 8, 1884, in Lamar, Missouri, the eldest of John and Martha Truman’s three children. His middle name was “S.,” a tribute to both of his grandfathers, Anderson Shipp Truman and Solomon Young. Truman grew up in and around Independence, Missouri, where his father operated a successful family farm. A somewhat sickly child, he did not participate in sports and needed thick eyeglasses from the time he was six years old. Truman graduated from high school in Independence in 1901 at age seventeen. He hoped to attend the U.S. Military Academy at West Point, New York, but after failing the vision exam, he instead enrolled at the local business college. A family financial downturn prevented him from completing his studies, however, so Truman went to work at a series of jobs in the Kansas City, Missouri, area, including bank clerk and timekeeper for a railroad builder. In 1906 he left the city and moved back home to work on the family farm. He stayed there for twelve years.

From Humble Haberdasher to Powerful Politician

In 1917, soon after the United States entered World War I, Truman enlisted in the U.S. Army. He was shipped to France, where he was put in charge of an artillery unit and rose to the rank of captain. In 1919 he returned to Kansas City, where he married Bess Wallace and opened a haberdashery with a friend. The business went bankrupt within a few years.

About this time Truman became interested in politics. One of his army friends had family connections to Thomas J. Pendergast, a powerful boss of Missouri’s Democratic “machine.” At the suggestion of his friend, Truman decided to run for an administrative position in the Jackson County court. He won the 1922 election, but was defeated in his 1924 reelection bid. With Pendergast’s support, however, Truman was elected to the top administrative post in the county in 1926. He served in this position until 1934. Despite his ties to Pendergast, whose corruption was common knowledge, Truman was able to forge a reputation as an honest and committed public servant.

As a dedicated soldier in Missouri’s Democratic ranks, Truman was recruited to run for the U.S. Senate in 1934. With Pendergast’s help again, Truman secured narrow victories in both the Democratic primary and the general election. In the Senate, Truman was a staunch supporter of President Franklin Roosevelt’s (1882–1945) New Deal policies aimed at correcting the nation’s foundering Great Depression–era economy. While Truman was serving his first term in the Senate, Pendergast’s political machine collapsed, with some two hundred functionaries convicted of election fraud and other charges. Pendergast himself was sentenced to prison.

Because of his longstanding ties to Pendergast, Republicans saw Truman as an easy target when he came up for reelection in 1940. By then, however, Truman was associated much more closely with the popular Roosevelt than with Pendergast in the minds of voters, and he easily won another election victory. Early in his second term, Truman was named to head a special Senate committee formed to investigate possible corruption in the National Defense Program’s contracting process.

Rewarded for Loyalty

Throughout his tenure in the Senate, Truman remained fiercely loyal to Roosevelt—not terribly difficult, since he was a firm believer in Roosevelt’s New Deal philosophy. Truman was rewarded for this loyalty when he was nominated as Roosevelt’s running mate at the 1944 Democratic national convention. That November, he was elected vice president of the United States.

Truman was vice president for only eighty-three days. Roosevelt, whose health was already failing as he ran for an unprecedented fourth term as president, died on April 12, 1945, and Truman became president. Truman’s first challenge was to win the confidence of Roosevelt’s advisers, many of whom viewed him as an unsophisticated “hick.” Truman quickly made it clear that he intended to carry on with Roosevelt’s policies, both domestic and foreign.

As Truman took office, World War II was nearing its end. The war in Europe was already under control, and indeed victory there was declared just weeks into Truman’s presidency. Japan was another matter. Made aware of the availability of a powerful new weapon capable of ending the war quickly, Truman in July 1945 authorized the military to drop atomic bombs on the Japanese cities of Hiroshima and Nagasaki. The bombs were dropped in early August, and the war effectively came to an end soon afterward; on September 2 the Japanese formally surrendered.

Guided Nation’s Postwar Transition

With the war over, Truman immediately turned his attention to shifting the American economy back into peacetime mode. The years immediately following World War II were marked by high inflation, shortages of consumer goods, and labor unrest. Truman responded by seizing government control of a handful of key industries and transportation systems, and intervening aggressively in labor strikes. Some of his tactics were not well received—especially by the newly Republican-controlled Congress—and his political popularity suffered. His move to desegregate the military and his support of civil rights legislation further cost him support among Southerners.

America’s victory celebration was soon cut short by the realities of the Cold War. Truman responded to the threat of Soviet aggression by involving the United States much more in world affairs. Under what became known as the Truman Doctrine, the aim of U.S. foreign policy during the postwar period was to oppose Soviet expansion wherever it occurred around the globe. Challenging Soviet expansion remained at the core of American foreign policy for decades to come. When Soviet-backed rebels threatened to take control in Greece and Turkey in 1947, Truman provided substantial financial support to the existing governments. He orchestrated an airlift of food and supplies into West Berlin when the Soviets blockaded the city in 1948. Over the years that followed, Truman and his administration implemented the European Recovery Program, known as the Marshall Plan. Named for Secretary of State George C. Marshall (1880–1959), the program was based on the premise that aiding the war-shattered nations of Europe would promote regional economic stability and usher in a period of international peace and prosperity. Under the Marshall Plan the United States provided more than $13 billion in aid to Western Europe over four years.

Truman entered the 1948 presidential campaign in a weakened political position. In addition to Republican candidate Thomas Dewey (1902–1971), the governor of New York, Truman found himself in a race against two other candidates representing parties that had splintered off from the Democrats: Henry Wallace (1888–1965) of the new Progressive Party, who had been vice president before Truman and whom Truman had fired as secretary of commerce after Wallace criticized his anti-Soviet policies; and South Carolina governor Strom Thurmond (1902–2003) of the States’ Rights Democratic Party, better known as the Dixiecrats. National polls had Truman trailing Dewey, and most major newspapers gave Truman little chance of winning. Truman went on the offensive with a whistle-stop campaign tour, covering more than 22,000 miles by train and making over two hundred speeches. Rallying under the campaign slogan “Give ’em hell, Harry,” he focused his campaign on industrial workers and agricultural centers. He was also the first major presidential candidate to actively court voters in the predominantly African-American Harlem section of New York City. On election night the Chicago Daily Tribune famously prepared an extra edition to be distributed the next morning featuring the banner headline “DEWEY DEFEATS TRUMAN.” The Tribune’s projection was inaccurate; Truman won the popular election by more than two million votes. His party also regained control of Congress.

Korean War

The Cold War intensified during the first year of Truman’s second term in office when the Soviet Union exploded its first atomic bomb in 1949. In June 1950 communist North Korean forces, backed initially by the Soviets and later by the People’s Republic of China, invaded South Korea. Truman, acting with the sanction of the United Nations, quickly sent American troops to repel the attack. The following year Truman fired General Douglas MacArthur (1880–1964) from his post as head of the Far East Command of the U.S. Army, citing repeated acts of insubordination. The firing of MacArthur, a hero of World War II, brought down new waves of criticism on Truman. Burdened by difficulties both at home and abroad, Truman announced in 1952 that he would not run for reelection to a second full term.

When his presidency was over, Truman eased into a quiet retirement back home in Independence, Missouri. He published two volumes of memoirs, in 1955 and 1956. While his reputation had suffered considerably during the final years of his presidency, Truman enjoyed a surge in public esteem once out of office. He came to be admired as a “straight shooter” in an era when Americans were growing increasingly skeptical of the intentions and honesty of politicians. Truman died on December 26, 1972, and was buried on the grounds of the Truman Library in Independence. His reputation improved after his death, as the public reacted with disgust to the political scandals of the Nixon era. Truman’s stature seemed to grow throughout the ensuing decades. He was the subject of an award-winning biographical movie on HBO in 1995 and was quoted by presidential candidates of both major parties during the 1996 campaign season. In 2003 an unmarked notebook discovered at the Truman Library turned out to be Truman’s handwritten diary from 1947, sparking yet another wave of interest in his presidency. By 2007 Truman was a frequent topic of biographical studies, volumes of correspondence, and military and political analyses concerning his presidency, in particular his straightforward leadership style, his decision to use nuclear weapons to end World War II, and his attempt to contain the spread of communism.

See also The Cold War

See also The Korean War

See also The Marshall Plan

Dewey Defeats Truman … or Does He?

The Chicago Daily Tribune’s erroneous “DEWEY DEFEATS TRUMAN” headline in 1948 is remembered as one of the biggest gaffes in the history of American journalism. It is hard to imagine such a mistake taking place today, but in retrospect it is not hard to see how the Daily Tribune’s blunder came about.

Perhaps the biggest factor was that most pollsters had Harry Truman (1884–1972) trailing Thomas Dewey (1902–1971) by a substantial margin, and few thought Truman could pull off a victory. After all, the Democratic Party had splintered, and offered three different candidates for president. Many analysts believed that some Southern states, long considered safely Democratic, would instead go for the Dixiecrat candidate Strom Thurmond (1902–2003). Some also thought that Progressive candidate Henry Wallace (1888–1965) could steal enough votes from Truman to tip some of the tighter state races toward Dewey. All of these thoughts influenced the decision of Daily Tribune editors to jump the gun in the face of slow election returns and looming print deadlines. The early election returns seemed to indicate a Dewey victory was in progress, and, based on pre-election polling, newspaper staff had no reason to suspect those early returns were misleading. In addition, a portion of the Daily Tribune’s regular staff was on strike, meaning the type was being set by novices. These inexperienced workers even set five lines of type upside down in this historic edition.

As more election returns came in across the nation, it became apparent that the race was very close. But it was too late; the newspaper had already hit the stands. Soon it was clear that Truman had won the election. Panic set in at the Daily Tribune. Trucks were dispatched to pick up newspapers that had already been delivered, and thousands of copies were retrieved. Truman went to bed on election night behind in the race. When he woke the next morning, he had won. Traveling to Washington, D.C., that day by train, he was handed a copy of the erroneously headlined newspaper. This moment was captured in a photograph that quickly became famous. Asked to comment on the fiasco, a grinning Truman simply replied, “This is one for the books.”

Perhaps even more than the Daily Tribune, the nation’s prominent pollsters came out of the 1948 presidential election with egg on their collective faces. They wondered how they could have been so wrong. One problem was their sampling methods. Much of Truman’s support was from working-class people, minorities, and the poor. These groups were grossly underrepresented in most polls compared to the affluent, who were more likely to support Dewey. Moreover, even as the latest polls showed the gap between Truman and Dewey narrowing, the media stubbornly refused to revise their earlier predictions. They still believed Truman had no chance to win. In the aftermath of the debacle, major pollsters, such as Gallup and Roper, reexamined their methods.

J. Edgar Hoover

J. Edgar Hoover (1895–1972) directed the Federal Bureau of Investigation (FBI) for nearly forty-eight years, from 1924 until his death in 1972. He transformed the agency from a small, poorly trained outfit into a modern, highly effective, and professional crime fighting organization. Hoover gained almost cult-like status among the American public for upholding law and order and espousing conservative, wholesome virtues. However, his dedication to protecting the country evolved into an obsession in which his agency trampled civil liberties in the name of national security. The FBI investigated and hounded people who had not committed crimes, but were deemed by Hoover to threaten U.S. interests. In the process he collected personal and embarrassing information about many prominent personalities. His power and ruthless reputation for seeking revenge kept his critics at bay until after his death, when his excesses became widely known. The FBI achieved many admirable feats during his long tenure as its director, but fairly or unfairly, Hoover is remembered more for his faults than his successes.

Early Life and Career

John Edgar Hoover was born on January 1, 1895, in Washington, D.C., to a middle-class family. His father was a civil servant with the U.S. Coast and Geodetic Survey. His mother was a disciplinarian who strongly impressed her Victorian-era moral values on young Hoover. His father died in 1921, and Hoover continued to live with his mother until her death in 1938. He never married.

In high school, Hoover excelled at debating and hoped to go into politics. He earned a law degree at George Washington University and accepted a legal position at the Department of Justice (DOJ). He impressed his superiors with his organizational skills and clean-cut, patriotic image. Hoover made a name for himself in 1919 by playing an integral role in the Palmer Raids—a series of raids initiated by Attorney General A. Mitchell Palmer against suspected communists in the United States. The country was in the midst of a “red scare,” an intense and fiercely paranoid fear of communism, that Hoover took to heart and embraced for the rest of his life.

In 1924 a new Attorney General, Harlan Fiske Stone, asked Hoover to take over the Department of Justice’s Bureau of Investigation (BOI), a small law enforcement agency that had been in operation for more than a decade. The BOI had a poor reputation. There were few federal crimes at the time, and the agency was full of appointees who were considered cronies of their political benefactors. Stone wanted Hoover to transform the BOI into an agency resembling Scotland Yard, the metropolitan police service in London, England, that is known for its success in crime detection. Hoover, who was only twenty-nine years old at the time, agreed to take the position, but only if he was given broad authority to make personnel decisions at the agency. Stone granted his request.

Building a New Image

Hoover took the reins of the BOI in the midst of Prohibition. A national crime wave was driven by public antipathy toward the liquor laws and the availability of two innovations—the automobile and the machine gun. Local and state law enforcement authorities proved ineffective against heavily armed gangsters and bank robbers who easily crossed state lines. The stock market crash of 1929 and the ensuing Great Depression produced public cynicism about the government’s authority and made outlaws like John Dillinger and Bonnie and Clyde into cultural celebrities.

Hoover launched a massive and highly successful media campaign to turn public opinion against these romanticized criminals. He pushed his media contacts to cast the outlaws as heartless villains and his government agents, or G-men, as brave heroes. His efforts were aided by a tragic ambush in June 1933, attributed to gangster Pretty Boy Floyd, that left four law enforcement officers dead. Congress passed new laws greatly expanding the power of the BOI in fighting federal crimes. In a few short years almost all of the notorious gangsters had been killed or captured. The agency also earned high praise for its role in catching the kidnapper and murderer of the baby of famous aviator Charles Lindbergh. A new law making kidnapping a federal crime solidified the BOI’s involvement in future similar cases.

Hoover worked diligently to modernize the agency, which was renamed the Federal Bureau of Investigation in 1935. Under his direction the FBI obtained sophisticated laboratory equipment, established a centralized fingerprint file, conducted research in criminology and forensics, and developed high-tech methods for investigating crimes and identifying suspects. He established a national academy to train law enforcement officials from around the country (and later from around the world) in professional crime-fighting techniques.

Spies and Subversives

During the 1930s President Franklin D. Roosevelt (1882–1945) was worried about threats to national security from fascists and communists. He ordered Hoover to investigate potential agitators. The FBI director seized on this authority and dramatically increased surveillance of Americans he believed embraced left-leaning politics. Years later it would be learned that the president’s own wife, Eleanor Roosevelt (1884–1962), was among Hoover’s targets, because of her liberal political views. With America’s entry into World War II in 1941 the FBI garnered sweeping new powers in its investigations of espionage, sabotage, and suspected subversion (support for the overthrow of the government). The Smith Act (or Alien Registration Act) of 1940 had outlawed activities and even statements deemed to be subversive.

In 1942 the FBI scored a public relations victory when it captured dozens of German spies. The FBI obtained convictions of nearly three dozen people for violations of the Smith Act and/or for espionage. Later that year the agency arrested eight German spies who had snuck onto U.S. shores from a submarine. The men were sent to conduct sabotage operations against American targets. Before they could carry out their missions, one of the spies had a change of heart and turned himself in to the FBI, which quickly captured the others.

By the end of 1943 Hoover oversaw more than 13,000 employees at the FBI, including approximately 4,000 specially trained agents. By this time he had been the FBI director for nearly two decades and had earned the nation’s respect as a protector of U.S. security and values.

A New Red Scare

Following the war Hoover’s anticommunist zeal found momentum in a new “red scare.” His acquisition of power was helped by several events that occurred during the administration of President Harry Truman (1884–1972). In 1947 Truman’s Executive Order 9835 created the Federal Employee Loyalty Program. The FBI began running background checks on millions of federal employees looking for a variety of misdeeds, including connections to subversive organizations. The next year the DOJ used the Smith Act to obtain the first of numerous indictments against American members of the Communist Party. Meanwhile, Republican Senator Joseph McCarthy (1908–1957) and other politicians fanatically hounded Americans on the slightest suspicion that they were communists or had communist leanings. At first, Hoover heartily supported these efforts and even supplied information from FBI files; however, he gradually saw McCarthy as a political threat to his power and ended his support.

By the mid-1950s anticommunist hysteria had somewhat diminished. McCarthy was censured by the U.S. Senate and ended his political service in disgrace. The U.S. Supreme Court weakened the government’s power under the Smith Act to prosecute communists. Hoover determined to take on the battle himself. By that time the FBI was learning much from a top-secret U.S. and U.K. intelligence program called Venona in which Soviet diplomatic messages were slowly being decoded. Venona information indicated that a number of Soviet spies had been working in the United States for some time. These revelations led to the convictions of several people, most famously Julius (1918–1953) and Ethel (1915–1953) Rosenberg, who were executed for spying.

Hoover believed the “red menace” posed a grave threat to American society. In 1956 he initiated a secret FBI program called COINTELPRO that would go on for nearly two decades and ultimately tarnish his hard-fought image as America’s guardian of law and order.

COINTELPRO

COINTELPRO was short for Counterintelligence Program. It combined surveillance techniques with trickery designed to harass suspected subversives. Hoover had long advocated the use of electronic bugs and secret tape recorders with or without legal permission from his superiors. These techniques were used liberally by the FBI during COINTELPRO. Originally limited to communists and their supporters, the program was soon expanded to cover other groups Hoover considered threatening to American security. These included civil rights leaders, like Martin Luther King Jr. (1929–1968), protesters against the Vietnam War, anti-government radicals, and members of the Ku Klux Klan. A typical COINTELPRO scheme was to send anonymous letters to the wives, families, or employers of suspects. These letters would include damaging or embarrassing information designed to ruin marriages, careers, and reputations. Tape recordings of sexual encounters were obtained by the FBI and used in a similar manner.

Hoover’s Last Years

In 1966 a Time article noted “An unwritten rule of American politics is: Never tangle with J. Edgar Hoover.” For decades Hoover had been rumored to be compiling his own secret files containing personal information about prominent Americans, including politicians, Hollywood celebrities, journalists, and others. His biographers would later say that these files helped cement his grip on power for so many decades. Politicians, in particular, feared what Hoover might know about them and what he might do with the information. It was all part of the immense legend that grew around the man and insulated him and his agency from presidential and congressional criticism. One administration after another kept Hoover as FBI director, long after he had reached retirement age.

In 1971 Hoover’s public façade began to crack when allegations flew around Washington, D.C., that the FBI had planted secret listening devices in congressional offices. House Democratic Leader Hale Boggs of Louisiana accused Hoover of using “the tactics of the Soviet Union and Hitler’s Gestapo” and called for the FBI director to be replaced. The accusations followed a break-in at an FBI office in Pennsylvania in which numerous documents were stolen and turned over to the press. They provided the first tantalizing clues about the existence of the COINTELPRO program, which Hoover terminated soon afterward. The full story would come out after his death.

Hoover died of a heart attack at his home on May 2, 1972. He was seventy-seven years old and had served as FBI director under eight U.S. presidents.

The Fallout

Hoover’s death was quickly followed by the fall from grace of President Richard Nixon (1913–1994) in the Watergate scandal. Congressional hearings into those matters uncovered unsavory activities by the FBI, including bugging the offices of White House aides, National Security Council members, and prominent reporters. Nixon resigned in 1974, before he could be impeached. By the end of the next year Hoover’s COINTELPRO program had become public knowledge and was widely criticized as a gross violation of American civil liberties. Many of the FBI’s operations had targeted law-abiding Americans engaged in peaceful activities, such as protesting the Vietnam War or speaking out against the government. The reputations of Hoover and the FBI were severely damaged by the revelations. New limits were placed on the FBI’s authority to conduct counterintelligence and domestic security investigations.

The existence of Hoover’s long-suspected secret files was confirmed by his private secretary, who testified before Congress that she had destroyed many of the files after he died. Those that remained became the subject of frequent Freedom of Information Act requests by curious journalists and provided much insight into the workings of the FBI during the long Hoover era.

In death, Hoover elicited a flood of negative publicity about COINTELPRO and other matters. He became the focus of numerous books, some of which claimed that he had engaged in scandalous sexual behavior. Critics have also faulted Hoover for the FBI’s slow response in the 1950s and 1960s to organized crime and to civil rights abuses perpetrated against minorities. As a result, Hoover’s legacy is decidedly mixed. He is reviled for the abuses of power that characterized his dictatorial reign at the FBI, but he is also praised for building a premier law enforcement agency renowned for its technological and investigatory skills.

See also The Federal Bureau of Investigation

Alger Hiss

Alger Hiss (1904–1996), a high-ranking official in the U.S. Department of State, was an early victim of McCarthyism, an anticommunist political climate that ruined the careers of many innocent Americans. Hiss was convicted in 1950 of perjury after having allegedly provided classified documents to Time magazine editor Whittaker Chambers (1901–1961), an admitted former member of a communist organization.

A Stellar Government Career

Hiss was born on November 11, 1904, in Baltimore, Maryland. He and his four siblings were raised in a genteel environment by their mother and an aunt following the suicide of his father, an executive with a dry goods firm. Hiss excelled in school. He received a scholarship to Johns Hopkins University, where he performed well enough to earn entrance into Harvard Law School. He continued to stand out academically at Harvard. After graduating from law school in 1929, he served as a clerk for legendary U.S. Supreme Court Justice Oliver Wendell Holmes (1841–1935), serving in that capacity for about a year.

After short stints at two prestigious East Coast law firms, Hiss accepted a position in 1933 with the federal government, working on a variety of New Deal projects within President Franklin Roosevelt’s (1882–1945) administration. In 1936 Hiss transferred to the State Department. He moved quickly up the State Department’s organizational ladder over the next decade. By the end of World War II he was high enough in the departmental bureaucracy to serve as an aide to Roosevelt at the Yalta Conference of 1945. Hiss attended conferences that led to the development of the United Nations (UN) charter, and was an adviser to the U.S. delegation at the very first meeting of the UN General Assembly. He left his government post in 1946 to become president of the Carnegie Endowment for International Peace.

Chambers’s Damaging Testimony

Hiss’s promising career came to a screeching halt in 1948. On August 3, Chambers, a senior editor at Time who had acknowledged his own past association with the Communist Party, named Hiss as a communist sympathizer in his testimony before the House Committee on Un-American Activities. Chambers also claimed that in the late 1930s Hiss had stolen classified documents from the State Department and given them to Chambers for delivery to the Soviets. Hiss sued Chambers for slander, whereupon Chambers produced a substantial paper trail implicating Hiss.

Because the statute of limitations had run out on his alleged actions, Hiss could not be charged with espionage. He was instead tried for perjury, based on his testimony before Congress that he had never met Chambers before. A 1949 trial ended in a hung jury, but Hiss was retried the following year and found guilty of perjury. He was sentenced to five years in prison, of which he ended up serving forty-four months. He was also disbarred. He was released from prison in 1954, but his life was in shambles and his political career was over.

The conviction of Hiss fueled the anticommunist hysteria gripping the country, as it supported the notion that communists had successfully infiltrated the highest levels of American government. Throughout his incarceration and afterward, Hiss and his supporters strenuously argued that he was innocent and had been framed; and indeed over the years credible claims emerged that the evidence against him had been mishandled, or perhaps even fabricated. Nevertheless, accusations about Hiss continued in the post-Soviet era as previously unavailable documents were declassified and translated. Several historians have identified Hiss as the Soviet spy who is code-named “Ales” in intercepted Soviet communications. However, the question of his guilt or innocence has never been definitively resolved. Hiss died in 1996 at the age of ninety-two.

See also McCarthyism

Julius and Ethel Rosenberg

During the height of anticommunist hysteria in the United States in the early 1950s, Julius (1918–1953) and Ethel (1915–1953) Rosenberg, an otherwise nondescript married couple, were convicted of espionage and executed for their alleged role in passing nuclear secrets to the Soviets. While information that has surfaced in the years since their trial tends to confirm that their guilty verdict was correct, the trial itself ignited outrage among liberals worldwide, as did their death sentence, which seemed to many wildly out of proportion to the severity of their offense.

Background

Julius Rosenberg was born into a Jewish family in New York on May 12, 1918. While still in his teens, he became active in the Young Communist League, where he met his future wife, Ethel Greenglass, a Jewish woman three years his senior. Julius received a degree in electrical engineering from the City College of New York in 1939, and the following year he began working for the U.S. Army Signal Corps. Ethel was a frustrated singer and actress who was active in labor organizing. The couple married in 1939 and had two sons, Michael, born in 1943, and Robert, born in 1947. Active members of the Communist Party until Michael was born, the couple then dropped out of the party and lived an apparently quiet domestic life in Brooklyn.

In June 1950, Ethel Rosenberg’s brother David Greenglass, a machinist at the Los Alamos, New Mexico, nuclear research facility, confessed to the Federal Bureau of Investigation (FBI) that he had passed sensitive information about the Manhattan Project—the project that led to the development of the first atomic weapons—indirectly to the Soviets. He also said that he had given secret documents to his sister Ethel and her husband. FBI agents visited the Rosenberg household the next day. The Rosenbergs firmly denied that they were involved in spying, but others involved in the ring implicated them. Julius Rosenberg was arrested in July 1950, and Ethel was taken into custody the following month. A grand jury indicted both of them on August 17, 1950, for conspiracy to commit espionage, though the evidence against them was not very strong.

The Rosenberg Trial

At trial, a handful of witnesses testified that the Rosenbergs had indeed been part of the spy ring at various times during the 1940s, though no other concrete evidence against them materialized. The most damaging information seemed to be that Julius Rosenberg had begun making preparations to leave the country upon learning that Klaus Fuchs, a Manhattan Project scientist, had confessed to passing atomic secrets to the Soviet Union. The other major strike against them was their indisputable involvement with the Communist Party, even though party membership was fairly common during the period in which they were active and the Soviet Union was an ally of the United States at the time. When asked on the witness stand about their affiliation with the Communist Party, both Rosenbergs invoked their Fifth Amendment right to avoid incriminating themselves. Meanwhile, their defense attorney, Emanuel Bloch, was of little help, failing miserably in his attempts to play down the importance of the information the Rosenbergs were accused of helping deliver to the Soviets.

The Rosenbergs maintained throughout the trial that they were innocent, but the jury believed the prosecution. Both were convicted of conspiracy to commit espionage, and on April 5, 1951, they were sentenced to die in the electric chair.

Around the world, a wave of protest arose over the unfairness of the government’s treatment of the Rosenbergs and the severity of their punishment. The Rosenbergs spent two years appealing their convictions, taking the case all the way to the U.S. Supreme Court, but they were unsuccessful. They were both executed on June 19, 1953, at Sing Sing prison in New York. The case of the Rosenbergs continues to stir debate. While more recent examinations of the evidence against them suggest they had in fact engaged in spying for the Soviet Union, the idea of executing a married couple for passing relatively unimportant information to a nation that was not even an enemy at the time is difficult to comprehend more than half a century later. However, the sentencing judge at their trial asserted that by helping the Soviets acquire nuclear weapons sooner than they otherwise would have, the Rosenbergs contributed to causing thousands of deaths in the Korean War and to the “constant state of tension” that characterized the nuclear stalemate of the Cold War era.

See also McCarthyism

Dwight D. Eisenhower

Dwight D. Eisenhower (1890–1969) parlayed his fame as a leader of Allied forces during World War II into a successful political career that culminated in his election as the thirty-fourth president of the United States. Eisenhower led the nation during a period of unprecedented worldwide power and great postwar prosperity, but did so under the cloud of the Cold War.

Dwight David Eisenhower was born on October 14, 1890, in Denison, Texas, and raised in the small farming community of Abilene, Kansas. The third of seven sons of David Eisenhower and Ida Stover Eisenhower, he was originally named David Dwight Eisenhower, but his first and middle names were flipped at an early age to avoid the confusion of having two Davids in the house. A standout both in the classroom and on the athletic field, Eisenhower earned an appointment to the U.S. Military Academy at West Point, New York, after graduating from high school and working with his father at the local creamery for two years.

Average Student, Competent Officer

Eisenhower was only an average student at West Point, graduating sixty-first in a class of 164 in 1915. After graduation, he was commissioned as a second lieutenant and stationed at Fort Sam Houston in Texas. There he met Mamie Doud, and the two were married in 1916. When the United States entered World War I in 1917, Eisenhower hoped he would be assigned a combat command in Europe. Instead, he remained stateside, where he instructed troops at various training camps. When he finally received a European command, it was just a month before the war came to an end; by then his services overseas were no longer needed.

Eisenhower proved to be a competent officer over the next several years of his military career, but his superiors perceived his chief skills to lie in organization and administration rather than in the more glamorous realm of battle. So rather than being assigned to a field position in some distant part of the world, Eisenhower became a senior aide to General Douglas MacArthur (1880–1964) in 1932. A few years later, MacArthur made him his assistant military adviser for the Philippine Commonwealth, a position in which he remained until 1939.

Though he was successful in his job and well respected by army commanders, Eisenhower was frustrated with the pace of his career advancement in the military, having reached only the rank of lieutenant colonel by 1940. The entrance of the United States into World War II finally gave him the opportunity to demonstrate his leadership abilities. Just days after the Japanese attack on Pearl Harbor in December 1941, Eisenhower was called to Washington, D.C., and put in charge of the War Plans Division (later renamed the Operations Division) of the War Department. Because of his experience in the Philippines, he was initially consulted on matters relating to the war in the Pacific. Soon, however, Eisenhower was advocating a “Europe first” policy, which called for the United States to take the offensive in Europe and Africa while holding the line in the Pacific theater. Eisenhower was quickly promoted to one-star general, and his influence in the conduct of the war in Europe expanded. By 1942 he was in charge of U.S. forces based in Great Britain.

Normandy Invasion

A short time later, Eisenhower was promoted again and named to lead joint U.S.-British efforts in North Africa. Under Eisenhower’s leadership, Allied forces succeeded in defeating the Axis powers in North Africa, forcing the surrender of enemy forces in Tunisia in May 1943. From there, Eisenhower, now a four-star general, led the amphibious assaults on Sicily and, in September of that year, Italy. Based on his efforts in these successful campaigns, Eisenhower was named supreme commander of Operation Overlord, the code name for the huge Allied incursion into Nazi-occupied France. The invasion at Normandy, launched in June 1944, was the largest amphibious assault in history. It proved to be a key turning point in the war in Europe and made Eisenhower a celebrity in the United States. Allied forces moved on into Germany, ultimately forcing the Nazi surrender and ending the European portion of the war. During the war’s closing stages, Eisenhower made a critical decision that would have a lasting impact on the Cold War that followed: he left the capture of Berlin to the Soviet army while focusing Western forces elsewhere, thus setting the stage for the partition of the city and the eventual erection of the Berlin Wall, the Cold War’s most visible symbol.

In 1945 Eisenhower returned to the United States a hero. He was named army chief of staff, replacing General George C. Marshall (1880–1959). Eisenhower held that position for two years, but he did not particularly enjoy it, and he retired from the military altogether in 1948. Because of his overwhelming popularity, there was much talk at that time of a presidential run. Instead, Eisenhower wrote a book about his wartime experiences, Crusade in Europe (1948), and accepted a position as president of Columbia University. After only a few years at that job, Eisenhower was called out of retirement by President Harry Truman (1884–1972) to serve as supreme commander of Allied forces in Europe during the formation of a new alliance designed to counter potential communist aggression. This new entity would evolve into the North Atlantic Treaty Organization.

From War Hero to President

By 1952 Republican leaders had managed to persuade Eisenhower to run for president. Still immensely popular from his wartime exploits, Eisenhower easily defeated Democratic nominee Adlai Stevenson (1900–1965) in the 1952 election. Eisenhower took office at a time when war was raging in Korea and the Cold War was simmering everywhere else. At home, the communist “witch hunt” spearheaded by Senator Joseph McCarthy (1908–1957) was in high gear. During the first year of his presidency, Eisenhower managed to extricate the United States from Korea, negotiating a cease-fire that was signed in July 1953. Eisenhower moved to reduce military spending by beefing up the U.S. nuclear arsenal—which he considered the most effective deterrent to Soviet aggression—while reducing the size of the nation’s expensive conventional forces and weaponry.

After the death of Soviet leader Joseph Stalin (1879–1953), Eisenhower made attempts to improve relations between the United States and the Soviet Union. These efforts met with little success, though he did meet with Soviet leaders in 1955. While Eisenhower kept the United States out of full-scale wars for the remainder of his presidency, he was aggressive in battling communism around the globe. In 1953 the Central Intelligence Agency (CIA) directed the overthrow of the democratically elected government of Iran based on the notion that it was pro-communist. A similar overthrow was orchestrated in Guatemala the following year. In 1954 Eisenhower oversaw the creation of the Southeast Asia Treaty Organization, a military alliance formed to combat the spread of communism in that region.

Eisenhower suffered a heart attack in 1955, but his recovery progressed quickly enough to allow him to run for reelection in 1956. Once again, he defeated Democratic challenger Stevenson by a comfortable margin.

Second Term

In 1957 Eisenhower faced his next Cold War crisis when the Soviet Union successfully launched the first manmade satellite into orbit. This development represented a major blow to the American psyche; to many it meant that the Soviets had surpassed the United States in technological sophistication. In response, Eisenhower urged Congress to commit more funding for scientific and military research, and in July 1958 the National Aeronautics and Space Administration was formed. The space race had begun.

Cuba represented Eisenhower’s next Cold War challenge. In the late 1950s, the Eisenhower administration became disenchanted with the brutality and corruption of the government of Cuban president Fulgencio Batista y Zaldívar (1901–1973), and in 1958 the United States withdrew its military support of the Batista regime. The Cuban government quickly collapsed, and in the power struggle that ensued, leftist guerrilla leader Fidel Castro (1926–) emerged as the new Cuban head of state. Castro immediately developed a close alliance with the Soviet Union, guaranteeing that relations between Cuba and the United States would remain distant. Eisenhower assigned the CIA to develop a plan to support an invasion of Cuba by Cuban exiles living in the United States. The plan was still in development when Eisenhower’s term as president came to an end. Launched in the spring of 1961, the Bay of Pigs invasion failed miserably and was a great embarrassment to the new president, John F. Kennedy (1917–1963).

While Eisenhower’s main focus was always foreign policy, a number of important domestic developments took place during his presidency as well. The civil rights movement gained considerable momentum during the 1950s. In 1954 the U.S. Supreme Court, under Eisenhower-appointed Chief Justice Earl Warren (1891–1974), unanimously declared school segregation unconstitutional in the landmark case Brown v. Board of Education of Topeka. When Arkansas governor Orval Faubus tried to block implementation of the ruling, Eisenhower intervened, sending federal troops to Little Rock in 1957 to enforce the desegregation of Central High School. Eisenhower was also the first president to hire African-Americans to positions of consequence within his administration, most notably Assistant Secretary of Labor J. Ernest Wilkins. Eisenhower’s other domestic achievements included signing the legislation that created the interstate highway system and creating the new cabinet-level U.S. Department of Health, Education, and Welfare.

Eisenhower was the first president subject to the Twenty-second Amendment to the U.S. Constitution, which bars anyone from being elected president of the United States more than twice. He left office in 1961, and he and his wife, Mamie, retired to their homes in Gettysburg, Pennsylvania, and Palm Desert, California. Over the next few years, he wrote three more books: two volumes of White House memoirs, Mandate for Change (1963) and Waging Peace (1965), and the more informal At Ease: Stories I Tell to Friends (1967). While he was occasionally called on to play the role of elder statesman for the Republican Party, he mostly played golf and painted with watercolors. Eisenhower had several heart attacks in 1968 and died in 1969.

See also The Cold War

See also The Korean War

See also McCarthyism

National Aeronautics and Space Administration

Many Americans reacted with dismay in 1957 to the news that the Soviet Union had successfully launched Sputnik, the first manmade satellite, into orbit. The news was especially disturbing coming in the wake of the Soviet news agency’s announcement the previous summer that the Soviet Union had successfully tested an intercontinental ballistic missile (ICBM) capable of delivering a nuclear warhead to targets within the United States. Observers already alarmed by the perceived “missile gap” between the United States and the Soviet Union now worried about a “space lag” as well. It did not help matters that the United States’ initial attempt at putting a small satellite into orbit ended disastrously when the rocket carrying it exploded a few seconds after liftoff.

By early 1958 the United States had managed to send two satellites—Explorer and Vanguard—into orbit, and the United States’ ICBM supply grew over the next few years. President Dwight D. Eisenhower (1890–1969) was convinced that the American nuclear arsenal was adequate, but he also believed that an overall national space program was in order. While the military implications of space superiority over the Soviets were obvious, he was more interested in a space program that was primarily nonmilitary and believed that such a program should be run by civilians. He decided to form a national space program using as its foundation an existing agency called the National Advisory Committee for Aeronautics, which had been established in 1915 but had never been awarded much of a budget.

The new, improved version of the agency, to be known as the National Aeronautics and Space Administration (NASA), was created with the signing of the National Aeronautics and Space Act on July 29, 1958. Eisenhower named T. Keith Glennan, president of the Case Institute of Technology and a former member of the Atomic Energy Commission, as NASA’s first administrator. NASA formally went into action on October 1, 1958. Within a year, NASA had achieved its first successful satellite launch. NASA’s attention soon shifted from satellites to manned spaceflight, starting with Project Mercury, established in November 1958. Between 1961 and 1963, NASA carried out half a dozen successful manned missions.

In spite of these successes, the assertion by President John F. Kennedy (1917–1963) in May 1961 that the United States could land a man on the moon “before this decade is out” seemed unduly optimistic to many at the time. However, when two American astronauts set foot on the lunar surface in July 1969, Kennedy’s optimism proved justified.

Rosa Parks

Rosa Parks (1913–2005) was responsible for one of the defining moments of the civil rights movement when she refused to give up her seat to a white man on a crowded Montgomery, Alabama, bus on December 1, 1955. Her act of defiance galvanized African-Americans across the country and made her an instant civil rights heroine. Her subsequent arrest and fine led to a successful boycott of the Montgomery bus system by African-American riders, sparking a sequence of events that changed race relations in the South forever.

Youth in Segregated Alabama

Parks was born Rosa McCauley on February 4, 1913, in Tuskegee, Alabama, the daughter of James McCauley, a carpenter, and Leona (Edwards) McCauley, a schoolteacher. Shortly after her younger brother Sylvester was born, her parents separated. Leona McCauley took the children to live with their grandparents in Pine Level, Alabama. James McCauley migrated north and had little contact with his children after the move.

When she was a child, Parks was a good student, but not the kind who would have led teachers to predict that she was destined to become a historical figure. She attended the Montgomery Industrial School for Girls, then graduated from the all-black Booker T. Washington High School in 1928. For a short time she attended Alabama State Teachers College in Montgomery.

In 1932 Rosa married Raymond Parks, a barber with little formal education. With both Rosa and Raymond holding steady jobs, the young family enjoyed a degree of financial stability. They were, nevertheless, held back by the harsh reality of segregation and second-rate service for African-Americans at many establishments in Alabama. Incensed at having to sit in the back of the bus and use “colored” facilities inferior to those reserved for “whites only,” both Rosa and Raymond became increasingly involved in civil rights endeavors. They worked on voter registration drives, and Rosa became active in the National Association for the Advancement of Colored People (NAACP). She was named secretary of the NAACP’s Montgomery branch in 1943.

Fateful Bus Ride

The segregation laws in Montgomery in 1955 reserved the first four rows of seats on city buses for whites. The middle section was a sort of “no man’s land,” where blacks could sit if there were no white riders occupying the seats. The rear section of the bus was African-American only. Often, African-American riders had to stand even if there were empty seats available in the front of the bus. If an African-American passenger was seated in the middle section, she was expected to give up her seat if a white rider boarded and the first four rows were full. To add to the humiliation, African-Americans were often made to exit out the front door after paying their fare, then reenter through the back door so that they would not pass by the white riders up front. Parks was certainly not the first African-American to stand up to this policy. Riders were frequently kicked off of buses or even arrested for refusing to abide by these racist laws.

Parks worked at a number of different occupations over the years, including insurance sales, housecleaning, and sewing. In 1955 she was working as a seamstress at the Montgomery Fair department store. On December 1, she was exhausted after a grueling day at work. When she boarded the bus to go home, she found an available seat in the middle section. A few stops later, a white rider entered the bus, and the driver ordered Parks and three other African-American passengers to move. The other three riders obeyed, but Parks had had enough, and she continued to resist even after the driver threatened to call the police.

Parks was taken to jail. She used her one phone call to contact a prominent member of the local NAACP chapter, and word of her arrest quickly spread in the African-American community. Community leaders recognized Parks’s case as an opportunity to challenge Montgomery’s backward segregation laws. Parks agreed to take part in the challenge, refusing to pay her fine and appealing her guilty verdict to the Montgomery Circuit Court. Under the leadership of the NAACP and Montgomery’s black churches, a boycott of the city’s buses was organized. One of the key organizers was Martin Luther King Jr. (1929–1968), the young pastor of a Baptist church in Montgomery. African-Americans, who accounted for nearly three-quarters of the bus company’s business, were urged to stop riding on buses until the company agreed to abolish its racist rules and to begin hiring African-American drivers.

Bus Boycott Spurs Change

The boycott was highly successful, as nearly every African-American in Montgomery used other means to get around. Meanwhile, the newly formed Montgomery Improvement Association filed suit to have Alabama’s segregation laws declared unconstitutional. The case made its way to the U.S. Supreme Court, and on December 20, 1956, Montgomery officials were ordered to end discriminatory practices on their buses. Thus the boycott came to an end after 381 days.

Though Parks was victorious in her battle against the humiliation of Montgomery’s racist bus policies, the struggle took a toll on her and her family. There was a great deal of backlash from the white community, and she and Raymond were unable to find stable employment afterward. In 1957 Parks, her husband, and her mother all moved to Detroit, where her brother Sylvester already lived. She continued her civil rights activism in Detroit, working with the Southern Christian Leadership Conference and the NAACP. Parks was now a civil rights icon and was in great demand as a public speaker. A boulevard in Detroit was named after her, and she received dozens of other awards and honors over the years, including the Presidential Medal of Freedom and the Congressional Gold Medal. In 1965 Parks was hired to work in the Detroit office of Congressman John Conyers, where she remained employed until her retirement in 1988. After she died at her Detroit home on October 24, 2005, she became the first woman to lie in honor in the U.S. Capitol Rotunda.

See also The Civil Rights Movement

Thurgood Marshall

A leading civil rights crusader for much of the twentieth century, Thurgood Marshall (1908–1993) was the first African-American to serve on the U.S. Supreme Court. He was influential in bringing about the desegregation of public schools and universities in the United States and fought for equal treatment for people of color in the military. As a private lawyer, as the legal director of the National Association for the Advancement of Colored People (NAACP), and as a judge, Marshall represented the interests not only of African-Americans, but of all the nation’s disadvantaged populations. He brought a progressive viewpoint to U.S. policymaking on behalf of those who lacked the money or clout to otherwise be heard.

Early Life

Marshall was born into a middle-class family on July 2, 1908, in Baltimore, Maryland. His mother was a teacher in a segregated public elementary school. His father worked at a posh all-white yacht club. Marshall grew up in a comfortable home in a racially diverse neighborhood, where black children and white children played together unselfconsciously. He was popular in school, though somewhat of a class clown, which prevented him from achieving stellar grades. After graduation from high school, Marshall entered Lincoln University, an all-black college near Philadelphia, Pennsylvania. He graduated with honors in 1930 and enrolled in law school at Howard University, the historically all-black institution in Washington, D.C. Marshall graduated from law school at the top of his class in 1933, and later that year was admitted to the Maryland bar. He set up a private legal practice in Baltimore, specializing in civil rights and criminal cases. Many of his clients could not afford to pay him, but he believed it was important that all people have access to legal representation regardless of their economic status.

Marshall also began serving as counsel to the Baltimore chapter of the NAACP, and it was in that capacity that he handled his first important case, representing an aspiring African-American law student who was trying to get into the all-white University of Maryland law school. Marshall’s team won the case, and he was soon invited to work at the NAACP’s national headquarters in New York. In 1938 the organization named Marshall its chief legal counsel, and two years later he was named director of the newly created NAACP Legal Defense and Education Fund.

Marshall spent the next twenty years litigating civil rights cases across America—often under threat of physical violence—challenging the nation’s racist status quo. He was indisputably one of the key legal forces against racial discrimination in America, arguing thirty-two cases before the U.S. Supreme Court during the 1940s and 1950s and winning twenty-nine of them. Among his victories were cases dealing with white-only primary elections, discrimination in jury selection, and racist housing policies.

Brown v. Board of Education

The case that put Marshall in the national spotlight was Brown v. Board of Education of Topeka (1954), the landmark school desegregation case. In Brown, Marshall successfully argued before the Supreme Court that the “separate but equal” doctrine established in the 1896 Plessy v. Ferguson case was unconstitutional, on the grounds that separate educational systems were by their very nature unequal. The Court’s decision in Brown v. Board of Education began the slow, controversial, sometimes violent process of desegregating the nation’s schools. In 1955 Marshall’s first wife, the former Vivian Burey (better known as “Buster”), died after a long battle with cancer. About a year later, he married Cecilia Suyat, a secretary at the New York office of the NAACP.

Marshall continued to battle against discrimination over the next several years, working to integrate Little Rock Central High School in Arkansas and the University of Alabama in the late 1950s. When the Democrats gained the White House in the 1960 presidential election, Marshall made clear to party leaders that he was interested in a judicial appointment. He did not have to wait long to realize this desire. In 1961 President John F. Kennedy (1917–1963) nominated Marshall to be a federal judge on the U.S. Court of Appeals for the Second Circuit, which covers New York, Vermont, and Connecticut. The confirmation process stretched for nearly a year because of resistance from Southern segregationists, but he eventually prevailed. In his four years on the circuit court, Marshall wrote more than one hundred opinions, none of which was reversed by the Supreme Court. In 1965 President Lyndon Johnson (1908–1973) appointed Marshall solicitor general, the official who argues the federal government’s position in Supreme Court cases. In 1967 Johnson nominated Marshall to become the first African-American justice on the nation’s highest court. In spite of reluctance from Southern conservatives in the Senate—most notably Strom Thurmond (1902–2003) of South Carolina—Marshall was confirmed after two months, thus breaking the Supreme Court’s racial barrier.

U.S. Supreme Court

In his earliest years on the Supreme Court, Marshall fit right in among the liberals he joined, including Chief Justice Earl Warren (1891–1974). Over time, however, the Court grew more conservative as long-serving liberals retired and were replaced by justices appointed by Republican presidents Richard M. Nixon (1913–1994), Gerald R. Ford (1913–2006), Ronald Reagan (1911–2004), and George H. W. Bush (1924–). Marshall often found himself an isolated liberal voice on a court moving gradually but steadily to the right. In case after case he was the Court’s most consistent defender of society’s underdogs, and he was a fierce guardian of the rights to free speech and privacy. In 1969 Marshall wrote the Court’s opinion in Stanley v. Georgia, in which he and his colleagues overturned a man’s conviction for possessing obscene material in his own home. Marshall was also a strong opponent of capital punishment. One of his most important pieces of work was his sixty-page concurring opinion in Furman v. Georgia (1972), the case that struck down the death penalty as it was then applied, resulting in many death sentences being overturned.

By the 1970s Marshall had become quite reclusive. He gave very few interviews and did his best to avoid big public events. He spent his workdays in his chambers and most of his off time with his family: his wife, Cecilia, and their two sons, Thurgood Jr. and John. As the Court drifted to the right under Chief Justice Warren Burger (1907–1995) and his successor, William Rehnquist (1924–2005), Marshall clung to his liberal principles but more and more often found himself on the losing side of key decisions, along with his friend and fellow liberal Justice William J. Brennan Jr. (1906–1997). Supreme Court justices have traditionally restrained themselves in expressing their political views, but in the 1980s Marshall was unable to contain his contempt for Republican presidents Reagan and Bush. Dismayed by the Court’s reversals of important decisions related to affirmative action and minority set-aside programs, Marshall criticized these presidents sharply for their support of policies that he believed turned back the clock on civil rights, undoing much of the movement’s hard-won progress. So deep was his disapproval of Reagan that he was once quoted in Ebony as saying, “I wouldn’t do the job of dogcatcher for Ronald Reagan,” a burst of public disrespect rare for a justice.

Brennan’s retirement in 1990 left Marshall even more isolated as the lone remaining genuine liberal on the Supreme Court. Marshall, whose health had been deteriorating for several years, announced his own retirement a year later. His retirement may have had as much to do with his frustration level as with his medical problems. Less than two years later, on January 24, 1993, Marshall died of heart failure.

Earl Warren

Earl Warren’s (1891–1974) sixteen-year tenure as chief justice of the U.S. Supreme Court was marked by an unprecedented series of landmark decisions related to civil rights, freedom of speech, and other individual civil liberties. As detested by conservatives for his “judicial activism” as he was adored by liberals for championing progressive ideals, Warren led a Court that helped change American society during the 1950s and 1960s.

Warren was born on March 19, 1891, in Los Angeles, California. His father, a Norwegian immigrant, was a railroad worker. The family moved to Bakersfield, California, when Warren was a child. During his college years, Warren supported himself by working summers on a railroad crew. After receiving a bachelor’s degree from the University of California (UC) at Berkeley, Warren went straight into UC’s law school, earning a law degree in 1914. He enlisted in the U.S. Army in 1917, then returned to Northern California to practice law when his World War I tour of duty was over.

Early Career

In 1920 Warren took a job with the Alameda County, California, district attorney’s office, initially intending to stay just long enough to gain some legal experience. In 1925, however, he was elected to the first of his three terms as district attorney.

As district attorney, Warren, a liberal Republican, gained a reputation as a tough-but-fair prosecutor. In 1938 he was elected attorney general for the state of California. A few years later, Warren earned one of the few black marks on his career. Following the Japanese attack on Pearl Harbor in December 1941, Warren was one of the most prominent state officials calling for the internment of Japanese-Americans living on the West Coast, which history has judged to be a gross violation of the rights of those individuals, the majority of whom were American citizens.

Governor of California

In 1942 Warren was elected governor of California, winning by a considerable margin. He proved to be a popular governor, and was reelected in 1946 and 1950 with broad, bipartisan support. In 1948 he suffered the only electoral defeat of his career, running as the vice-presidential candidate on Thomas Dewey’s (1902–1971) Republican ticket, which lost to Harry Truman (1884–1972).

Warren was instrumental in garnering support for Dwight D. Eisenhower’s (1890–1969) nomination as the 1952 Republican presidential candidate. Eisenhower rewarded Warren by appointing him to replace U.S. Supreme Court Chief Justice Fred Vinson (1890–1953) when Vinson died unexpectedly in 1953.

Supreme Court

Warren was quickly faced with a case of historical importance in Brown v. Board of Education of Topeka. In that case, the Warren Court ruled unanimously that the “separate but equal” doctrine that had allowed states to maintain segregated schools was unconstitutional. The decision helped kick the civil rights movement into high gear, but it also infuriated segregationists in the South, many of whom called for Warren’s impeachment.

In addition to civil rights and segregation, the Warren Court also made its mark in the area of criminal justice with a series of key decisions. In Griffin v. Illinois (1956), the Court ruled that states had to furnish indigent defendants with free trial transcripts so they could pursue appeals. Mapp v. Ohio (1961) extended the exclusionary rule, which bars the use of illegally seized evidence, to state courts and officers. Similarly, Gideon v. Wainwright (1963) extended the right to legal counsel to state as well as federal criminal proceedings. Miranda v. Arizona (1966) required that suspects be informed of their rights to remain silent and to counsel before interrogation. Another key area in which the Warren Court made important rulings was the apportionment of representation in government. In Baker v. Carr (1962) and Reynolds v. Sims (1964), Warren and his colleagues held that state legislatures must be apportioned based on population rather than geography, a doctrine known as “one person, one vote.”

Warren Commission

In 1963 President Lyndon B. Johnson (1908–1973) appointed Warren to head a commission created to investigate the circumstances surrounding the assassination of John F. Kennedy (1917–1963). The Warren Commission, as it was popularly known, issued a massive report in 1964 in which the members concluded that Lee Harvey Oswald had acted alone in assassinating Kennedy, and that there was no convincing evidence of a broader conspiracy. The Warren Report has remained somewhat controversial since the day it was issued; a number of vexing questions about the assassination have never been answered to the satisfaction of many Americans.

Warren retired from the Supreme Court in July 1969 and was succeeded as chief justice by Warren Burger (1907–1995). Earl Warren died in 1974.

See also The Warren Commission

Fidel Castro

Fidel Castro (1926–) has ruled Cuba since 1959, when he led the revolution that ousted dictator Fulgencio Batista y Zaldívar (1901–1973) and formed the Western Hemisphere’s first communist government. One of the world’s most recognizable leaders thanks to his trademark bushy beard and military fatigues, Castro is revered by some as a man of the people who dramatically improved most Cubans’ standard of living, and loathed by others as a ruthless despot who brutally represses all opposition. Under Castro’s leadership, Cuba has remained steadfastly communist even after the disintegration of its longtime ally, the Soviet Union.

Fidel Castro Ruz was born on August 13, 1926 (or, according to some sources, 1927), in Cuba’s Oriente Province, and grew up on his family’s thriving sugar plantation. As a child, Castro had an unquenchable thirst for knowledge. He pestered his parents to provide him with a formal education, and though they had not planned on sending him to school, his persistence eventually paid off when they relented. Castro attended Jesuit schools in Santiago for several years before entering Belén College, an exclusive college preparatory high school in Havana, in 1941. He excelled in nearly every subject, and was a standout athlete as well.

Revolutionary Climate

In 1945 Castro began studying law at the University of Havana. Revolution was in the air at the university, and Castro quickly became interested in politics. He acquired a reputation as a fiery speaker and committed activist. In 1947 he left school temporarily to take part in an attempt to overthrow dictator Rafael Trujillo Molina of the Dominican Republic. The expedition was aborted at sea, and Castro was forced to swim ashore through shark-infested waters while holding his gun above his head. The following year, while attending a conference in Bogotá, Colombia, Castro took part in the Bogotazo, a series of riots that broke out in the wake of the assassination of Liberal Party leader Jorge Eliécer Gaitán.

Later that year, Castro married Mirta Díaz-Balart, and in 1949 the couple had a son, Fidelito. Fidel Castro completed his legal studies in 1950 and helped establish the law firm of Azpiazu, Castro y Rosendo. Much of his work involved representing poor and working-class individuals, often for free. Castro continued his political activity, joining the liberal Ortodoxo Party, which emphasized social justice and economic independence. In 1952 Castro began campaigning for a seat in the Cuban parliament. His run for office was cut short, however, when Batista, a general in the army, staged a coup that overthrew democratically elected president Carlos Prio Socarras.

Outraged, Castro officially challenged Batista’s actions by filing a petition with the Court of Constitutional Guarantees. The petition was rejected. Castro, however, was not ready to give up so easily. He formed a small band of about 165 rebels, and on July 26, 1953, launched an attack on the Moncada barracks in Santiago de Cuba, hoping the assault would spark a broader uprising in Oriente Province. The attack failed miserably; roughly half the participants were captured, and many were tortured or killed. Castro and his brother Raúl were taken into custody, and Fidel Castro was sentenced to fifteen years in prison.

Movement Regroups in Mexico

Castro was released under a general amnesty order in May 1955, having served only a year-and-a-half of his sentence. A few months later, Castro and a small number of followers left Cuba and regrouped in Mexico, where they formed the “26th of July” movement, named after the date of the failed Moncada barracks attack. There he met a young Argentine doctor and revolutionary named Ernesto “Che” Guevara (1928–1967). Together, they trained their small army in military tactics and skills.

Led by the Castro brothers and Guevara, a tiny army of eighty-two men returned to Cuba on an old yacht in December 1956. The ragtag force was met by Batista’s army almost the moment it landed on the north shore of Oriente, and more than half of the rebels were killed. The survivors, including the Castros and Guevara, retreated to the Sierra Maestra mountains. Over the next two years, Castro and his followers worked to build support among the local peasants, winning them over with promises of land reform. The revolutionaries spent most of their time avoiding Batista’s forces and engaging in occasional ambushes and small attacks. Meanwhile, word of Castro and his band of revolutionaries spread across Cuba, and revolutionary fervor began to arise among the nation’s poor.

Castro’s rebel movement began to gain the upper hand in early 1958. It helped that the United States, frustrated by the corruption and brutal tactics of the Batista government, suspended its military support of the regime. In December of that year, Guevara led an attack at Santa Clara on an armored train carrying government troops and weapons, dealing Batista’s army a devastating blow. Sensing imminent defeat, Batista fled the country on New Year’s Day 1959. Rebel forces entered Havana the next day, and Castro arrived about a week later to assume control of Cuba.

Socialist Reforms

To his supporters, Castro was a charismatic hero. To his enemies, he was a criminal. Some members of the Batista regime were arrested and executed. Castro began confiscating the property of Batista supporters, as well as the inherited wealth of Cuba’s elite families. In all, some 1,500 new laws and decrees were passed in the early period of the revolution. One law passed early in Castro’s rule, the Agrarian Reform Act, confiscated land from anyone with an estate of more than 1,000 acres. While wealthy Cubans were horrified by Castro’s moves to redistribute the nation’s wealth, the impoverished masses were elated. They were equally delighted by Castro’s campaigns to promote literacy and improve access to health care. These campaigns were highly successful: average life expectancy in Cuba rose dramatically, and the country now boasts one of the highest literacy rates in the Western Hemisphere. Schools and universities were built, and postgraduate education was made free.

Castro’s early socialistic policies and nationalistic rhetoric did not please the administration of U.S. President Dwight D. Eisenhower (1890–1969). U.S.-Cuban relations were strained further when Castro signed an agreement with the Soviet Union to trade sugar for oil. In response, the United States reduced its imports from Cuba, and in January 1961 Eisenhower cut off relations with Cuba completely. Cuba’s ties with the Soviets quickly intensified.

Eisenhower was naturally distressed by the presence of a Soviet ally so close to the U.S. mainland. This unease carried over into the Kennedy presidency. In April 1961 the United States supported a mission to invade Cuba and overthrow Castro that was conducted by a small army of Cuban exiles trained by the Central Intelligence Agency. Landing at the Bay of Pigs on Cuba’s southern coast, the exile army was quickly overwhelmed by Castro’s military, and the invasion ended in a colossal failure. The event was a major embarrassment to the United States, and solidified support for Castro within Cuba. Following the Bay of Pigs debacle, Castro officially proclaimed Cuba to be a communist state with a Marxist-Leninist orientation akin to that of the Soviets.

October 1962: Cuban Missile Crisis

Relations between Castro and the Americans continued to deteriorate. In October 1962 U.S. intelligence detected the presence of powerful Soviet missiles in Cuba. The resulting showdown, known as the Cuban missile crisis, brought the United States and the Soviet Union to the brink of nuclear war, as the United States imposed a blockade on the island and both superpowers issued ultimatums to each other. The Soviets eventually agreed to remove their Cuban-based missiles after the United States agreed to various concessions negotiated in secret meetings. The crisis brought Cuba even closer to the Soviet Union and isolated it from most of its neighbors. The Organization of American States (OAS) had already suspended Cuba’s membership earlier in 1962, and in 1964 it cut off all diplomatic and trade relations with the island. Among OAS nations, only Mexico opted not to participate in the boycott.

While he was reforming Cuba’s economy—nationalizing its industries, centralizing economic planning, and seizing government control of the media—Castro also resorted to repressive tactics to maintain social order. Tens of thousands of political prisoners were locked up during the 1960s, as were thousands more simply for being homosexuals or for being artists or intellectuals whose work was deemed subversive.

Castro also sought to export his revolution to other countries. In 1966 he helped found the Organization of Solidarity of the People of Asia, Africa and Latin America, which supported anti-imperialist struggles in Third World countries all over the world. In the years that followed, he sent Cuban personnel to fight in Angola’s civil war, to assist Ethiopia’s efforts to repel attacks from Somalia, and to support guerrilla movements in several Latin American countries, including the Sandinistas in Nicaragua.

Castro’s hold on the reins of power in Cuba has not been seriously threatened since the 1959 revolution. As the Soviet Union reformed its economic system and dismantled itself in the late 1980s and early 1990s, Castro clung to his socialist policies in Cuba. In 2006 serious illness forced him to transfer leadership duties to his brother Raúl. With Castro entering his eighties, it was unclear whether he would ever again be healthy enough to fully resume his role as president.

See also The Cuban Missile Crisis

The Mariel Boatlift

During the first decade of Fidel Castro’s (1926–) rule in Cuba, very few people were allowed to leave the country permanently. It was not until 1980 that a window of opportunity opened for those seeking to emigrate from Cuba. Over several months during that year, about 125,000 Cubans fled their homeland and crossed the Straits of Florida on boats of all shapes, sizes, and levels of safety to resettle in the United States. Known as the Mariel boatlift, the mass exodus carried these migrants from the port of Mariel on the northern coast of Cuba between April and October 1980.

With Cuba suffering myriad social problems in 1980, including housing and job shortages and a stagnant economy, thousands of Cubans gathered at the Peruvian Embassy in Havana in hopes of gaining permission to leave the country. As the number of would-be emigrants swelled to over ten thousand, most of them lacking food and water, Castro came up with a novel way to deflect the embarrassment the situation was causing: he announced that anybody who wanted to leave Cuba was free to go, a major departure from the nation’s longstanding, rigid emigration policies. Beginning on April 15, a flotilla of watercraft, many of them arriving from the Miami area, where a large number of Cuban exiles already lived, began transporting Cubans across the narrow channel between Cuba and South Florida.

The sudden influx of so many Cuban immigrants placed a huge burden on the U.S. immigration system. Many of the newcomers were detained in American prisons, some because they had criminal backgrounds, others simply because they did not have an American sponsor. Having grown up in Cuba’s socialist climate, the “marielitos,” as they came to be called, generally expected the state to take care of many of their needs, such as housing and health care. Nevertheless, while U.S. President Jimmy Carter (1924–) released several million dollars in emergency aid for the refugees, most of them found that they were on their own. Even the established Cuban population in South Florida was less than enthusiastic about receiving this sudden wave of immigrants, in part because the majority were young, poor, dark-skinned, and ill-equipped to successfully plunge directly into the American economy. The media tended to exaggerate the criminal histories of the marielitos; in fact only a small percentage had records that warranted incarceration in the United States. And contrary to the fears of Miamians, the big influx of unskilled workers did not have a negative impact on the local economy in the long run.

John F. Kennedy

John F. Kennedy (1917–1963), the thirty-fifth president of the United States, became the most recent U.S. head of state to die in office when he was assassinated on November 22, 1963. Kennedy was the youngest American president ever elected, as well as the youngest to die. He was also the first Roman Catholic to occupy the White House.

Privileged Childhood

John Fitzgerald Kennedy was born on May 29, 1917, in Brookline, Massachusetts, just outside of Boston. He was raised at the Kennedy family’s homes in New York and Hyannis Port, Massachusetts, on Cape Cod. He was second-oldest among Joseph and Rose Kennedy’s nine children. The Kennedy clan was immensely wealthy and immensely powerful politically. His paternal grandfather, Patrick Joseph Kennedy, rose from poverty to become a successful liquor importer, and translated his business achievements into political clout, as he became an important backroom power broker in the Boston area and eventually a member of the Massachusetts legislature. His maternal grandfather, John Francis Fitzgerald, was also a political figure, serving three terms in Congress and two terms as mayor of Boston.

Patrick Kennedy’s son Joseph—John F. Kennedy’s father—built on his father’s success through well-timed investments in a range of industries, including banking, shipbuilding, motion pictures, and the family’s main line, liquor. These investments made him one of the wealthiest men in America. Joseph Kennedy used his vast wealth to support the presidential candidacy of Franklin Roosevelt (1882–1945) in 1932, and was rewarded with a series of appointments to coveted government posts, including chair of the U.S. Securities and Exchange Commission in 1934, chair of the U.S. Maritime Commission in 1937, and finally the one he sought most, the ambassadorship to Great Britain in December 1937.

John F. Kennedy, who went by “Jack,” grew up in the shadow of his older brother, Joe Jr. It was Joe Kennedy who was the apple of his father’s eye, destined to carry on the family legacy of power and prominence. Jack followed Joe, two years his senior, through the prestigious prep school Choate and into Harvard University. Jack struggled with various medical problems, in contrast to his strong, healthy brother. Jack Kennedy graduated cum laude from Harvard in 1940. With the onset of World War II, he enlisted in the U.S. Navy, initially serving as an intelligence officer in Washington, D.C. After the Japanese attack on Pearl Harbor in December 1941, Kennedy was reassigned to sea duty on a PT boat—a small, fast vessel used to torpedo larger surface ships—in the Pacific. He was given his own PT boat to command in March 1943. A few months later, his boat, PT109, was rammed and sunk by a Japanese destroyer. Kennedy swam three miles to safety, towing an injured crewmate along with him. Though Kennedy tended to play down his act of heroism, he was honored for his courageous act, and it became part of the Kennedy aura that would serve him well during his political career.

Early Political Career

The following year, his brother Joe was killed while flying a dangerous mission over Europe. The entire family was devastated. Joseph Sr., who had long groomed Joe Jr. to go into politics, tapped Jack to take his brother’s place. Supported by his father’s financial fortune and political connections, Kennedy ran for a seat in the U.S. House of Representatives in 1946. He was a political novice, but won the election handily, taking the congressional seat held by his grandfather John “Honey Fitz” Fitzgerald a half-century earlier.

In the House of Representatives, Kennedy did nothing that stood out among his peers. In spite of the Kennedy family’s privileged economic status, he tended to support policies that favored workers and ordinary people. In 1947 Kennedy was diagnosed with Addison’s disease, a degenerative condition that affects the adrenal glands. The condition improved in 1950, when he began receiving cortisone treatments. Meanwhile, he was reelected to the House twice. In 1952 Kennedy ran for U.S. Senate, taking on the incumbent Republican Henry Cabot Lodge (1902–1985). The Republicans rode a wave of anticommunist hysteria to victory in most congressional races that year; but on the strength of his personal charisma and the growing Kennedy mystique, Kennedy was able to score a rare victory for the Democrats in that race.

During his first year as a U.S. senator, Kennedy married Jacqueline Bouvier. In the Senate, Kennedy was assigned to several key committees: the Labor and Public Welfare Committee, the Government Operations Committee, the Select Committee on Labor-Management Relations, the Foreign Relations Committee, and the Joint Economic Committee. From those positions, he was able to guide through the legislative process several bills designed to help the Massachusetts fishing and textile industries.

Profiles in Courage

During the mid-1950s, Kennedy began to suffer intense pain from an old back injury, for which he underwent dangerous surgeries in 1954 and 1955. The surgeries were not entirely successful; Kennedy lived with chronic back pain for the rest of his life. As he recuperated from surgery, he spent several months writing a book about U.S. senators who had taken courageous stands and demonstrated remarkable integrity. The book, Profiles in Courage, was published in 1956 and awarded a Pulitzer Prize the following year.

Despite his physical struggles, Kennedy was considered a rising star in the Senate by this time. He was nearly nominated to be the Democratic Party vice-presidential candidate in 1956, but narrowly lost the nomination battle to Senator Estes Kefauver of Tennessee. By 1960 Kennedy was in a position to vie for the top spot on the Democratic ticket. Running on a campaign theme of achieving “national greatness” through sacrifice, and arriving at a “New Frontier,” Kennedy coasted through the Democratic convention and was named the party’s presidential candidate.

Kennedy’s opponent in the 1960 presidential election was Richard M. Nixon (1913–1994), who was running on his reputation as a fierce Cold Warrior, tough on communism and hostile to the proliferation of costly domestic programs. This was the first election in which television played an important role, and Kennedy benefited mightily from four televised debates at which he appeared to be the more poised, sophisticated, and charismatic of the two candidates. Kennedy prevailed in what was one of the closest presidential elections in the nation’s history. He won in part by carrying 68 percent of the African-American vote, more than three-quarters of the Catholic vote, and, largely thanks to his savvy selection of Texan Lyndon B. Johnson (1908–1973) as his running mate, by winning a handful of Southern states.

Kennedy Presidency

As president, Kennedy was immediately faced with a number of challenges associated with the Cold War. In 1959 Fidel Castro (1926–) had become leader of Cuba after successfully toppling President Fulgencio Batista y Zaldívar (1901–1973). As Castro began forging close ties with the Soviet Union, American leaders grew increasingly nervous about the presence of a communist client state so close to American soil. Kennedy inherited a plan, first conceived by the Central Intelligence Agency (CIA) under the Eisenhower administration, to support with money, weapons, and training an invasion of Cuba by Cuban exiles living in the United States, in hopes of ousting Castro from power. In April 1961 Kennedy authorized the execution of the plan. The invasion, focused at a site called the Bay of Pigs, failed miserably and was a major embarrassment for the young Kennedy administration.

Another crisis involving Cuba emerged the following year, when American intelligence detected signs that the Soviets were deploying powerful missiles in Cuba that were capable of carrying atomic warheads to targets across the United States. A high-stakes showdown between the United States and the Soviet Union ensued, as threats and ultimatums were exchanged, bringing the world to the brink of nuclear war. This time, Kennedy was able to avert a catastrophe. A flurry of secret, high-level negotiations resulted in a deal under which the Soviets agreed to remove their Cuban-based missiles in return for a U.S. promise not to invade Cuba. Undisclosed at the time was another part of the agreement: the United States would remove its missiles based in Turkey.

No sooner was the Cuban missile crisis defused, however, than Vietnam began to emerge as the nation’s next Cold War crisis. Determined to resist efforts by North Vietnam to unify Vietnam under communist rule, Kennedy increased the number of military advisers there, creating the conditions that would explode into full-blown U.S. participation in the Vietnam War under Kennedy’s successor Lyndon Johnson.

Despite these crises, the Kennedy years are generally remembered as a time of great prosperity and hope in America. Among Kennedy’s historic accomplishments were the creation of the Peace Corps and the Agency for International Development. He was also supportive, if not aggressively so, of the civil rights movement. On a handful of occasions Kennedy intervened when Southern officials and establishments sought to prevent racial desegregation from taking place as mandated by U.S. Supreme Court rulings.

Kennedy Assassination

Kennedy was assassinated on November 22, 1963, while touring Dallas, Texas, in a motorcade. The presumed assassin, Lee Harvey Oswald, was himself gunned down two days later, while in police custody, by Dallas nightclub owner Jack Ruby. A number of questions lingered in the wake of the murders, the most important being whether Oswald had acted alone or been part of a broader conspiracy. Incoming president Johnson convened a special panel to investigate the strange circumstances surrounding the Kennedy assassination. The panel, led by U.S. Supreme Court Chief Justice Earl Warren (1891–1974) and generally referred to as the Warren Commission, released an 888-page report in September 1964, in which it concluded that there was no persuasive evidence of a conspiracy and Oswald had in fact acted alone.

The conclusions of the Warren Commission remain controversial today, and many Americans remain convinced that a conspiracy was at work. Regardless of how the assassination was orchestrated, Kennedy’s death is widely perceived as a transforming moment in American history. Over the next several years, a general mood of optimism gave way to one of conflict and upheaval, with the escalation of U.S. involvement in Vietnam, the rise of racial strife in many cities, and other divisive social phenomena taking hold.

See also The Cuban Missile Crisis

See also The Warren Commission

The Kennedy Curse

Fabulous wealth and power have not spared the Kennedy family from tragedy. In fact, the Kennedy clan has suffered through an extraordinary number of tragic events over the years, leading some observers inclined toward superstition to believe the Kennedys are “cursed.”

The so-called curse goes back to the 1940s. Rosemary Kennedy, the sister of President John F. Kennedy (1917–1963), suffered from violent mood swings. She was given an experimental treatment called a lobotomy, which involved blindly poking an instrument into a patient’s brain. The results in her case were terrible, and she remained institutionalized for the rest of her life. She died in 2005.

In 1944 Joe Kennedy Jr., President Kennedy’s older brother, was killed over the English Channel while flying a mission during World War II. Four years later another sister, Kathleen Kennedy Cavendish, died in a plane crash in France. In the mid-1950s, as John F. Kennedy was making his name as a U.S. senator and struggling with severe back ailments, his wife Jacqueline had a miscarriage one year and gave birth to a stillborn daughter the next. In August 1963 John and Jacqueline had a son, Patrick Bouvier Kennedy. He was born nearly six weeks premature, and died two days after his birth. John Kennedy was assassinated that November.

In 1964 one of Kennedy’s younger brothers, Edward “Ted” Kennedy (1932–), was severely injured in a plane crash that killed one of his aides and the pilot. He spent weeks in the hospital recovering from injuries that included a punctured lung, broken ribs, and internal bleeding. Another brother, Robert F. Kennedy (1925–1968)—the former attorney general of the United States, whom many thought to be the leading contender for the 1968 Democratic presidential nomination—was assassinated while leaving an event celebrating his victory in the California Democratic presidential primary in June 1968.

The following year, Ted Kennedy drove a car off a bridge after a party on Chappaquiddick Island near Martha’s Vineyard in Massachusetts, killing his passenger Mary Jo Kopechne, a former aide to Robert Kennedy. Ted Kennedy escaped serious injury, but questions surrounding the event may have thwarted his own future presidential aspirations. In 1999 John F. Kennedy Jr., his wife, Carolyn Bessette-Kennedy, and her sister were killed when the plane Kennedy was piloting crashed into the Atlantic Ocean.

Amazingly, even this list is incomplete. Similar tragedies have befallen distantly related members of the clan, and the family has experienced a variety of additional, less dramatic misfortunes over the years.

William Brennan

During his thirty-four years on the U.S. Supreme Court, Justice William J. Brennan Jr. (1906–1997) was one of the nation’s most consistent and passionate champions of individual liberties. A staunch advocate for racial and gender equity, Brennan was undoubtedly one of the most influential justices in the nation’s history. As a mainstay of the Court’s liberal wing over a period that spanned eight presidencies and saw sweeping social and political changes, Brennan wrote landmark opinions on such critical issues as freedom of speech, civil rights, and separation of church and state. Brennan wrote more than 1,200 opinions while serving on the Supreme Court, more than any other justice except William O. Douglas (1898–1980).

Son of a Working-Class Politician

Brennan was born on April 25, 1906, in Newark, New Jersey. He was the second of eight children born to William J. Brennan Sr. and Agnes McDermott, both Irish immigrants. William Sr. was a coal shoveler in a brewery before becoming active in politics, first as a labor union official and later as a member of the Newark City Commission. He eventually became police commissioner and, finally, director of public safety, one of the most powerful posts in Newark’s municipal bureaucracy.

After graduating with honors from the University of Pennsylvania, William Jr. moved on to Harvard Law School, where he served as president of the student Legal Aid Society. He graduated from law school near the top of his class in 1931. Returning home to Newark, Brennan took a job with the corporate law firm of Pitney, Hardin and Skinner, becoming a partner in 1937. In this position, Brennan often found himself arguing on behalf of corporate management, which contradicted his natural inclination, inherited from his labor activist father, to champion the interests of underdogs and regular working folks.

Brennan served in the U.S. Army during World War II, working on labor and procurement matters, and was discharged with the rank of colonel. After the war, he returned to the law firm, where two senior partners had passed away, leaving Brennan as top manager. Brennan soon became involved in efforts to reform the New Jersey judicial system. In 1949 he was appointed to the New Jersey Superior Court, where he later served in its appellate division, and in 1952 he was named to the state’s Supreme Court.

U.S. Supreme Court

When Justice Sherman Minton (1890–1965) announced his retirement from the U.S. Supreme Court in 1956, President Dwight D. Eisenhower (1890–1969) sought to replace him for political reasons with a fairly young, sitting judge who was Catholic and a Democrat. Brennan, who had already impressed Attorney General Herbert Brownell, fit the profile precisely. Brennan was appointed to the Supreme Court on October 16, 1956, and confirmed by the Senate the following March, with only Republican Senator Joseph McCarthy (1908–1957) of Wisconsin voting against his confirmation.

During his first thirteen years on the Court under Chief Justice Earl Warren (1891–1974), with whom he had a very close working relationship, Brennan was a key member of the Court’s liberal majority. Brennan wrote his first landmark opinion in 1962 in the case Baker v. Carr. In this ruling, the Court held that cases challenging unequal legislative apportionment could be heard in federal court, leading to the series of “one person, one vote” reapportionment cases that essentially revolutionized electoral districting and redistricting in the United States. The following year, Brennan—one of the most religious justices on the Court—made a strong statement for the strict separation of church and state in his seventy-page concurring opinion in Abington School District v. Schempp and Murray v. Curlett, which held that state-mandated Bible reading and recitation of the Lord’s Prayer were unconstitutional. In 1964 Brennan wrote the Court’s unanimous opinion in the landmark First Amendment case New York Times v. Sullivan. This case established that the press was free to criticize public officials unless the statements were made with “actual malice,” that is, with knowledge that they were false or with reckless disregard for the truth.

Endured Court’s Rightward Swing

Brennan’s role on the Court was diminished slightly when Warren was replaced as chief justice by Warren Burger (1907–1995), with whom Brennan did not enjoy such a close friendship. Nevertheless, he remained the Court’s most consistent liberal voice and vote, usually in alliance with Thurgood Marshall (1908–1993). In 1970 Brennan wrote the majority opinion in Goldberg v. Kelly, a case in which the Court ruled that states cannot cut off welfare benefits without first giving the recipient a hearing.

During the second half of Brennan’s tenure on the bench, the Supreme Court became much more conservative. President Ronald Reagan (1911–2004) and his allies on the right criticized Brennan for using the Court to effect social changes they believed were more appropriately handled through legislation, if at all. He saw many of the progressive changes he had been part of undone by the conservative courts of the 1980s. One of Brennan’s last major cases was Texas v. Johnson (1989), in which he and others in the majority ruled that laws banning flag burning as a political statement were unconstitutional.

Brennan retired in 1990, leaving a judicial legacy matched by only a few other Supreme Court justices in the history of the nation. He died in 1997 in Arlington, Virginia.

Martin Luther King Jr.

Martin Luther King Jr. (1929–1968) was one of the most important leaders of the civil rights movement in the United States during the 1950s and 1960s. King advocated a nonviolent approach to protest, a philosophy influenced by the Indian leader Mahatma Gandhi (1869–1948). King’s passion and determination inspired countless Americans to action as the nation struggled with issues of social justice and racial equality.

King was born Michael King Jr. on January 15, 1929, in Atlanta, Georgia, the second child of Michael King Sr. and Alberta Williams King. Both King and his father later changed their names to honor Martin Luther, the religious leader who spearheaded the Protestant Reformation in the sixteenth century.

Early Life

King attended public schools in Atlanta. Upon graduation from high school, he enrolled at Morehouse College, a historically all-black liberal arts school in Atlanta. While still a student at Morehouse, King was ordained into the ministry of the National Baptist Church. He was also exposed for the first time to the philosophy of Mahatma Gandhi, the pacifist Indian leader. King graduated from Morehouse in 1948 and entered Crozer Theological Seminary in Pennsylvania. King graduated from Crozer in 1951, and then moved on to Boston University to begin working toward a Ph.D. in theology. During his time in Boston, King met Coretta Scott, a voice student at the New England Conservatory of Music. King and Scott were married on June 18, 1953.

In 1954, while he was still working on his doctoral dissertation, King was named pastor of the Dexter Avenue Baptist Church in Montgomery, Alabama. He was awarded his Ph.D. in June of the following year. It did not take long after King’s arrival in Montgomery for him to become a central figure in the civil rights movement. In December 1955 an African-American woman named Rosa Parks (1913–2005) was arrested for refusing to give up her seat on a Montgomery bus to a white man, as was required by local law. The African-American community was outraged by Parks’s treatment. King and other religious leaders and activists, including fellow Montgomery Baptist minister Ralph Abernathy (1926–1990), quickly organized a boycott of the segregated Montgomery bus system. As the leader of the boycott, King came under harsh criticism from proponents of segregation. At one point his home was firebombed, and he received a steady stream of threats. The boycott lasted for more than a year, and in the end, after the U.S. Supreme Court ruled that segregation on the city’s buses was unconstitutional, the bus company relented. Segregation on Montgomery’s buses was ended, and the success of the bus boycott brought national attention to King.

In January 1957 several dozen African-American religious leaders met in Atlanta to organize a permanent group that would work on civil rights issues. The group became the Southern Christian Leadership Conference (SCLC), and King was chosen as its first president. The following year, the SCLC launched the “Crusade for Citizenship,” an initiative to register thousands of new black voters across the South. In support of the project King traveled to cities all over the region to speak on the importance of voting in the overall campaign for social justice.

King moved to Atlanta in 1960 to become associate pastor at the Ebenezer Baptist Church. That year, he helped coordinate the “sit-in” movement, in which groups of African-Americans occupied “whites-only” lunch counters and other segregated venues. As the movement gained momentum, King encouraged the students at the core of the movement to remain independent of the SCLC, leading to the formation of the Student Nonviolent Coordinating Committee (SNCC, pronounced “snick”).

Freedom Riders

The sit-ins were highly successful in raising public awareness of segregation in public places. Their success led to the idea of bringing sit-ins to the interstate transportation system. The SCLC, SNCC, and the Congress of Racial Equality (CORE) formed a coalition, chaired by King, to organize a series of actions in which pairs of African-American and white volunteers would board interstate buses scheduled to travel through Southern states. These volunteers, called Freedom Riders, were brutally assaulted upon their arrival in some Southern cities. Nevertheless, the project was effective; as a result the Interstate Commerce Commission began enforcing existing laws prohibiting racial segregation on interstate buses.

Support for the civil rights movement continued to grow, as did the militancy of the resistance against it. Violence erupted repeatedly when schools began the process of court-ordered desegregation. In June 1963 King and 125,000 supporters marched in a Freedom Walk in the streets of Detroit, Michigan, and in August of that year twice that many activists of many races descended on Washington, D.C., for a gigantic rally, where King delivered his famous “I have a dream” speech. King’s efforts paid off a year later when President Lyndon B. Johnson (1908–1973) signed the Civil Rights Act of 1964 into law. In December of that year, King was given the Nobel Peace Prize in recognition of his inspiring leadership of the civil rights movement.

In spite of the passage of the Civil Rights Act, King found that much work remained to be done. African-American citizens were being denied their voting rights through a variety of underhanded mechanisms in many parts of the South. King and the SCLC decided to focus their efforts first on Selma, Alabama. To raise awareness of the issue, in 1965 King helped organize a march from Selma to Montgomery, Alabama. Alabama Governor George C. Wallace (1919–1998) ordered state troopers to stop the march, and many participants were beaten viciously. The march continued, however, with more than ten thousand people taking part. It helped promote the passage of the Voting Rights Act, which President Johnson signed in 1965.

Decline of Nonviolent Protest

In spite of this progress, the civil rights movement had begun to splinter by the mid-1960s. Some factions of the movement had lost patience with King’s nonviolent approach. SNCC, for example, no longer believed in nonviolence, and the Nation of Islam preached a distrust of all white people, even those who had been supportive of civil rights. Militant groups such as the Black Panthers emerged, and as racial violence ignited in major cities, many white liberals who had been active in the movement became disengaged.

Another significant factor was the emergence of the Vietnam War as the main target of political dissatisfaction among American liberals. In 1967 King became a vocal critic of American involvement in Vietnam, to the chagrin of some longtime civil rights crusaders who saw involvement in antiwar protest as a distraction from their main objective.

In March 1968 King traveled to Memphis, Tennessee, to support striking city sanitation workers. The protests erupted in violence and chaos, and the police responded with violence of their own. On April 3 King addressed a rally, calling on followers to remain committed to nonviolent protest. He also spoke of threats on his life, and the need for the movement to continue on the nonviolent path regardless of what happened to him. The following evening, King was shot to death as he stood on the balcony of his room at the Lorraine Motel. News of his death sparked rioting in more than one hundred cities around the United States. Thousands of people were injured and forty-six killed before order was restored more than a week later. Martin Luther King Jr. Day, established as a federal holiday in 1983, was first observed in 1986; it is celebrated on the third Monday of January each year.

See also The Civil Rights Movement

Bloody Sunday

One of the most dramatic moments in the civil rights movement took place in Selma, Alabama, in March 1965. The Civil Rights Act had passed the previous year, but it did nothing to explicitly ensure the voting rights of minorities. Throughout 1963 and 1964, Selma was a focal point for voter registration efforts by the Student Nonviolent Coordinating Committee (SNCC) and other groups. The segregationists who held power in Selma resisted these efforts forcefully. By early 1965, civil rights activists were pouring into Selma to help the cause. A protest march from Selma to Montgomery, the state capital, was planned to raise awareness of what was going on in Selma.

The first attempt at the march took place on Sunday, March 7, 1965. Over five hundred participants headed out of Selma on U.S. Highway 80. The marchers, led by Hosea Williams of the Southern Christian Leadership Conference and John Lewis of SNCC, made it only as far as the Edmund Pettus Bridge six blocks away before they were set upon by Alabama state troopers and officers from the Dallas County Sheriff’s Department wielding clubs, whips, and tear gas. The attack was brutal; seventeen marchers were hospitalized, and about sixty more were treated at the hospital and released. The event became known as Bloody Sunday.

Footage of the violence was broadcast nationwide, resulting in an outpouring of support for the movement. A second march, led by Martin Luther King Jr., was planned for March 9, but it was merely a prenegotiated symbolic crossing of the Pettus Bridge, and many participants were left confused and dissatisfied. Even the scaled-down second march was not without bloodshed; after the march a Unitarian minister named James Reeb was brutally assaulted by a group of racists and died of his injuries two days later.

After much political maneuvering, a third march was organized. This time the march was authorized by a judge, and state and local authorities were ordered not to interfere. About eight thousand people began the third march on Sunday, March 21, and over thirty thousand participated in it overall, including a number of celebrities, such as Harry Belafonte, Tony Bennett, and Leonard Bernstein. Even the successful third march was marred by deadly violence: a white Michigan housewife and mother named Viola Liuzzo was shot and killed in her car as she drove some African-Americans home from the historic march.

Days before the final march began, President Lyndon B. Johnson (1908–1973) had sent his voting rights bill to Congress. Later that year, it became the Voting Rights Act of 1965.

George C. Wallace

Before entering the 1968 presidential race as an independent candidate, George C. Wallace (1919–1998) was best known for his efforts as governor of Alabama to preserve racial segregation in his state. Wallace repeatedly attempted to defy the federal government’s attempts to integrate public schools and other institutions across Alabama. During his time as governor, Alabama was the epicenter of the civil rights movement, as activists repeatedly clashed, sometimes with violent results, with Wallace and his segregationist allies. Ironically, on most other issues Wallace was considered a moderate, even tilting toward the liberal side at times.

Early Years

George Corley Wallace was born on August 25, 1919, in Clio, Alabama, located in the rural southeastern corner of the state. His father was a cotton farmer, his mother a school music teacher. As a youth, Wallace was a skilled boxer, winning two state Golden Gloves titles while in high school. After graduating from high school, Wallace attended the University of Alabama, where he received a law degree in 1942. He married Lurleen Burns, a sixteen-year-old store clerk, the following year.

Wallace served in the U.S. Army Air Forces from 1942 to 1945, first as an airplane mechanic and later flying several missions over Japan during World War II. Returning from the war, Wallace immediately went into politics. In 1946 he landed a job as assistant state attorney general, and later that year he won election to the Alabama House of Representatives. In the state legislature, Wallace forged a reputation as a somewhat liberal Democrat, even on issues related to race. As a delegate at the 1948 Democratic National Convention, Wallace refused to join in the walkout orchestrated by fellow Southerners who splintered from the party and formed the Dixiecrats.

Wallace continued to represent Barbour County in the Alabama House until 1953, when he became a state circuit court judge in Alabama’s Third Judicial Circuit. He held that position until 1958. That year, Wallace launched his first campaign for governor, running as a racially tolerant moderate. His opponent was state attorney general John Patterson, who campaigned as a solid segregationist. Wallace was clearly the more progressive candidate; he even had the endorsement of the National Association for the Advancement of Colored People (NAACP). Wallace lost the Democratic primary in a runoff vote. This defeat marked a turning point in Wallace’s political career. In its aftermath, he resolved to never again lose an election by being “out-segged,” or appearing to be less supportive of segregation than his opponent.

Governor of Alabama

After spending four years in private law practice with his brother Gerald, Wallace ran for governor again in 1962, this time as a strict segregationist. His opponent was former governor—and Wallace’s former political mentor—James Folsom. This time Wallace had the backing not of the NAACP, but of the Ku Klux Klan. In his inaugural speech, he made clear his intentions with regard to race: “Segregation now! Segregation tomorrow! Segregation forever!”

Wallace’s first year as governor was marked by all-out war between segregationists and civil rights proponents. On June 11, 1963, Wallace personally blocked an entrance at the University of Alabama to prevent two African-American students from enrolling. He backed down only after being confronted by federal marshals and the federalized Alabama National Guard. That September, Wallace sent Alabama state troopers to various parts of the state to prevent African-American students from entering all-white public schools in several different cities. When Martin Luther King Jr. (1929–1968) and other activists organized a major protest march from Selma to Montgomery, Alabama, in 1965, Wallace ordered state troopers to block their way. The encounter erupted in violence, and emerged as a watershed moment in the civil rights movement.

Alabama’s state constitution prevented Wallace from serving two consecutive terms as governor, so in 1966 his wife Lurleen ran instead, winning easily. Wallace, officially her special assistant, essentially continued to call the shots. Lurleen died in office of cancer two years later. In 1968 Wallace made a serious run as an independent candidate for president of the United States. Appealing to voters fed up with high taxes and other forms of federal interference in their lives, he gained the support of many conservatives, both wealthy and blue collar, almost all white. Wallace captured more than 13 percent of the popular vote, and won outright in five Southern states.

Reversal on Racism

Wallace ran for governor again in 1970, and won handily. In 1972 he mounted another presidential campaign, this time running as a Democrat. Wallace was running well, with strong showings in early primaries, when he was shot by Arthur Bremer while campaigning in Maryland. Wallace suffered a severe spinal injury in the shooting and used a wheelchair for the rest of his life.

Wallace successfully ran for governor again in 1974, the state’s ban on consecutive terms having since been repealed, and, after four years out of office, won a final term in 1982. By then he had renounced his former racist positions and apologized publicly for his earlier segregationist policies. He appointed African-Americans to a number of high-ranking posts in his administration, and developed relationships with civil rights leaders. Citing health problems, Wallace declined to run for governor again in 1986. He died in 1998, at the age of seventy-nine.

See also The Civil Rights Movement

Malcolm X

Malcolm X (1925–1965) was a prominent Black Muslim minister and a leader of the black nationalist movement in the United States. A central figure in the fight for racial justice in America in the 1950s and the first half of the 1960s, Malcolm X preached an aggressive brand of black self-reliance and rebellion against white authority, in contrast to the more peaceful civil rights message Martin Luther King Jr. (1929–1968) was advocating at the same time.

Malcolm X was born Malcolm Little on May 19, 1925, in Omaha, Nebraska. His childhood was marked by ongoing trauma and violence. His father, Earl Little, was a Baptist minister and an organizer for Marcus Garvey’s (1887–1940) Universal Negro Improvement Association. As an outspoken advocate of black nationalism, Earl Little was a regular target for harassment by various white groups. The family lived under a constant threat of Ku Klux Klan violence, and eventually their home was burned down.

A Traumatic Childhood

The family moved to Lansing, Michigan, in 1929, where Earl Little continued to deliver rousing sermons that infuriated the white community. In 1931 Earl was found dead on the streetcar tracks, his body nearly severed in half and his skull caved in. The death was officially ruled an accident, but there was widespread speculation in the community that he had actually died at the hands of the Klan or another racist group. Malcolm’s mother, Louise Little, was devastated by the death of her husband. Unable to tolerate the strain of raising seven children on her own, she suffered a mental breakdown several years later and was committed to an institution. She remained institutionalized until 1963.

With their father dead and their mother incapacitated, Malcolm and his siblings were placed in different foster homes and state facilities. Even while bouncing between foster families over the next several years, Malcolm proved to be an excellent student. When he confided to a teacher his dream of becoming a lawyer, however, he was told to come up with a different goal that was “more realistic” for an African-American.

Disillusioned, Malcolm dropped out of school after eighth grade and moved to Boston, where he lived with his half-sister. In Boston, he worked at a series of menial jobs, including shoe shining and restaurant work. Gradually, he drifted into a life of petty crime. In about 1942 Malcolm moved to the Harlem neighborhood of New York City, where his criminal activities escalated. He began selling drugs and organized a gambling ring, operating under the nickname Detroit Red. In 1946 Malcolm was sentenced to ten years in prison for burglary. In prison, Malcolm began filling in the gaps in his education, absorbing books on religion, history, and philosophy. He became especially interested in the religion of Islam, and began studying the teachings of Nation of Islam leader Elijah Muhammad (1897–1975).

Nation of Islam

Upon his release from prison in 1952, Malcolm went to Chicago to meet Muhammad. He was quickly welcomed into the Nation of Islam, and given a new name, Malcolm X. The X symbolized a rejection of the “slave name” his family had been given upon arrival from Africa. Malcolm was appointed assistant minister of the Detroit mosque. A year later he went back to Chicago to study directly with Muhammad. He was then sent to Philadelphia to open a mosque there. In 1954 Malcolm X became leader of the movement’s flagship mosque in Harlem, where he became the most visible public face and voice of the Nation of Islam.

Malcolm’s charisma helped the Nation of Islam grow from a small fringe group of about four hundred to a movement more than one hundred thousand people strong by 1960. Malcolm preached a philosophy almost diametrically opposed to that of mainstream civil rights leaders such as King. He endorsed black separatism, urging African-Americans to defend themselves, with violence if necessary, against their white oppressors. This approach terrified not only most white Americans, but also many blacks, who feared he was stirring up a brutal race war that would result only in more bloodshed.

When President John F. Kennedy (1917–1963) was assassinated in November 1963, Malcolm described it as an instance of “the chickens coming home to roost,” meaning such events were deserved by a society that tolerated violence by the white majority against blacks. Muhammad quickly suspended Malcolm, ordering him not to speak publicly on behalf of the Nation of Islam for ninety days. In March 1964 Malcolm announced that he was breaking with the Nation of Islam completely and forming his own group, the Organization of Afro-American Unity.

No More “White Devils”

That spring, Malcolm embarked on a series of trips to Africa and the Middle East, including a pilgrimage to Mecca, Saudi Arabia, the holiest place in the world for Muslims. During the course of his travels, Malcolm was struck by the sight of Muslims of all skin colors worshipping together. He came to the realization that the Nation of Islam’s characterization of all whites as evil was wrong-headed. He sought to practice a more “pure” version of Islam, and changed his name to el-Hajj Malik el-Shabazz.

Upon his return, Malcolm became increasingly critical of Muhammad and the Nation of Islam. He disavowed his old rhetoric, which he now considered racist. He now believed that the enemy was racism itself, and pledged to work with progressively minded white leaders to improve race relations worldwide. His new philosophy still emphasized black pride and anticolonialism, but he now believed separatism was counterproductive. He was more interested in forging bonds of shared culture and heritage among black people worldwide than in encouraging them to shun their white neighbors. He also made a number of public accusations against Muhammad, claiming his former boss had engaged in numerous affairs with his young secretaries, as well as financial irregularities. These accusations made Malcolm a marked man. He began receiving a steady stream of death threats.

Hostilities between Malcolm and Muhammad continued to escalate through the rest of 1964 and into 1965. The threats on Malcolm’s life evolved into actual attempts. His home was firebombed on February 14, 1965. Exactly one week later, on February 21, Malcolm was shot more than a dozen times by three men who rushed the stage of the Audubon Ballroom in Harlem, where he was about to begin a speech to an audience of several hundred followers. Malcolm died at the hospital a short time later.

In the years that followed, responses to Malcolm’s life and death were mixed. His message of black pride and unity resonated strongly with a generation of young African-Americans who came of age during the 1960s. The more radical aspects of his teachings were taken up by members of the militant elements of the Black Power movement of the late 1960s and early 1970s, while the more moderate message Malcolm delivered in his later speeches influenced such mainstream figures as U.S. Supreme Court Justice Clarence Thomas (1948–), a political conservative by any measure.

See also The Civil Rights Movement

The Nation of Islam

The Nation of Islam was founded as the Allah Temple of Islam in 1930 in Detroit, Michigan. Its founder, Wallace D. Fard (1877?–1934?), showed up in a poor, black Detroit neighborhood as a peddler with a murky background. He began dispensing spiritual advice to his customers, informing them of their “true religion,” which was not Christianity but the Islamic faith practiced by dark-skinned people in Asia and Africa. As he gained followers, he set up a temple in a rented space. Fard taught his adherents that white people were “blue-eyed devils” who had come to power through violence and trickery. He established several special purpose branches: the University of Islam, to spread his teachings; the Muslim Girls Training, which instructed girls in their role as homemakers; and the Fruit of Islam, the sect’s paramilitary arm.

One of Fard’s early aides was Elijah Poole. Poole, who quickly became Fard’s most trusted assistant, took on the Muslim name Elijah Muhammad (1897–1975). Poole’s family had migrated to Detroit from the South in 1923. In 1934 Fard mysteriously disappeared, leading to a power struggle within the fledgling organization. Muhammad emerged from the tussle as leader of the Nation of Islam. He moved his family to Chicago in 1936, where he set up Temple of Islam No. 2. This eventually became the organization’s national headquarters. Muhammad led the Nation of Islam until his death in 1975. Under Muhammad’s leadership, the Nation of Islam developed a philosophy of self-reliance for black people. In addition to temples, the organization set up grocery stores, restaurants, and other small businesses in black neighborhoods, in part for their own economic gain and in part so that members would not have to patronize white-owned establishments.

After Muhammad’s death, one of his six sons, Wallace Muhammad (who later changed his name to Warith Deen Muhammad), was named the head of the Nation of Islam. Warith Deen Muhammad soon disavowed the organization’s position on the evil of white people, and moved the Nation of Islam toward a more mainstream form of Sunni Islam. He also changed the name of the organization several times. Warith’s reforms caused resentment among some Nation of Islam leaders, leading to a schism within the organization. In 1978 a group led by Louis Farrakhan (1933–) that sought to retain the organization’s separatist values reconstituted the Nation of Islam under its old name. Farrakhan’s Nation of Islam continues to thrive in many urban areas of the United States.

Robert McNamara

As secretary of defense under Presidents John F. Kennedy (1917–1963) and Lyndon B. Johnson (1908–1973), Robert McNamara (1916–) was one of the key advisers behind the escalation of the Vietnam War. He later became disillusioned with the war, and left that post in 1967. Soon after that, McNamara was named head of the World Bank, where he focused on promoting economic development in Third World countries. After leaving the World Bank in 1981, McNamara became a vocal critic of nuclear weapons proliferation, writing several books and articles on the subject.

Early Life

Robert Strange McNamara was born on June 9, 1916, in San Francisco, California, the son of a manager at a shoe wholesale company. He was a standout student in the public schools of Piedmont, California, and went on to attend the nearby University of California (UC) at Berkeley, majoring in philosophy and economics. He graduated from UC in 1937 and enrolled at Harvard University’s Graduate School of Business Administration, earning a master’s degree in business administration two years later.

After a year with the accounting firm Price Waterhouse & Company, McNamara returned to Harvard as an assistant professor of business administration. When the United States entered World War II, he taught a course for U.S. Army Air Forces officers under a special arrangement between Harvard and the U.S. Army. He also worked as a consultant to the army on statistical systems for handling military logistical problems. In 1943 McNamara took a leave of absence from Harvard and saw active duty with the Army Air Forces overseas—though poor eyesight prevented him from flying—until his release from service in 1946.

After the war, McNamara joined an elite team of statistical specialists that was hired by Ford Motor Company. He rose quickly up the ranks of Ford’s corporate bureaucracy, becoming company president in 1960, the first non-Ford family member ever to hold that position. After only about a month on the job, however, the call came from president-elect Kennedy offering him the top post at the U.S. Department of Defense.

Secretary of Defense

In January 1961 McNamara was sworn in as secretary of defense. Selected mainly for his business wizardry, he brought with him to the department many of the “whiz kids” with whom he had climbed the ladder at Ford. At the time, he knew very little about nuclear strategy and other key military issues. What he did know about was management and organizational efficiency, and he applied his expertise to what had become a bloated, dysfunctional bureaucracy.

McNamara’s relative inexperience in the politics of nuclear arms did not last long. Under President Dwight D. Eisenhower (1890–1969), the United States had become reliant on a policy of nuclear deterrence, the idea that a powerful nuclear arsenal would prevent aggression on the part of hostile countries. McNamara focused on changing the direction of the military, reinvigorating the nation’s conventional forces and placing less emphasis on nuclear deterrence. At the same time, however, political realities demanded that McNamara and the new president who had appointed him respond to the perceived “missile gap” between the U.S. arsenal and that of the Soviet Union. While McNamara promoted a doctrine of “flexible response,” meaning being prepared to respond to threats in a variety of ways beyond using nuclear weapons, he simultaneously presided over a massive buildup of nuclear arms.

Containing communism was the driving force behind most U.S. foreign policy decisions at the time, and McNamara became one of Kennedy’s key advisers on the military aspects of that objective. In 1961 McNamara was one of the voices arguing in favor of supporting the efforts of Cuban exiles to overthrow Cuban leader Fidel Castro (1926–). This support culminated in the failed Bay of Pigs invasion, a project McNamara later regretted as being doomed from the start. The following year, McNamara was part of the small committee of presidential advisers gathered to contend with the sequence of events that became known as the Cuban missile crisis. McNamara was an advocate of Kennedy’s policy to quarantine Cuba in order to stop the flow of additional weapons from the Soviet Union into the island nation.

Meanwhile, U.S. involvement in Vietnam increased under President Kennedy and, after Kennedy’s assassination in 1963, President Johnson. Under Johnson, McNamara was the chief engineer of U.S. endeavors in Vietnam. Journalists took to calling the conflict “McNamara’s War.” The expansion of the U.S. role in that conflict became the central issue of McNamara’s career. McNamara fully supported Johnson’s decision to increase the number of U.S. troops in Vietnam and to launch a massive bombing campaign there in 1965. By 1966 McNamara had become convinced that the United States could prevail in Vietnam by using its technological superiority to create an “electronic battlefield,” an idea that became known as the “McNamara line.”

World Bank

While McNamara continued to support the Vietnam War in his public statements, he gradually became disillusioned, and was expressing doubts about the war in private. By late 1967 McNamara had changed his mind about many of the policies he had previously supported, including the heavy bombing of North Vietnam. In November of that year he resigned from his post as secretary of defense. In April 1968 Johnson appointed McNamara president of the International Bank for Reconstruction and Development, better known as the World Bank. McNamara remained at the head of the World Bank until 1981. During his last year there, McNamara oversaw some 1,600 economic projects in more than 100 developing countries, with a total value of about $100 billion, though some critics of the World Bank argue that the conditions attached to the loans for these projects pinch domestic social spending in recipient nations and ultimately harm the poorest people who live there.

After he retired from the World Bank in 1981, McNamara continued to speak and write on a number of different issues, most notably economic development and poverty in developing countries, and nuclear policy. He became a vocal critic of nuclear proliferation, advocating for a dramatic reduction in the number of nuclear missiles in the arsenals of both the United States and the Soviet Union. In 1995 McNamara published a book, In Retrospect: The Tragedy and Lessons of Vietnam, in which he acknowledged that he and other officials had been “wrong, terribly wrong” about the rationale for U.S. involvement in the Vietnam War. Although he took responsibility in the book for some of the war’s negative impact, it came across as self-serving and shallow to some critics. A 2003 documentary by Errol Morris, The Fog of War: Eleven Lessons from the Life of Robert S. McNamara, shed additional light on McNamara’s ideas and activities during one of the most divisive periods in the nation’s history.

See also The Cuban Missile Crisis

See also The Vietnam War

Lyndon B. Johnson

Lyndon Baines Johnson (1908–1973), the thirty-sixth president of the United States, served in the White House from late 1963, when he took office after the assassination of John F. Kennedy (1917–1963), to January 1969, having declined to run for a second full term. Johnson presided over a nation struggling with an assortment of divisive issues, from civil rights to the conflict in Vietnam (1964–1975). Along the way, he launched a number of ambitious domestic initiatives aimed at defeating poverty in America in hopes of transforming the nation into a “Great Society.”

Early Years

Johnson was born on August 27, 1908, near Stonewall, Texas, the oldest of five children. His father, Sam Johnson, was a rancher who had served in the Texas legislature. The family moved to Johnson City, about fifteen miles from Stonewall, in 1913. Johnson graduated from high school there in 1924, before enrolling at Southwest Texas State Teachers College in San Marcos, Texas. While working toward his degree, Johnson earned money teaching at an all-Hispanic junior high school in Cotulla, Texas. He also became involved in student government activities at Southwest Texas State, and gained a reputation as a personable leader with a gift for motivating people to do what he wanted, a good formula for success in politics.

Upon graduating from college, Johnson accepted a teaching job at Sam Houston High School in Houston, where he taught an assortment of courses in such varied subjects as public speaking, geography, and math. Johnson proved to be an excellent teacher, but his interest in politics was stronger. In 1931 he accepted a position on the staff of newly elected congressman Richard Kleberg. Johnson essentially ran Kleberg’s Washington office, taking care of correspondence with constituents and dealing with lobbyists and other insiders. Johnson also became active in the “Little Congress,” an informal organization of congressional staffers. He was elected speaker of that body, and was able to use that position to leverage a certain amount of political power of his own. Over his four years in Kleberg’s office, Johnson learned the ins and outs of federal politics quite thoroughly. He also found time to court Claudia Alta Taylor, known to all as Lady Bird. The couple was married in 1934.

In 1935, through the connections he had developed in government, Johnson was appointed Texas director of the National Youth Administration, a project of the Works Progress Administration that provided relief funds and work opportunities for children and young adults. Two years later, the congressman representing Texas’s 10th District died unexpectedly. Johnson quickly added his name to the already crowded list of candidates vying in the special election to replace him. Johnson won the race, and at the tender age of twenty-eight became a member of the U.S. House of Representatives. He was reelected to serve a full term a year later. Another death, that of Texas senator Morris Sheppard, prompted Johnson to run for the U.S. Senate in 1941. Johnson lost a close race to Texas governor W. Lee O’Daniel, the only election he would ever lose; for years afterward, Johnson claimed the election had been stolen from him through underhanded tactics. Johnson, defeated but not demoralized, returned to Washington to serve out the remainder of his term in the House.

When the Japanese attacked Pearl Harbor in 1941, Johnson fulfilled a campaign promise by requesting active duty in the U.S. Navy. He served as a lieutenant commander in the Pacific, participating in a bombing mission over New Guinea before President Franklin Roosevelt (1882–1945) called all members of Congress in the military back to Washington in 1942.

Senate Leadership

Johnson’s second bid for a Senate seat was successful, albeit controversial. Learning from the dirty tricks of his opponent in his previous Senate attempt, Johnson reportedly employed several underhanded tactics during his 1948 Senate campaign, including vote fraud, voter intimidation, bribery, and mudslinging. Johnson prevailed by just eighty-seven votes over popular former Texas governor Coke Stevenson, earning the ironic nickname “Landslide Lyndon” in reference to his slim margin of victory. After Johnson’s death it was established fairly conclusively that this election was fixed.

It quickly became clear that Johnson was not a typical Senate rookie. He arrived with a deep understanding of both the key issues of the day and the unwritten rules of Senate politics. He used his political connections and savvy to land prime jobs and contracts for businessmen who had supported his campaigns for office. In 1951 he was named Democratic Party whip, a rare honor for a newcomer to the Senate. Two years later he was selected minority leader, and when the Democrats seized control of both houses of Congress in 1954, Johnson became the youngest Senate majority leader in the nation’s history. As the most powerful Democrat in the country, Johnson worked closely with Republican president Dwight D. Eisenhower (1890–1969), and was instrumental in the passage of many of the most important pieces of legislation of the 1950s, including the extension of Social Security benefits, the 1956 highway bill, the Civil Rights Act of 1957, and the omnibus housing bill also passed in 1957.

By 1960 it was clear that Johnson could be a serious contender for the presidency. Johnson briefly mounted a campaign, but it quickly became apparent that Kennedy had more support. Since Kennedy was a New Englander, it made sense for him to seek a running mate from the South to help attract votes from that region, and Johnson’s Texas roots seemed to make him the ideal candidate to add to the ticket as vice president. With Johnson’s help, Kennedy defeated Republican candidate Richard Nixon (1913–1994) in one of the closest presidential elections in history.

Vice Presidency

As vice president of the United States, Johnson took on a much bigger role than was traditionally associated with that post. As a Senate leader a few years earlier, he had helped shepherd through Congress the legislation that created the National Aeronautics and Space Administration, and space exploration remained a keen interest. Kennedy rewarded that enthusiasm by appointing Johnson chairman of the Space Council, a position from which he could closely monitor the progress of the budding space program. Johnson also took an active role in the civil rights efforts of the Kennedy administration. Early in his career, Johnson had supported segregation, a more or less necessary position for a politician from Texas to take in the 1930s and 1940s. By the 1950s, however, he had shifted his position, and was now a supporter of civil rights initiatives, making him a reasonable choice to head the newly formed President’s Committee on Equal Employment Opportunity, a federal program aimed at expanding job opportunities for African-Americans.

Johnson made substantial progress in these roles—especially civil rights, where he helped lay the groundwork for the landmark legislation to come a few years later—but as vice president serving under a young, enormously popular president, it seemed unlikely that Johnson would ever rise any higher than his current job. That changed suddenly and horribly on November 22, 1963, when Kennedy was assassinated. Johnson was instantly thrust into the spotlight as leader of the world’s most powerful country.

The Johnson presidency was marked by important achievements at home—exemplified by the Civil Rights Act of 1964 and the creation of a variety of programs designed to combat poverty in the United States—which in the end were all overshadowed to some degree by the escalation of U.S. involvement in Vietnam. In his first State of the Union address, Johnson declared a “War on Poverty,” and made that the cornerstone of his domestic agenda. He successfully pushed the Economic Opportunity Act of 1964 through Congress. The act created the Office of Economic Opportunity, home to such new antipoverty programs as Job Corps, Head Start, Community Action, and Volunteers in Service to America. Through these programs and others, Johnson sought to attack the root causes of the nation’s domestic problems, including illiteracy, joblessness, urban decay, and inadequate public services. Taken together, these initiatives were designed to move America closer to Johnson’s vision of the “Great Society,” a concept he outlined in a May 1964 speech in Ann Arbor, Michigan. In that address, he said,

We are going to assemble the best thought and the broadest knowledge from all over the world to find those answers for America. I intend to establish working groups to prepare a series of White House conferences and meetings—on the cities, on natural beauty, on the quality of education, and on other emerging challenges. And from these studies we will begin to set our course toward the Great Society.

The breadth of legislation enacted to help the poor during Johnson’s first term rivaled that of Franklin Roosevelt’s New Deal initiatives a few decades earlier.

Vietnam War Drags On

On the strength of these achievements, Johnson cruised to an easy victory in his 1964 presidential bid against conservative Republican Barry Goldwater (1909–1998), one of the most lopsided races in the nation’s history. Johnson leveraged this popularity by pushing vast amounts of legislation through Congress. Federal spending on education and health care increased dramatically during the Johnson presidency. The Civil Rights Act of 1964 was followed by the Voting Rights Act of 1965, which prohibited discrimination in the electoral process. Blue-collar wages increased during these years, and the United States enjoyed several consecutive years of economic growth.

Johnson was unable to savor his domestic accomplishments, however. In August 1964 it was reported that two U.S. destroyers in the Gulf of Tonkin off the coast of North Vietnam had come under unprovoked attack. Johnson took this news—which later turned out to be distorted and overblown—as justification for a strong military response. He quickly authorized retaliatory air raids on selected North Vietnamese targets. He then persuaded Congress to pass the Gulf of Tonkin Resolution, which essentially gave Johnson a blank check to escalate American involvement in Vietnam as he saw fit in order to repel North Vietnamese aggression “by all necessary measures.”

The Vietnam War—technically a “police action” rather than a war, since war was never formally declared—quickly became an all-consuming issue for Johnson and the nation. As the conflict dragged on, Johnson sent increasing numbers of young Americans overseas to fight a war that began to seem unwinnable. Opposition to the war grew with every television newscast showing dead and injured Americans. By 1968 a large percentage of the American public, including many returning Vietnam veterans, was fed up with the quagmire Vietnam had become. In the face of massive protest, Johnson announced in March 1968 that he would not seek reelection that November. He served out the rest of his term quietly, then retired to his ranch near Stonewall, Texas, where he tended to cattle and worked on his memoirs. He died of a heart attack in 1973.

See also The Model Cities Program

See also The Vietnam War

The Great Society

As the nation mourned the death of President John F. Kennedy (1917–1963), his successor, Lyndon B. Johnson (1908–1973), committed himself to following through with many of Kennedy’s domestic proposals. The time was ripe for such a push. The economy was relatively stable, and Americans’ emotions were running high in the aftermath of the assassination of the popular president. Seizing the initiative, Johnson proposed a sweeping array of laws covering poverty, employment, civil rights, education, and health care. Together, these proposals represented steps toward establishing what Johnson dubbed the “Great Society,” a term he coined in a May 1964 speech in Ann Arbor, Michigan.

In many ways the Great Society mirrored the New Deal enacted by Franklin Roosevelt (1882–1945), one of Johnson’s political heroes. The main difference was that the New Deal was a response to the Great Depression, a time when a large portion of the American public was suffering from the effects of a collapsed economy; the Great Society initiative, in contrast, came at a time when the American economy was strong. It was instead an attempt to spread the nation’s wealth more equitably, reaching corners of the population that the prosperity of the time had bypassed.

The list of programs and agencies that were created during the War on Poverty waged by Johnson includes many that now form the heart of the nation’s social safety net. Medicaid and Medicare were founded to ensure access to health care for the poor, the elderly, and the disabled. The U.S. Department of Housing and Urban Development was established to improve the condition and supply of the nation’s housing stock, especially in major cities. Head Start was created to improve school readiness among disadvantaged children. On the cultural front, the National Endowment for the Humanities and the Corporation for Public Broadcasting emerged as contributors to the development of a Great Society. As the federal government became more conservative during the 1980s and 1990s, funding for many of these programs dwindled, with many politicians turning to private sector solutions to social problems and championing the notion of “small government.”

Betty Friedan

With the publication of her 1963 book The Feminine Mystique, Betty Friedan (1921–2006) effectively launched the modern feminist movement. A few years after the book came out, Friedan helped found the National Organization for Women (NOW), the most visible and effective political arm of the feminist movement.

Background

The United States of the late 1940s and 1950s was a place where women were expected to assume the traditional roles of wife, mother, and homemaker. During World War II, when a large percentage of the nation’s men were fighting overseas, women took their places in many crucial jobs that had previously been filled almost exclusively by men. After the war, women were encouraged and expected to leave the workforce and return to their former domestic roles. Higher education was an afterthought for most women, and political activity was considered unseemly. By the 1960s, increasing numbers of women had grown disenchanted with the role that was being forced on them by society. They wanted to enjoy the same opportunities as their male counterparts—the right to pursue a top-notch education, a rewarding career, and political power. During that decade, the feminist movement emerged, giving voice to the collective frustration of American women who wanted more out of life than to be defined only as a wife or mother. Friedan was one of the early leaders of the movement.

Friedan was born Betty Naomi Goldstein on February 4, 1921, in Peoria, Illinois. Her parents, Harry and Miriam Goldstein, were the children of Jewish immigrants from Eastern Europe. In high school, Friedan founded a literary magazine, and she graduated at the top of her class. After high school, she attended Smith College, graduating summa cum laude in 1942. Friedan was offered a fellowship to pursue a doctorate in psychology, but seeing no future for herself in academia, she instead moved to New York to be a journalist. In 1947 she married Carl Friedan. The couple settled in Queens, New York, and soon had three children. After being fired from her writing job for taking maternity leave, she continued to work as a freelance writer.

The Feminine Mystique

Over time, Friedan found herself increasingly dissatisfied with her life as a typical suburban American housewife. In 1957 she decided to find out if other women felt the same way. She sent an extensive questionnaire to two hundred of her former Smith classmates. She learned that she was not alone in her feelings of frustration, and that a great many women felt trapped and resentful in their culturally mandated roles as housewives and mothers. Friedan wrote an article about her findings, but none of the women’s magazines she sent it to was interested in publishing it. Rather than give up, she expanded her ideas into a book. The resulting work, The Feminine Mystique, struck a nerve with women across the United States. It was a best seller and sparked a revolution of consciousness among American women.

Friedan instantly became one of the women’s movement’s preeminent spokespersons, both nationally and internationally. Three years after the publication of The Feminine Mystique, Friedan was instrumental in creating NOW, and she became the organization’s first president. Under her leadership, NOW fought hard for equality in the workplace, including enforcement of Title VII of the 1964 Civil Rights Act, which prohibited employment discrimination on the basis of gender. NOW also advocated for passage of the Equal Rights Amendment, which had failed to win congressional approval for more than forty years. In 1969 Friedan helped launch another organization, the National Association for the Repeal of Abortion Laws (NARAL; renamed the National Abortion Rights Action League in 1973), which was instrumental in the movement that led to the legalization of abortion in 1973 by virtue of the U.S. Supreme Court’s decision in Roe v. Wade.

Friedan stepped down from the presidency of NOW in 1970. For the next few years, she wrote a regular column for McCall’s magazine called “Betty Friedan’s Notebook.” As the feminist movement went through inevitable changes over time, Friedan became less visible as a spokesperson. She nevertheless continued to write prolifically on a range of topics. In her 1981 book The Second Stage, she called for a new focus in the movement that would emphasize the needs of families, allowing both men and women to gain freedom from the shackles of gender-based stereotypes. In 1993 she published The Fountain of Age, in which she explored the rights and challenges of elderly and aging people. Friedan died in 2006.

Ralph Nader

Ralph Nader (1934–) essentially invented the consumer protection movement and is without question the most important advocate for consumer safety and corporate accountability in recent history. While his status as a crusader for consumer rights is unassailable, he enraged many Democrats by running for president on the Green Party ticket in 2000, siphoning crucial votes away from Democratic candidate Al Gore and, in the view of his detractors, helping hand the election to Republican George W. Bush (1946–). Nader ran for president again in 2004, but he attracted little support and was not a factor in the outcome.

Nader was born on February 27, 1934, in Winsted, Connecticut, the youngest child of Lebanese immigrants Nadra and Rose (Bouziane) Nader. His parents, who owned a restaurant and bakery, were interested in politics; current affairs were a standard topic of discussion around the family dinner table. His father’s ideas about social justice impressed Nader at an early age, and he was still young when he decided that he wanted to study law. A brilliant student, Nader graduated in 1955 with highest honors from Princeton University’s Woodrow Wilson School of Public and International Affairs. He then moved on to Harvard Law School, earning a law degree in 1958. Nader’s first experience as an activist came while he was at Princeton, where he attempted—unsuccessfully—to persuade the university to stop spraying campus trees with the harmful pesticide dichlorodiphenyltrichloroethane, or DDT.

Vehicle Safety

While at Harvard, Nader became interested in vehicle safety, an issue with which he would later make his mark professionally. Nader was convinced that poor vehicle design, not just driver error, was responsible for many traffic fatalities. In 1958 he published his first article on the topic, “American Cars: Designed for Death,” in the Harvard Law Record. That article foreshadowed the major work of his career.

After law school, Nader served briefly in the U.S. Army, then traveled extensively overseas. Returning to Connecticut, he went into legal practice and lectured on history and government at the University of Hartford from 1961 to 1963. In 1964 Nader moved to Washington, D.C., where he hoped to continue his work on automobile safety. He landed a job as a consultant at the U.S. Department of Labor, and there he wrote a report calling for expanded federal regulation of vehicle safety. Nader left the Labor Department in May 1965 to devote himself full time to the task of writing a book on the subject that had become his passion. The book, Unsafe at Any Speed: The Designed-In Dangers of the American Automobile, was published later that year. It caused an immediate stir and quickly became a best seller. Federal hearings were held, at which Nader provided key testimony.

The major car manufacturers, particularly General Motors—whose Chevrolet Corvair was singled out as being especially unsafe—were antagonized by Nader’s crusade, and they embarked on a campaign of harassment and intimidation. Nader withstood the pressure and pressed on with his effort. The eventual result was the passage in 1966 of the National Traffic and Motor Vehicle Safety Act, which created the National Traffic Safety Agency to enforce the new safety regulations established by the act. Ralph Nader became a household name.

Nader’s Raiders

After his success with automobiles, Nader moved on to address safety issues in other industries. He took on the meatpacking industry, resulting in the 1967 passage of the Wholesome Meat Act. He also scrutinized coal mining and natural gas pipelines. Living a frugal lifestyle, Nader used the money he earned from book sales and lectures to assemble a small army of young people to help him with research and data collection. The group, composed mainly of students at law schools and other colleges, became known as “Nader’s Raiders.” In 1969 Nader formed the Center for the Study of Responsive Law, a think tank devoted to consumer safety issues. Two years later, he founded Public Citizen, an advocacy group that lobbies for public policy related to consumer protection. Nader founded various other groups over the next several years, including the Public Interest Research Groups that sprang up on college campuses in many states during the 1970s.

Nader continued to write prolifically during the 1970s. His influence in Washington began to wane in the 1980s as the atmosphere in the nation’s capital grew more conservative, though he continued to score occasional successes, such as the rollback of California auto insurance rates in 1988. Nader made token runs for president in 1992 and 1996 before mounting a serious campaign with the Green Party in 2000, which generated nearly three million votes.

See also The U.S. Department of Transportation

The Chevy Corvair: Unsafe at Any Speed?

The Chevrolet Corvair was one of a handful of American cars produced in response to the popularity of the sporty, fuel-efficient European imports showing up on U.S. streets during the middle part of the twentieth century. General Motors (GM) manufactured the Corvair from 1960 to 1969. The Corvair had a number of unique design features, such as an air-cooled aluminum engine in the rear of the car. It also had an unusual suspension design that required the owner to inflate the tires to pressures exceeding the tire manufacturers’ recommendations, with a striking imbalance between the front and rear inflation pressures.

Ralph Nader (1934–) deemed the Corvair inherently unsafe in his breakthrough book Unsafe at Any Speed (1965), and while seven of the book’s eight chapters have nothing to do with the Corvair, it is the model with which the book has always been most closely associated. Nader wrote that the car’s unusual features made the Corvair more prone to roll over. He also criticized the car’s steering column, remarking that because of its rigidity it was likely to impale the driver in a head-on collision.

Sales of the Corvair plummeted following the publication of Unsafe at Any Speed, in spite of GM’s attempts to discredit Nader. Some of the company’s personal attacks on Nader were highly unethical and resulted in a successful lawsuit by Nader. Nader’s claim that the Corvair was inherently unsafe, however, was not universally accepted, and critics point out that his assessment was not based on the kind of sophisticated testing available today. Tests run by the National Highway Traffic Safety Administration in 1971 found that the Corvair actually handled better than several comparable models. Nevertheless, Nader’s book made the powerful point—based on many more examples beyond the Corvair—that automakers were thinking a lot more about looks than about safety, and that it was the consumer who ultimately paid the price for those misguided priorities.

Cesar Chavez

The son of Mexican immigrants, Cesar Chavez (1927–1993) rose from humble beginnings as a migrant worker in California to become one of the most important labor leaders in U.S. history as founder and leader of the United Farm Workers. Chavez spent his entire career battling for the rights of agricultural workers in the face of exploitation by companies looking for a cheap and vulnerable workforce, often composed largely of immigrant families.

Cesar Estrada Chavez was born on March 31, 1927, near Yuma, Arizona, one of five children in a poor family that ran a grocery store and a small farm. During the Great Depression, the Chavez family lost their farm, and they joined the burgeoning host of families in the rural Southwest that headed for California to become temporary agricultural workers. The family moved from farm to farm in search of work, living in tents in small encampments wherever there was work to be done. The wages for these migrant workers were extremely low, and the living conditions were primitive.

When Chavez was about twelve years old, the Congress of Industrial Organizations (CIO), a conglomeration of labor unions, began trying to organize workers in the dried fruit industry. His father and uncle supported these efforts, and Chavez observed some of their early labor actions. Organizing migrant workers, however, was difficult, mostly because they made up an inherently mobile workforce, so the CIO made little progress initially.

Community Service Organization

Chavez attended school from time to time, but he was never in one place long enough to graduate. He joined the U.S. Navy in 1944 and served for two years during World War II. After the war, he returned to the California fields and was reunited with his family. In 1948 he married Helen Fabela, a fellow migrant worker. The couple settled in Delano, California, a farm town in the San Joaquin Valley, and began raising a family that eventually included eight children. Chavez began his career as a labor activist in 1952, while working as a migrant grape and cotton picker. That year he joined the Community Service Organization (CSO), a group working to register and mobilize Mexican-Americans. Chavez proved to be a highly skilled organizer, and he quickly moved into a leadership role within the CSO. Over the next several years, he led voter registration drives and worked on a variety of issues affecting Mexican-Americans, including immigration practices, welfare policy, and police abuse. He also successfully launched several new CSO chapters. Chavez was named national general director of the CSO in 1958.

Chavez was able to accomplish a great deal with CSO, but the organization was not involved in the issue that was most important to him: labor conditions for migrant farm workers. He proposed that the CSO start working to organize migrant laborers, but the board of directors declined to take up the cause. In response, Chavez resigned from his job, and with $1,200 of his own savings he founded a new organization, the National Farm Workers Association (NFWA), in 1962.

La Causa

In spite of the difficulties inherent in trying to organize a migrant workforce, Chavez managed to build a membership of about 1,700 workers by 1965. In September of that year, Filipino grape pickers in the Delano area went on strike demanding higher wages. The NFWA voted to join the strike. They also pledged to keep their actions nonviolent. One of Chavez’s chief role models was the Indian leader Mahatma Gandhi (1869–1948), and like Gandhi, Chavez championed the strategy of passive resistance. He brought attention to his cause through such tactics as inviting arrests and undertaking well-publicized fasts. Over time, the strike began to blossom into a full-blown movement called La Causa, which is Spanish for “the cause.”

The grape growers resisted the workers’ challenge, but gradually the strike began to gain the support of consumers around the country. A nationwide boycott of California grapes was in place by 1968, and the financial impact was great enough that the growers were forced to begin negotiating a settlement with the workers. The strike lasted a total of five years. Following the success of the grape boycott, Chavez next decided to take on iceberg lettuce growers in Arizona and California. Working against the combined forces of the growers, large agricultural corporations, and a rival union (the Teamsters), Chavez and his organization managed to initiate a successful nationwide boycott of California lettuce, forcing the growers to come to the bargaining table. His efforts also resulted in the passage of California laws to protect agricultural workers, including California’s Agricultural Labor Relations Act in 1975, which led to the formation of the Agricultural Labor Relations Board.

Along the way, the NFWA affiliated with the American Federation of Labor and Congress of Industrial Organizations (AFL-CIO), the giant umbrella organization of the labor movement, and changed its name to the United Farm Workers Organizing Committee, which was shortened to United Farm Workers (UFW) in 1972.

Union Power Declines

By the early 1980s the UFW’s membership and political clout were both in decline. Later in that decade Chavez set his sights on a new issue: growers’ use of pesticides that were harmful to the health of farm workers. In 1987 he called for another grape boycott aimed at forcing the growers to change their pesticide practices. To publicize the boycott, Chavez undertook his longest fast yet, lasting thirty-six days.

Chavez died unexpectedly in his sleep on April 23, 1993. By the time of his death, UFW membership was a small fraction of what it had been at its peak, and agricultural workers had been forced to give back many of the gains they had made in wages and working conditions. Nevertheless, Chavez’s contribution to the well-being of agricultural workers is unparalleled in history, a fact recognized by President Bill Clinton (1946–) when he posthumously awarded Chavez the Presidential Medal of Freedom, the nation’s highest civilian honor, in 1994.

Henry Kissinger

Henry Kissinger (1923–) was the most influential foreign policy adviser to Presidents Richard Nixon (1913–1994) and Gerald R. Ford (1913–2006) during the late 1960s and the 1970s. As national security adviser and then secretary of state, Kissinger popularized the term shuttle diplomacy, a strategy in which he acted as a go-between to defuse tensions between hostile nations. Kissinger was awarded the Nobel Peace Prize in 1973 for his efforts to bring the Vietnam War to a conclusion.

Kissinger was born Heinz Alfred Kissinger on May 27, 1923, in Fürth, Germany. As a Jewish family, the Kissingers suffered anti-Semitic discrimination as the Nazis rose to power in the 1930s. His father, Louis, lost his job as a schoolteacher, and the young Kissinger was forced to switch to an all-Jewish school. The Kissingers fled Germany in 1938, just in time to escape the horrors of the Holocaust. They initially went to London, then moved on to the United States several months later.

The Kissinger family settled in New York, where Heinz changed his name to Henry. He attended high school at night while working in a factory during the day to help support his family. After graduating from high school in 1941, Kissinger entered the City College of New York and began studying accounting. He became a naturalized American citizen in 1943, and the same year he was drafted into the armed forces to serve in World War II. In the U.S. Army, Kissinger was assigned to military intelligence service in Germany, where he served as an interpreter for a general.

Academic Career

After the war, Kissinger returned to the United States and resumed his college education, enrolling at Harvard University in 1947. He graduated with honors in 1950, then received a master’s degree two years later and a Ph.D. in 1954. After completing his studies, Kissinger stayed at Harvard as an instructor. He also joined the Council on Foreign Relations, an independent foreign policy think tank based in New York. While there, Kissinger wrote a book called Nuclear Weapons and Foreign Policy (1957), which earned him a reputation as a rising star in the field of international relations. Kissinger was hired as a lecturer at Harvard in 1957. He was promoted to associate professor in 1959 and to full professor in 1962.

As his standing among foreign affairs specialists increased, presidents began seeking Kissinger’s counsel on matters of state. He served as a consultant to the National Security Council in the early 1960s. In 1965 the U.S. Department of State hired Kissinger as a consultant, focusing specifically on the situation in Vietnam. During the 1968 presidential campaign, he worked as a speechwriter and policy adviser to Republican candidate Nelson Rockefeller (1908–1979). Although Rockefeller failed to win the Republican nomination, Kissinger impressed eventual winner Nixon enough to prompt Nixon to hire him as national security adviser upon taking office in 1969.

Key Presidential Adviser

Kissinger quickly became Nixon’s most trusted adviser on foreign policy matters, wielding more influence than members of Nixon’s own cabinet. Kissinger traveled the globe conducting secret talks with the Soviet Union, China, and North Vietnam, engaging in a practice that was tagged shuttle diplomacy. Kissinger believed that the Kennedy and Johnson administrations had taken the wrong line in their relations with the Soviet Union. While he was as troubled as they were by the prospect of communist expansion, he recognized that the Soviet Union was a legitimate world power, and that the most sensible policy was therefore to maintain a workable balance of power between the United States and the Soviets. He sought to ease tensions between the two superpowers, advocating an approach that became known as detente, a French word roughly meaning “relaxation.” The right wing of the Republican Party, wary of the more moderate Kissinger, was not particularly happy with the approach; conservatives believed that this sort of appeasement was wrongheaded, a sign of weakness, and that the correct strategy was to take a hard line against communism.

Kissinger also played a key role in the later stages of the Vietnam War. He advocated secret bombings and a ground invasion of Cambodia in 1970, a controversial strategy at a time when most Americans were calling for a de-escalation of American involvement in the Far East. Kissinger also, however, conducted a series of secret meetings in Paris with North Vietnamese diplomat Le Duc Tho (1911–1990), leading directly to the 1973 Paris Peace Accords and the end of direct U.S. involvement in the conflict. As a result of those talks, Kissinger and Tho shared the Nobel Peace Prize in 1973, although Tho declined the award.

Kissinger scored a number of other diplomatic breakthroughs during the early 1970s. He initiated the Strategic Arms Limitation Talks (SALT) with the Soviet Union, which resulted in the 1972 signing of the historic SALT I treaty between the two superpowers. He was also deeply involved in efforts to settle the explosive situation in the Middle East, working with Egypt, Syria, Israel, and other nations in the region.

Nixon named Kissinger secretary of state at the beginning of his second presidential term in 1973, and Kissinger remained in the post under Ford after Nixon was forced to resign in disgrace in the wake of the Watergate scandal. Since 1977 Kissinger has mostly remained out of the public eye, though he has continued to write prolifically on foreign policy and has been consulted by government leaders from time to time on diplomatic issues, including meetings with President George W. Bush (1946–) and Vice President Dick Cheney (1941–) about the war in Iraq.

See also The Vietnam War

Jesse Jackson

Jesse Jackson (1941–) emerged in the 1960s as a leading crusader for civil rights and social justice in the United States. A follower of Martin Luther King Jr. (1929–1968), Jackson was in King’s entourage when King was assassinated in 1968. In the 1970s Jackson left the Southern Christian Leadership Conference (SCLC) to start his own organization, People United to Save Humanity, better known as Operation PUSH, based in Chicago. In the 1980s Jackson formed the National Rainbow Coalition, a group dedicated to mobilizing the nation’s dispossessed of all races. Operation PUSH and the National Rainbow Coalition were merged into a single organization in the 1990s.

Jackson was born Jesse Louis Burns on October 18, 1941, in Greenville, South Carolina, the son of Helen Burns and her married next-door neighbor Noah Robinson. In 1943 his mother married Charles Henry Jackson, who later adopted Jesse. A star athlete at Greenville’s Sterling High School, Jackson was awarded a football scholarship to attend the University of Illinois. When he learned that an African-American would not be allowed to play quarterback, Jackson left Illinois and enrolled at North Carolina Agricultural and Technical College in Greensboro, a traditionally African-American institution. There he was elected president of the student body. He also became active in the civil rights movement during his senior year, and worked for a short time with the Congress of Racial Equality (CORE). After graduating in 1964 with a degree in sociology, Jackson entered the Chicago Theological Seminary to prepare to become a minister. He was ordained in 1968.

Operation Breadbasket

In 1966 Jackson joined the Atlanta-based Southern Christian Leadership Conference (SCLC), the renowned civil rights group led by King. With King, he participated in the SCLC-orchestrated protest activities in Selma, Alabama, around that time. Later that year, King assigned Jackson to head the Chicago branch of Operation Breadbasket, an SCLC project that promoted economic development and employment opportunities in African-American communities. Jackson was also a leading figure in protests against alleged racial discrimination on the part of Chicago mayor Richard J. Daley (1902–1976).

Within a year, King appointed Jackson national director of Operation Breadbasket. In that capacity, Jackson worked to put pressure on businesses and industries with large African-American customer bases, such as bakeries and soft-drink bottlers. When businesses refused to comply with the SCLC’s demands regarding fair employment practices and contracts with minority-owned companies, Jackson organized boycotts, often resulting in a negotiated compromise.

In April 1968 Jackson was with a group of SCLC leaders accompanying King in Memphis, Tennessee, when King was assassinated while standing on the balcony of his hotel room. Jackson appeared on national television the next day wearing a shirt covered with what he claimed to be King’s blood, saying he had held King in his arms after the shooting and was the last person to speak to the fallen hero. Others present at the scene disputed Jackson’s account. Regardless of the accuracy of Jackson’s statements, his emotional television appearance made him the new face of the civil rights movement in the minds of many Americans and, perhaps more important, the media.

Operation PUSH

After King’s death, tensions grew between Jackson and new SCLC leader Ralph Abernathy (1926–1990). Abernathy considered Jackson a self-promoter who used Operation Breadbasket to advance his own agenda without regard to SCLC direction. He disliked Jackson’s attention-seeking antics and aggressive personality. In 1971 the SCLC suspended Jackson for what organization leaders termed “administrative improprieties and repeated acts of violation of organizational policy.”

After his suspension, Jackson broke with the SCLC entirely and founded his own organization, Operation PUSH. Operation PUSH carried on many of the strategies of Operation Breadbasket, and expanded them into the social and political realms. Over the next two decades, Jackson evolved into the nation’s leading civil rights activist. He ran for president in 1984 and 1988, making an impressively strong showing on the second try, when he garnered nearly seven million votes in Democratic primaries.

Since 1990 Jackson has kept a substantially lower profile, though he has not hesitated to employ his considerable oratorical skills when moved to comment publicly about an important issue. In 2000 President Bill Clinton (1946–) awarded Jackson the Presidential Medal of Freedom, the highest honor the federal government can give to a civilian.

See also The Civil Rights Movement

“Waving the Bloody Shirt”

When Jesse Jackson (1941–) appeared on television wearing a shirt allegedly covered with the blood of Martin Luther King Jr. (1929–1968), it was not the first time a bloody shirt had been used to inspire political sympathy. In fact, the phrase “waving the bloody shirt” has long referred to the practice of politicians pointing to the blood of a martyr in order to gain support or deflect criticism. During times of war, politicians routinely “wave the bloody shirts” of fallen soldiers—in the metaphorical sense, of course—as they seek to defend the foreign policy decisions that ultimately led to battlefield casualties. The phrase originated after the American Civil War, when Republicans used it as part of their anti-Southern rhetoric to associate Democrats with the bloodshed of the war and the assassination of Abraham Lincoln (1809–1865). Benjamin Franklin Butler (1818–1893), during a speech before his fellow congressmen, held up what he claimed was the shirt of a federal tax collector who had been whipped by the Ku Klux Klan. The “bloody shirt” ploy was used similarly even earlier than that. One of the first known uses occurred in AD 656, when a bloody shirt and some hair allegedly from the murdered Uthman, the third caliph (worldwide Muslim leader), were used to build support for seeking revenge against his opponents. There is also a scene in William Shakespeare’s Julius Caesar in which Mark Antony waves Julius Caesar’s bloody toga in order to whip up the emotions of the Roman populace (though the phrase “waving the bloody toga” apparently never quite caught on).

Political Parties, Platforms, and Key Issues

The Dixiecrats (States’ Rights Democratic Party)

As President Harry S. Truman (1884–1972) and the Democratic Party began to embrace civil rights legislation and racial integration policies in the months leading up to the 1948 elections, a number of Southern Democrats created an offshoot political entity called the States’ Rights Democratic Party, better known as the Dixiecrats, as an alternative for Southerners who opposed Truman’s platform. Subscribing to a philosophy of “states’ rights”—that is, that states have the right to govern themselves without interference from the federal government—the Dixiecrats argued that states had the right to maintain segregationist policies and that the federal government could not, therefore, impose integration measures.

An Anti–Civil Rights Campaign

As his first term in office—the term he inherited upon the death of President Franklin Roosevelt (1882–1945)—drew to a close, President Truman proposed civil rights legislation to Congress. Many Democratic politicians from the South were adamantly opposed to Truman’s civil rights gestures, and they were frustrated in their attempts to obstruct the proposals. On May 10, 1948, Mississippi Governor Fielding Wright (1895–1956) convened a regional meeting of fifteen hundred delegates in Jackson, Mississippi, to discuss their response. They decided that if Truman received the party’s nomination at the upcoming Democratic National Convention, they would break from the Democratic Party. Leaders of the movement articulated the group’s position clearly. In his keynote address at the meeting, South Carolina Governor (and later U.S. Senator) Strom Thurmond (1902–2003) declared that “all the laws of Washington and all the bayonets of the army cannot force the Negro into our homes, our schools, our churches, and our places of recreation.”

Truman received the Democratic Party’s nomination at the convention on July 15, 1948, defeating Richard Russell of Georgia. Delegates at the convention also endorsed by a narrow margin an ambitious civil rights agenda, to the chagrin of states’ rights proponents. Immediately after the vote, the Mississippi delegation and half of the delegates from Alabama walked out of the hall. A mere two days later, the States’ Rights Democratic Party was created at a meeting of six thousand states’ rights advocates in Birmingham, Alabama. The party was soon given the nickname “Dixiecrats” by Bill Weisner, a North Carolina journalist. The Dixiecrat platform centered on preserving segregation in the South, but the party also embraced a handful of other conservative positions, including opposition to labor unions and a rollback of the federal welfare system, which had grown dramatically during the Roosevelt era.

The Dixiecrats nominated their own ticket. Their presidential candidate was Thurmond, and Wright was his running mate. The hope was that the Dixiecrat ticket would draw enough votes away from both Truman and Republican candidate Thomas Dewey (1902–1971) to prevent either from gaining the required majority of votes in the electoral college. If that happened, the presidential contest would then move to the House of Representatives, where Southern members could bargain for concessions on civil rights legislation.

Thurmond and Wright campaigned aggressively across the South, and gained substantial support in several states. Their views were not universally accepted by Southern Democrats, however; their position on matters of race was considered too extreme even by some fairly conservative voters.

Victorious in Four States

Thurmond and Wright received about 1.2 million votes in the 1948 presidential election, and carried South Carolina, Mississippi, Louisiana, and Alabama, totaling 39 electoral votes. However, Truman prevailed in the rest of the South, and won the election—carrying 28 states and receiving 303 electoral votes—despite nearly universal predictions by pollsters and the media that Dewey (who received not quite 22 million votes and 189 electoral votes) would be the next president.

The States’ Rights Democratic Party disbanded shortly after the election, and most members returned to the regular Democratic Party, but politics in the South were never the same. The region would never again be considered “safely” Democratic. Thurmond was elected to the U.S. Senate as a Democrat in 1954, where he joined with other conservative Democrats in attempting to impede the civil rights movement. In 1964, with the civil rights struggle raging, Thurmond led a mass defection of conservative Southern elected officials from the Democratic Party over to the Republicans.

Current Events and Social Movements

The Marshall Plan

The European Recovery Program, better known as the Marshall Plan, was an investment by the United States of nearly $15 billion in economic assistance to rebuild the devastated agricultural and industrial infrastructures of Western European countries in the aftermath of World War II. It was named after Secretary of State George C. Marshall (1880–1959), who first proposed this infusion of money in a 1947 speech.

Bolstering War-Torn Economies

In 1946 Europe was in shambles. Six years of war had left every nation involved in a state of physical and economic chaos. Industrial centers and transportation hubs had been reduced to rubble by years of relentless bombing. Even the countries that had been on the winning side faced years of rebuilding before their economies would be capable of performing at their prewar levels. So thorough was the destruction that there was not nearly enough capital or raw materials available to complete the job.

The United States, in contrast, emerged from World War II in better shape economically than it had entered it. Not only had the war not taken place on American soil—with the notable exception of the attack at Pearl Harbor—but wartime production had kept money flowing into the coffers of the nation’s big industrial companies, the heart of the U.S. economy.

Former General Hatches a Plan

Marshall had first made his mark as a military man. He was a key tactical officer in France during World War I, and in 1939 he was named chief of staff of the U.S. Army, effectively placing him at the very top of the U.S. fighting machine that fought World War II. Marshall retired from the army in November 1945, and was named secretary of state by President Harry S. Truman (1884–1972) in 1947.

One of Marshall’s first assignments as secretary of state was to attend a conference in Moscow, along with representatives of Britain, France, and the Soviet Union, to discuss the future of vanquished Germany and Austria. Over the course of the meetings, it became clear that the Soviet Union saw the collapsed economies of Europe as an opportunity to spread communism across the continent. Marshall came away from the conference determined to forge an American response that would help restore those economies to viability and keep them operating according to free-market principles.

A team of Truman administration policy experts, which included William L. Clayton (1880–1966), George F. Kennan (1904–2005), and Dean Acheson (1893–1971), quickly put together a plan for a gigantic aid package that they hoped would stabilize the war-ravaged economies of Europe, with the idea that economic stability was the key ingredient in the recipe for political stability. Marshall officially unveiled the plan during his June 5, 1947, commencement address at Harvard University. In the address, Marshall played down the role of perceived Soviet aggression in the eagerness of the United States to lend a helping hand, stating that the program was “directed not against any country or doctrine but against hunger, poverty, desperation, and chaos.” Marshall invited the nations of Europe to inventory their own needs and suggest a plan for the most effective use of American economic assistance.

Sixteen Countries Form Core Group

A month later, representatives of sixteen Western European countries met in Paris to discuss such plans. Together, these countries—Austria, Belgium, Denmark, France, Great Britain, Greece, Iceland, Ireland, Italy, Luxembourg, the Netherlands, Norway, Portugal, Sweden, Switzerland, and Turkey—and the United States formed the Organization for European Economic Cooperation; West Germany joined after the new state was established in 1949. The Soviet Union was invited, but American policy makers doubted that Soviet leaders would join the coalition, and doubted even more that Congress would agree to any plan that gave a large amount of aid to the Soviets. Planners ensured that outcome by attaching conditions they knew would be unacceptable to the Soviet Union, including movement toward a unified European economy. As expected, the Soviet Union refused to join the organization, and representative Vyacheslav Molotov (1890–1986) abruptly left the Paris gathering. The Western European countries put together a proposal for $16 billion to $22 billion in aid to boost their economies until 1951.

While the plan met with some resistance in Congress, U.S. lawmakers eventually agreed to President Truman’s request for $17 billion in economic assistance for Europe. The Economic Cooperation Administration (ECA) was created in 1948 to administer the program. The Soviets refused to let their client nations Poland and Czechoslovakia participate in the program, and launched their own version, the Molotov Plan, to promote Soviet-style economic development in Eastern Europe. The Molotov Plan integrated Eastern European countries into the Soviet Union’s socialist economy and led to the creation of the Council for Mutual Economic Assistance (COMECON). The creation of the ECA and COMECON set the stage for the economic aspects of the East-West rivalry that would dominate world affairs for the next four decades.

The Marshall Plan was completed in 1951, having added about $15 billion to the economies of Western Europe. Marshall himself was awarded a Nobel Peace Prize in 1953 for his role in revitalizing industrial and agricultural productivity in Europe after World War II. The Marshall Plan was instrumental in shifting U.S. foreign policy from isolationism to internationalism. It also helped pave the way for the creation of the European Economic Community, or “Common Market,” the predecessor to today’s European Union.

See also The Cold War

The Cold War

From the end of World War II until 1991, Western capitalist democracies led by the United States engaged in an ongoing ideological conflict against the Communist Soviet Union. Known as the Cold War, this forty-five-year period of hostilities between the two superpowers never exploded into direct warfare; the war was instead fought on the political, economic, and philosophical levels. Only indirectly did the two sides square off militarily, as they battled for supremacy and influence throughout the developing world by supporting and arming forces and factions in regions that were politically unstable.

The “Iron Curtain”

The United States and the Soviet Union were allies during World War II, working together to defeat Nazi Germany. After the war, however, relations between the two countries quickly deteriorated. There is no consensus as to the precise beginning of the Cold War, but many historians point to a 1946 speech given in Fulton, Missouri, by British statesman Winston Churchill (1874–1965). In the speech, Churchill decried the Soviet Union’s moves to achieve regional domination over all the countries of Eastern Europe, stating that “an iron curtain has descended across the Continent. Behind that line lie all the capitals of the ancient states of Central and Eastern Europe. Warsaw, Berlin, Prague, Vienna, Budapest, Belgrade, Bucharest, and Sofia, all these famous cities and the populations around them lie in what I must call the Soviet sphere, and all are subject in one form or another, not only to Soviet influence but to a very high and, in many cases, increasing measure of control from Moscow.”

Regardless of whether Churchill’s use of the term iron curtain actually marked a significant turning point in history, there is no doubt that the Cold War had begun by the time President Harry Truman (1884–1972) implemented the Truman Doctrine—the proclamation that the United States would take proactive measures to prevent the spread of communism in Europe—in 1947. The following year, Truman put into effect the Marshall Plan, a program of economic assistance to the war-ravaged countries of Western Europe, aimed at stabilizing the economies of those nations in hopes of keeping them from turning to communism as they sought to rebuild their infrastructures. As U.S. concerns about Soviet aggression in Europe continued to escalate, the United States worked with several Western European nations to form the North Atlantic Treaty Organization (NATO), a military coalition designed to reinforce its members’ ability to resist any attempt by the Soviet Union to make inroads in the region. A few years later, the Soviet Union countered by forming its own military alliance, called the Warsaw Treaty Organization, or Warsaw Pact.

The geographic focal point of the early stages of the Cold War was Germany. After World War II, the country was split into two separate states: East Germany, controlled by the Soviets; and West Germany, controlled by the United States and its Western allies. The city of Berlin, located within the boundaries of East Germany, was itself split into Eastern and Western halves under the control of the two sides. When the Soviet Union blockaded West Berlin in 1948 to stop the inflow of American supplies, the United States responded by flying over the blockade and delivering via airlift the supplies necessary to keep the city alive. Soviet leader Joseph Stalin (1879–1953) finally lifted the failed blockade in May 1949. More than a decade later, in 1961, the East German government erected a wall between the two halves of Berlin—the Berlin Wall—to prevent East Germans from emigrating to the West.

The Cold War Expands

In the early 1950s the Cold War spread to Asia. In 1950 the Soviet Union forged an alliance with the Communist government of China, and that year China supported an invasion of South Korea by forces from Communist North Korea. Fearing that communism would expand throughout the region, the United States rushed to South Korea’s assistance; it later helped form the Southeast Asia Treaty Organization, the region’s version of NATO. The Korean War lasted until 1953, and thirty-seven thousand Americans lost their lives in it.

Meanwhile, the Soviet Union had leveled the nuclear playing field. At the beginning of the Cold War, the United States was the only country to possess nuclear weapons, giving it a huge strategic edge as it sought to counterbalance the might of the Soviets’ huge and powerful conventional forces. In 1949, however, the Soviet Union tested its first atomic bomb, prompting the United States to intensify its own nuclear weapons development efforts. Thus the arms race was on.

At home in the United States, fear of communism bordered on hysteria. Convinced that communism threatened the very fabric of American society, many Americans viewed anybody with a history of ties to communist organizations—ties that had been fairly common in the United States before World War II, especially during the Great Depression—as an enemy of the state, and such people risked having their careers ruined. This “witch hunt” mentality peaked in Congress with the work of the House Committee on Un-American Activities, whose highly publicized hearings targeted a number of celebrated entertainers and artists, and Senator Joseph McCarthy (1908–1957) of Wisconsin, who was eventually discredited for being overzealous in his search for communist operatives hiding in every corner of American society. Much of the American public lived under a cloud of terror, convinced that the Soviets intended to launch a nuclear attack on their hometowns at any moment. Bomb shelters were dug in backyards across the nation, and schoolchildren were instructed to “duck and cover”—to get under a desk or table and cover their heads with their hands—in the event of such an attack.

The death of Stalin in 1953 brought hope that relations between the Soviet Union and the West might thaw, and the proliferation of nuclear weapons come to a halt. This hope was short-lived, however, as tensions mounted once again after just a couple of years, with Nikita Khrushchev (1894–1971) taking over as Soviet leader. Khrushchev is perhaps best remembered for shouting “We will bury you” during a meeting with Western ambassadors in 1956. His statement was interpreted at the time as being overtly hostile. There is some doubt, however, about that translation of his Russian words; they are probably more accurately translated as “We will outlive you,” or “We will attend your funeral.” Nevertheless, when the Soviet Union launched the Sputnik satellite in 1957, it ushered in a new era of heightened competition between the Americans and Soviets. This round of conflict centered on technology for space exploration and bigger, better missiles.

Meanwhile, in nearly every part of the world, including not only Southeast Asia but also Africa, the Middle East, and Latin America, the United States and Soviet Union continually jockeyed for position by supporting leaders they thought would best help their cause in the region, often willingly turning a blind eye to a regime’s brutality or corruption if they believed that regime was most likely to keep the other side at bay. Wherever a civil war broke out, the two superpowers were likely to be pulling the strings behind the opposing sides in the skirmish.

Crises in Cuba

In the early 1960s, the island nation of Cuba became the Cold War’s most dangerous flash point. In 1959 Communist leader Fidel Castro (1926–) took power in Cuba after helping lead the overthrow of President Fulgencio Batista y Zaldívar (1901–1973). Castro developed close ties to the Soviet Union, which naturally was delighted to have a good friend located so close to the United States. Disturbed by the presence of a Soviet-sponsored communist country just ninety miles off the Florida coast, President John F. Kennedy (1917–1963) in 1961 armed a group of Cuban exiles and helped them launch an invasion at the Bay of Pigs on Cuba’s southern coast, aimed at removing Castro from power. The invasion was poorly executed and failed miserably. It succeeded only in further damaging the relationship between the United States and Cuba, and by extension, the relationship between the United States and the Soviet Union.

There were a number of “near misses” over the course of the Cold War, but the event that probably brought the United States and the Soviet Union closest to a nuclear exchange was the Cuban missile crisis of October 1962. That fall, U.S. intelligence detected the presence of Russian nuclear missiles in Cuba. President Kennedy issued an ultimatum to the Soviet Union to remove the missiles, and the world waited anxiously to see which superpower would back down first. The crisis was essentially a game of “chicken,” with nuclear annihilation the penalty for guessing wrong. After much diplomatic negotiation, the Soviet Union announced that it would dismantle and remove the missiles.

Communism Seen as Threat in Vietnam

The Cold War, and the overriding foreign policy imperative that communism be contained in every region of the globe, provided the backdrop for the Vietnam War, which dominated political debate in the United States through the remainder of the 1960s and into the 1970s. In the mid-1950s, an uprising led by Ho Chi Minh (1890–1969) resulted in the ouster of the French colonialists who had dominated the region for about a century. The agreement that ended the fighting “temporarily” divided Vietnam into two separate nations in the north and south, with North Vietnam coming under communist control.

Naturally, the United States rushed to support South Vietnam in hopes of preventing it from being overrun by its neighbor to the north, which was supported by Communist China. Although China did not possess the “superpower” status of the Soviet Union, it was nevertheless perceived as the major communist threat in the region, setting up a classic East-West Cold War face-off by proxy.

Detente

Relations between the United States and the Soviet Union thawed a bit during the early 1970s. This period was dubbed detente, a French word meaning “easing” or “relaxing.” Between 1969 and 1972, the two superpowers engaged in negotiations aimed at slowing down the arms race. The dialogue, known as the Strategic Arms Limitation Talks (SALT I), produced two agreements designed to slow the nuclear arms race: the Anti-Ballistic Missile (ABM) Treaty and a five-year interim accord limiting offensive strategic weapons. Both were signed in May 1972 by U.S. President Richard M. Nixon (1913–1994) and Soviet General Secretary Leonid Brezhnev (1906–1982). The success of the SALT I negotiations led to a follow-up series, SALT II, which resulted in the signing of a further arms limitation treaty in 1979. That treaty was never ratified by the U.S. Senate, however, as tensions between the two nations began to rise once more later in the decade. New conflicts in the Middle East, Angola, and Chile brought hostilities into the open once again.

By early in the presidency of Ronald Reagan (1911–2004), the Cold War was back in full swing. Reagan increased military spending, intensified the arms race, and engaged in a war of rhetoric, referring to the Soviet Union as the “evil empire.” Reagan sent assistance, sometimes illegally, to anticommunist guerillas in Central America. He also sent covert aid to the mujahideen (Islamic guerrilla fighters) who were battling against the Soviet occupation of Afghanistan. Some of these mujahideen evolved into the Taliban.

The beginning of the end of the Cold War came in the mid-1980s, when Mikhail Gorbachev (1931–) took power in the Soviet Union. With the Soviet economy crumbling, in part due to the expense of its disastrous invasion of Afghanistan, Gorbachev introduced major reforms that overhauled the Soviet Union’s political and economic systems and loosened Moscow’s grip on the Communist bloc. These reforms were known collectively as perestroika (economic and government restructuring) and glasnost (openness). By 1990 the Soviet economy had collapsed, the Communist governments of several Eastern European countries had been ousted, and, in the most important symbolic act of all, the Berlin Wall had been dismantled, effectively bringing the Cold War to a decisive conclusion.

See also The Cuban Missile Crisis

See also The Marshall Plan

See also McCarthyism

Hollywood’s Cold War

The culture of the Cold War was captured, and to some extent created, by the Hollywood film industry. In the 1950s movie studios churned out anticommunist pictures by the score, partly in response to consumer demand, but also to deflect the scrutiny many top filmmakers had come under from “commie-hunters” in the nation’s capital. Studios, with the blessing of Washington, had made numerous pro-Soviet movies during World War II—after all, the Soviets were U.S. allies at the time. These movies included Mission to Moscow (1943) and Song of Russia (1944). By the time the House Committee on Un-American Activities (HUAC) was in full swing in the 1950s, however, those films had come back to haunt their makers. The HUAC-led backlash against Hollywood resulted in the production of such anticommunist propaganda films as I Married a Communist (1950), I Was a Communist for the FBI (1951), and Invasion U.S.A. (1952).

One of the best Cold War dramas was The Manchurian Candidate (1962), which addressed the prospect of U.S. soldiers captured during the Korean War being brainwashed by the Chinese to become political assassins upon receipt of a signal from their handlers. The movie, based on a 1959 novel by Richard Condon, starred Laurence Harvey as a war-hero-turned-assassin.

Hollywood turned its attention to a different aspect of the Cold War in the 1960s: the specter of nuclear annihilation. Two 1964 films exemplified this theme. Fail-Safe, based on a popular novel of the same title, was about a technical foul-up that accidentally sends American nuclear bombers on their preassigned mission toward Moscow. The other movie, Stanley Kubrick’s Dr. Strangelove, or: How I Learned to Stop Worrying and Love the Bomb, was a satire in which actor Peter Sellers played several roles, including the president of the United States and the wheelchair-bound, ex-Nazi, nuclear weapons expert Dr. Strangelove.

Simultaneously, a separate line of movies focusing on espionage was hitting the nation’s cinemas, including the wildly popular James Bond movies based on books by British writer and former intelligence officer Ian Fleming (1908–1964). These movies, which included Dr. No (1962), From Russia with Love (1963), and Goldfinger (1964), spawned a huge wave of Bond-influenced spy thrillers for both the cinema and television.

After the thaw of the 1970s, Hollywood joined with President Ronald Reagan (1911–2004) in renewing tensions with the Soviet Union with a new wave of anticommunist movies. Red Dawn (1984) showed a band of brave American teenagers rebelling guerrilla-style in a United States suffering under Soviet occupation. The 1985 movie Invasion U.S.A.—unrelated to the 1952 film of the same name—pitted martial arts expert Chuck Norris against a small army of Soviet infiltrators sent to destroy American society. Sylvester Stallone’s Rambo III (1988) had the title character battling Russian villains in Afghanistan.

By the end of the 1980s, the Cold War was winding down. A popular, if sometimes hackneyed, Hollywood genre died along with it.

The Korean War

The Korean War (1950–1953) was the first conflict in which U.S. military forces were fully engaged in the years following World War II. The United States fought on behalf of South Korea in hopes of preventing its occupation by Communist North Korea, backed first by the Soviet Union and later China. The war lasted just over three years, and resulted in about 140,000 American casualties, including approximately 37,000 deaths.

The Korean peninsula was under Japanese control from nearly the beginning of the twentieth century through World War II. After the Allies defeated the Japanese, Korea was partitioned into two separate countries for occupation purposes. The Soviets controlled the area north of the 38th parallel, where they set up a socialist regime. In 1948 that regime officially became the Democratic People’s Republic (DPR) of Korea, with longtime Communist leader Kim Il Sung (1912–1994) as the new nation’s first premier. In the South, many different factions struggled for power before a United Nations–sponsored election resulted in the selection of Syngman Rhee (1875–1965), widely called the father of Korean nationalism, as president of the newly formed Republic of Korea (ROK).

The division of Korea was supposed to be temporary, but both sides had their own ideas about how reunification should be structured. Military incursions across the border took place constantly, though the ROK army was ill equipped and poorly trained, while the DPR’s armed forces had access to the best Soviet weapons and training. Rhee’s government was riddled with corruption, and engaged in repressive tactics against political dissenters. From the U.S. perspective, one objective was paramount: Korea could not be allowed to reunify under Communist rule. In spite of the potential for conflict, the United States withdrew most of its troops in June 1949, leaving only a small group of about five hundred technical advisers in place to help train the South Korean military.

Surprise Attack by DPR

On June 25, 1950, North Korean troops stormed across the border without warning. They met with little resistance from the shocked and outgunned South Koreans, and within a couple days they had advanced to the outskirts of the South Korean capital, Seoul. The United Nations (UN) Security Council immediately convened a special meeting, and passed a resolution calling on the North Koreans to withdraw instantly. The resolution was ignored. Two days later, the Security Council met again and passed another resolution authorizing UN members to intervene as necessary to turn back the assault. President Harry S. Truman (1884–1972) quickly accepted the mandate, committing U.S. air and naval forces to the conflict—technically a “police action,” since war had not been officially declared. Fifteen other countries offered assistance, some of it substantial, some of it largely symbolic.

Initially, U.S. military involvement was limited to U.S. Air Force sorties from Japan and carrier strikes by the U.S. Navy, leaving the overmatched South Korean army to hold its own on the ground. It was not up to the task, and on June 28 North Korean forces took Seoul. By early August, enough UN forces had arrived to bolster the sagging ROK army and create a stable line of defense around the important port of Pusan. In mid-September 1950, General Douglas MacArthur (1880–1964), commander of U.S. forces in the Far East and supreme commander of the UN forces, launched a brilliantly conceived amphibious attack at Inchon, a port city on the western coast near Seoul. American forces recaptured Seoul about two weeks later. MacArthur’s successful tactics threatened to encircle the overextended North Korean army, forcing it to retreat back above the 38th parallel.

With the momentum having swung in favor of the UN and American forces, a crucial decision had to be made at this point: should they continue to pursue the enemy and advance across the 38th parallel into North Korean territory? Yielding to public pressure, and given the go-ahead by most UN members, Truman—sensitive to criticism throughout his presidency of being “soft” on communism—authorized MacArthur to continue moving northward into North Korea, crossing the 38th parallel on October 1. UN forces took the DPR capital of P’yŏngyang on October 19. By late November, UN and ROK forces had driven the North Korean army nearly to the Yalu River, the boundary between North Korea and the People’s Republic of China.

Chinese Entry

Then the momentum of the war reversed once again. China, which had said all along that an invasion of North Korea would not be tolerated, began to send vast numbers of conscripts to assist the North Korean army. Thousands of Chinese soldiers crossed the Yalu and attacked the vulnerable flank of MacArthur’s forces, and now it was the UN’s turn to retreat. P’yŏngyang was abandoned on December 5 and on January 4, 1951, Communist forces once again captured Seoul.

A new battle line was established south of the 38th parallel, and over the next few months it shifted back and forth without either side making decisive progress. By July 1951 the conflict had become more or less a stalemate. MacArthur wanted to break the deadlock by attacking China, possibly with atomic weapons, but Truman was firmly against that approach. When MacArthur continued to press the matter, Truman eventually became fed up and relieved him of his command on April 11, 1951. In July of that year, the two sides began negotiating an end to hostilities. The talks, however, broke down repeatedly amid accusations of germ warfare and disagreements about plans to exchange prisoners.

It ultimately took two years to arrive at an agreement acceptable to all parties. By then the United States had a new president, Dwight D. Eisenhower (1890–1969), and thousands more soldiers had been killed and wounded. A final armistice was signed at P’anmunjŏm on July 27, 1953, calling for a cease-fire and the withdrawal of both armies from a battle line that stretched the entire width of the peninsula, not far from the original border at the 38th parallel. Both sides claimed victory, but the mutual goal of a reunified Korea was not achieved.

The Return of MacArthur

Few World War II military commanders achieved greater fame than General Douglas MacArthur (1880–1964). A hero of the war in the Pacific, MacArthur went on to lead the United Nations troops battling against North Korea a few years later. His trademark sunglasses and pipe made his likeness one of the most recognizable images of the era.

MacArthur was born on January 26, 1880, in Little Rock, Arkansas. He seemed destined for a military career; many in his family had been career soldiers, and his father, Arthur MacArthur, was himself a prominent general. As a child, Douglas MacArthur was a mediocre student. It was not until he entered the military academy at West Point in 1899 that he began to stand out in the classroom. He graduated from West Point in 1903 at the very top of his class.

MacArthur’s first military assignment took him to the Philippines, and his climb through the army’s ranks was rapid. In 1906 he was appointed aide-de-camp to President Theodore Roosevelt (1858–1919). During World War I, he earned a reputation as a skilled and daring leader, along with a variety of honors and decorations. After the war, he was promoted to brigadier general and was made superintendent of West Point, a position he held until 1922.

From 1930 to 1935, MacArthur served as chief of staff of the U.S. Army. During this period, an incident took place that most of his admirers would prefer not to remember. In 1932, during the heart of the Great Depression, MacArthur led a brutal assault on a gathering of thousands of ragged World War I veterans who were in Washington, D.C., to request that Congress dispense their war service bonuses earlier than scheduled. MacArthur set upon the shantytown, where the men were encamped with their families, with tanks and columns of bayonet-wielding soldiers. He rationalized the attack by insisting that he had just quelled an impending communist uprising.

MacArthur was sent back to the Philippines in 1935, and he continued to live there after retiring from the army. In July 1941 he was recalled to active duty as commander of U.S. forces in the Far East. MacArthur’s forces were driven from the Philippines when the Japanese attacked the islands, prompting the general to utter his most famous line, “I shall return,” in 1942. He made good on that prediction two years later, after battling his way back through the South Pacific from Australia, giving him an opportunity to deliver another, less famous, line, “I have returned.… Rally to me.” For his exploits, MacArthur was rewarded with the privilege of accepting Japan’s surrender on September 2, 1945.

When the war was over, President Harry Truman (1884–1972) named MacArthur supreme commander of Allied occupation forces in Japan, from where he launched the brilliant attack at Inchon that drove the North Koreans out of the South in the early stages of the Korean War. As the war dragged on, however, the headstrong MacArthur’s defiance eventually crossed the line with Truman, and MacArthur was relieved of his command in April 1951.

McCarthyism

During the late 1940s and early 1950s, U.S. Senator Joseph McCarthy (1908–1957), a Wisconsin Republican, led a tidal wave of anticommunist political repression in the United States. McCarthy and his allies claimed that communists had infiltrated the federal government and other institutions and were threatening the American way of life. The attacks were often baseless, but they nevertheless destroyed the careers of thousands of individuals, some of whom had done nothing more than attend a left-wing political meeting ten or fifteen years earlier. Originally associated with generic Cold War anticommunism, the term McCarthyism eventually came to refer to mean-spirited, groundless accusations rooted in paranoia and characterized by political grandstanding.

Seeds of Anticommunism

Communism was under attack by conservatives in the United States long before the onset of the Cold War. The industrialization that took place in the late nineteenth and early twentieth centuries gave rise to a fairly large and active socialist movement in response to horrible working conditions and poor wages. Large segments of the labor movement embraced socialist philosophies, and the movement gained momentum with the arrival of waves of European immigrants who brought with them traditions of militant labor activism. The dawn of the Great Depression in 1929 sparked a period of dramatic growth for communism in the United States, as Americans searched for a response to the economic upheaval the nation was experiencing. Communist rhetoric became common among displaced workers as well as artists and intellectuals.

As the 1930s progressed, many people who had embraced communism began to sour on the ideology as news spread of such world events as the brutal purges carried out by Soviet leader Joseph Stalin (1879–1953) and the Soviets’ signing of a nonaggression pact with Nazi Germany. There was also a backlash among some conservatives against President Franklin Roosevelt’s (1882–1945) New Deal policies, which were too socialistic for their tastes.

By the time the United States entered World War II in late 1941, communism had largely fallen out of favor in the United States. In 1938 the House of Representatives formed the Committee on Un-American Activities (HUAC). Two years later the Smith Act was passed, making it illegal to advocate the violent overthrow of government. Various loyalty programs designed to weed out communists from jobs in the federal government were put into place over the next few years.

Growing Paranoia

After the end of World War II in 1945, halting the spread of communism became a central theme of American policy both at home and abroad. In 1947 President Harry Truman (1884–1972) signed an executive order barring all communists and fascists from government work, and the following year, Communist Party leaders in the United States were prosecuted under the Smith Act. Communist paranoia continued to escalate as the Soviet Union expanded its global reach. The 1949 victory of Communist forces under Mao Zedong (1893–1976) in China’s civil war further unsettled American nerves. Another law, the McCarran Internal Security Act, was passed by Congress in 1950, virtually outlawing communism altogether. The act went even further than that, criminalizing even those shown simply to have a “sympathetic association” with undesirable organizations and individuals. In spite of these actions, conservative critics continued to attack the Truman administration for being too soft on communism. Such was the national mood when Senator McCarthy made his appearance in the national spotlight.

McCarthy was elected to the U.S. Senate in 1946 after a brief and undistinguished career as a lawyer and circuit court judge in his home state of Wisconsin. McCarthy’s first few years in the Senate were fairly uneventful. That changed abruptly in February 1950, when McCarthy announced in a speech to the Republican Women’s Club of Wheeling, West Virginia, that he was in possession of a list—presumably written on the piece of paper he was waving around—of the names of 205 known members of the Communist Party who were currently working for the U.S. State Department. The number tended to shift in subsequent versions of the claim over the weeks that followed.

McCarthy’s claim created a nationwide stir, coming close on the heels of the conviction of former State Department official Alger Hiss (1904–1996) for perjury related to his testimony about involvement with Soviet espionage agents. McCarthy moved quickly to exploit his newfound fame. He was extremely skillful at manipulating the media. At no point did he manage to produce concrete evidence to back up his claims, but it did not seem to matter, even after a Senate subcommittee investigated his allegations in the spring of 1950 and found them to be baseless. With the onset of the Korean War and the arrests of Julius (1918–1953) and Ethel (1915–1953) Rosenberg for allegedly spying for the Soviets, Americans were primed to believe in the authenticity of McCarthy’s “Red scare” assertions. The Republicans happily played on these fears, laying the blame for the spread of communism in the United States squarely on the shoulders of the Democrats, who had controlled the federal government for twenty years. McCarthy went so far in 1951 as to call this period of Democratic domination, and in particular the leadership of Truman’s secretary of state, George C. Marshall (1880–1959), part of “a conspiracy so immense and an infamy so black as to dwarf any previous such venture in the history of man.” In this sense, McCarthy’s tactics were an unqualified success; in 1952 the Republicans seized control of Congress and, with the election of Dwight D. Eisenhower (1890–1969), the White House.

The Role of the FBI

Throughout the McCarthyism period, Senator McCarthy himself never actually documented the existence of a single communist in a government job, but his power to deflate his political enemies with false accusations was enormous. He also had a powerful ally in Federal Bureau of Investigation (FBI) director J. Edgar Hoover (1895–1972), perhaps the most virulent anticommunist in the federal government. The FBI provided much of the information, sketchy though it often was, that fueled the investigations and prosecutions of suspected communists and communist sympathizers. So eager was Hoover to expose the “Red menace” that he regularly resorted to such underhanded methods as unauthorized and often illegal wiretaps, break-ins, and media leaks.

Once identified as a communist or sympathizer, many individuals were forced to testify before one of several investigating bodies, most notably the House Un-American Activities Committee. HUAC members would browbeat their prey—some of whom were there on the basis of the flimsiest of evidence gathered through questionable means—into not only admitting their own past ties to the Communist Party, but also informing on others who had participated with them. Those who refused to cooperate could claim their Fifth Amendment right to avoid self-incrimination, but they usually lost their jobs just the same, and saw their lives thrown into chaos.

People who defied HUAC often learned the hard way that their name had been placed on an unofficial “blacklist.” Many industries had blacklists containing the names of people who were no longer employable because they had been identified as communists or communist sympathizers. The most famous blacklist was the one for the Hollywood movie industry. It included the names of the so-called Hollywood Ten, a group of screenwriters and directors who had stood up to HUAC in 1947.

Discredit and Censure

The threat of losing one’s job turned out to be a powerful deterrent, leading people to avoid association with any organization that could be remotely considered leftist. Thousands of people were fired during the peak years of McCarthyism. Academia was a favorite target; about 20 percent of those called to testify before a state or federal investigative body were college faculty or graduate students.

Ironically, the Republican takeover of Congress turned out to be the beginning of the end of McCarthyism. With Republicans in control in Washington, McCarthy could no longer weave tales of communist conspiracies within the federal government. He turned his attention instead to the military, which turned out to be a major strategic blunder. With World War II still fairly fresh in the minds of most Americans, the military was generally revered by the public. McCarthy’s new round of attacks was met with hostility, and, perhaps more important, he lost the support of the Eisenhower administration, whose leader was himself a military hero.

It soon became clear that McCarthy had invented many of his accusations out of thin air, and while Americans still feared the spread of communism as much as ever, they lost their taste for the witch-hunt. In 1954 McCarthy was censured by the Senate for his misconduct. He died three years later a bitter and disgraced has-been. By the late 1950s the repression and hysteria that characterized the McCarthyism era had largely evaporated, and Americans understood that many of their neighbors had been unjustly ruined.

The Hollywood Ten

Artists and intellectuals, with their enduring interest in maintaining freedom of expression, have long been attracted to left-wing politics. The Hollywood film industry is no exception, from Jane Fonda’s activism against the Vietnam War to the outspoken progressive views of Tim Robbins in the Iraq War era. In the fall of 1947, ten prominent movie writers and directors were subpoenaed to appear before the House Committee on Un-American Activities (HUAC) as part of a broad investigation into “the extent of Communist infiltration in the Hollywood motion picture industry.” Dubbed the “Hollywood Ten,” these artists were Alvah Bessie, Herbert Biberman, Lester Cole, Edward Dmytryk, Ring Lardner Jr., John Howard Lawson, Albert Maltz, Samuel Ornitz, Robert Adrian Scott, and Dalton Trumbo.

All ten members of the group refused to answer the committee’s questions, on the grounds that their right to hold and communicate whatever personal and political beliefs they wanted was protected by the First Amendment to the Constitution. HUAC saw things differently, and as a result of their refusal to cooperate, the Hollywood Ten were tried in federal court and found guilty of contempt. They were each sentenced to one year in jail and fined $1,000. In addition, they were blacklisted from the movie business. For years most of them were able to eke out a living by working in the film industries of other countries, or else by using a pseudonym to continue working in Hollywood. The most famous pseudonym used by a member of the Hollywood Ten was Robert Rich, who won an Oscar in 1956 for writing The Brave One. Rich was actually the blacklisted Trumbo.

The example of the Hollywood Ten ushered in a dark period for Hollywood. Called to appear before HUAC, some three hundred witnesses from Hollywood clung to their own careers by admitting their previous communist affiliation and naming others whom they knew to have similar connections in their past. Many of the film industry’s brightest stars took the high road by refusing to name names, and paid for the noble gesture with their careers, at least temporarily.

Of the original Hollywood Ten, only Dmytryk backed down and later cooperated with HUAC. He appeared a second time before the committee and named twenty-six members of left-wing groups.

Hollywood blacklisting finally subsided in the late 1950s along with the anticommunist hysteria that had gripped the entire nation. Many formerly blacklisted writers and directors were able to rebuild their careers. Trumbo was the first of the Hollywood Ten to successfully resume work in Hollywood under his own name, appearing in the credits of the classic Spartacus (1960), which he adapted from the novel of the same name by fellow blacklistee Howard Fast. The climactic scene in Spartacus reflects Trumbo’s own traumatic McCarthyist experience, as captured rebel slaves refuse to betray Spartacus to the ruthless Crassus. As a result, Crassus has the slaves crucified one by one.

See also The Cold War

See also Alger Hiss

See also Julius and Ethel Rosenberg

The Civil Rights Movement

While the struggle for racial equality in the United States has been ongoing since the arrival of the first Africans in North America, the civil rights movement of the 1950s and 1960s was an especially critical period, marked by intensive activity, protest, and substantial progress in reducing discrimination in employment, education, politics, and housing.

Roots of the Movement

The roots of the civil rights movement were planted nearly a century earlier, during the post–Civil War period known as the Reconstruction. After the war, the victorious North tried to impose economic and social changes on the vanquished South. The Thirteenth through Fifteenth Amendments to the Constitution—known collectively as the Civil War Amendments—granted fundamental citizenship rights to blacks, but the reality in the South was quite different. Even as eighteen states in the North and West were putting antidiscrimination laws in place by the end of the century, Southern states were moving in the opposite direction, creating a system of laws designed to prevent African-Americans from attaining equal status with whites. These laws became known as Jim Crow laws. They essentially institutionalized racial segregation and discrimination, ensuring that blacks would remain second-class citizens in the South, where 90 percent of the nation’s African-Americans lived at the time. Jim Crow laws were accompanied by a wave of violence by white supremacist groups against African-Americans who dared challenge this systematic discrimination. By the dawn of the twentieth century, most black males in the South were disenfranchised, victims of such discriminatory policies as poll taxes and literacy tests required in order to vote. (Women of any race could not vote either.)

Institutionalized racism was not confined to the South. The federal government tacitly approved of Jim Crow policies when the U.S. Supreme Court ruled in Plessy v. Ferguson (1896) that the policy of “separate but equal” facilities for African-Americans did not violate the Constitution.

African-Americans began to battle back against Jim Crow in the early 1900s, though much of the protest took place in the North, where it was less dangerous to speak out. Massachusetts native W. E. B. Du Bois (1868–1963) was one of the most visible and vocal crusaders for racial justice during this period. In 1905 he launched an equality initiative near Niagara Falls called the Niagara Movement, but it lasted only a few years and accomplished little. In 1909 Du Bois and a handful of other individuals of diverse ethnicities founded the National Association for the Advancement of Colored People (NAACP). The NAACP worked for racial equality through a combination of litigation and public education, including publication of its own magazine, The Crisis, edited by Du Bois.

Progress after World War II

The NAACP scored some notable legal victories beginning in the 1930s and early 1940s. Several law schools and graduate schools were desegregated because of the group’s efforts, and through litigation culminating in the U.S. Supreme Court case Smith v. Allwright (1944) they were able to halt the formal exclusion of blacks from party primary elections in the South. Other efforts were less successful, however. Prodded by the NAACP, the U.S. House of Representatives passed strong antilynching legislation in 1937 and 1940, only to see the bills thwarted in the Senate by filibusters orchestrated by members from Southern states.

The civil rights movement gained much momentum after World War II, as Americans began to understand the hypocrisy of segregation and discrimination after years of witnessing the Nazi version of race policy. During this period, a vast migration of blacks from the rural South into Northern cities took place. As these individuals began registering to vote, a new and potentially powerful political force was created. In the North, a growing number of religious and labor leaders, intellectuals, and others were dissatisfied with the racist status quo, and began organizing to bring about change. In 1948 President Harry S. Truman (1884–1972) included a broad civil rights plank in his campaign agenda. It cost him the support of many white Southerners, a voting bloc that had long been reliably Democratic. A number of these people splintered from the Democrats to create a new party, the Dixiecrats, who fielded their own presidential candidate, Strom Thurmond (1902–2003). Truman prevailed in spite of this defection, with the help of 70 percent of the votes of blacks in the North.

A watershed moment in the civil rights movement occurred in the late 1940s, when the NAACP, driven by its chief legal counsel (and later Supreme Court Justice) Thurgood Marshall (1908–1993), began confronting segregation head-on through a series of landmark legal cases. Marshall’s chief argument was that “separate but equal” was basically nonsense, since separate facilities, whether in education, housing, or other contexts, almost always meant inferior facilities for blacks. This drive culminated in the U.S. Supreme Court’s 1954 decision in Brown v. Board of Education of Topeka, which brought the doctrine of “separate but equal” to an end with regard to public schools.

Brown v. Board of Education provided a surge of momentum to the civil rights movement, as African-American activists and their allies sought to extend the victory to other corners of society beyond education. Meanwhile, Southern whites saw the case as an omen that their comfortably segregated way of life was under attack. They responded by digging in stubbornly against the progress of the civil rights movement.

Rosa Parks

In December 1955 Rosa Parks (1913–2005), the secretary of the Alabama NAACP chapter, was arrested after refusing to give up her seat to a white man on a Montgomery city bus, as was required by a city ordinance. Following her arrest, the NAACP, local churches, and other individuals and organizations launched a boycott by African-American riders of the Montgomery bus system. One of the boycott’s chief organizers was Martin Luther King Jr. (1929–1968), at the time a young local preacher who advocated nonviolent resistance and civil disobedience as the best route to social change. The Montgomery boycott lasted for more than a year. In the end, in spite of periodic violence that included the firebombing of King’s house, the boycott was successful. In late 1956, the Supreme Court ruled that Montgomery’s discriminatory bus law was unconstitutional.

Momentum for civil rights activism continued to grow with each victory. In 1957 King founded the Southern Christian Leadership Conference (SCLC) in order to help develop the generation of leaders who would drive the movement in the coming years. Still, the movement met with fierce resistance from racist groups, primarily in the South. In 1957 President Dwight D. Eisenhower (1890–1969) sent federal troops to Little Rock, Arkansas, when local authorities refused to allow nine African-American students to enroll in the previously all-white Central High School. A few years later, President John F. Kennedy (1917–1963) deployed U.S. marshals to ensure that James Meredith (1933–) could enroll as the first black student at the University of Mississippi in 1962.

Meanwhile, a steady series of smaller protest actions took place across the South. Many of these actions were carried out by the Student Nonviolent Coordinating Committee (SNCC, pronounced “Snick”), a youth-oriented organization that had branched off from the SCLC. On February 1, 1960, a group of four students sat down at a whites-only Woolworth’s lunch counter in Greensboro, North Carolina, and refused to leave until physically evicted. A series of similar sit-ins followed, targeting not only lunch counters but also segregated theaters, swimming pools, and other facilities. Segregated interstate buses and bus stations were another target. In 1961 a group of these “Freedom Riders” organized by James Farmer (1920–1999), cofounder nearly twenty years earlier of the Congress of Racial Equality (CORE), boarded segregated buses traveling from Washington, D.C., into the South, where they were beaten viciously by Southern racists in some cities, receiving no help from local police. Segregation in interstate transportation was finally ended in September of that year when the Interstate Commerce Commission implemented the Supreme Court’s 1960 decision in Boynton v. Virginia, which deemed such segregationist transportation policies unconstitutional.

Civil Rights Act of 1964

By 1963 racial polarization in the South had escalated to dangerous levels. Violence was widespread. For several days in May, police in Birmingham, Alabama, beat and set attack dogs on followers of King engaged in nonviolent protests. These brutal assaults were captured by television news cameras. Their broadcast outraged much of the American public, and led President Kennedy to address the nation on June 11 calling on Congress to enact strong civil rights legislation. One of the civil rights movement’s defining moments took place on August 28, 1963, when a coalition of African-American groups and their allies organized a huge march on Washington to promote passage of the civil rights bill that had been introduced in Congress. This was the event at which King delivered his famous “I have a dream” speech before hundreds of thousands of supporters of many races.

After the assassination of Kennedy in November 1963, incoming President Lyndon B. Johnson (1908–1973) made passage of the civil rights bill a priority of his administration. Behind fierce lobbying by religious groups of many different denominations, the bill was signed into law by Johnson on July 2, becoming the Civil Rights Act of 1964. The act dealt a tremendous blow to the legal structures that had allowed segregation and racial discrimination to survive up to that time. It prohibited discrimination in places of public accommodation, such as restaurants, motels, and theaters; it denied federal funding to programs with discriminatory policies; it established the Equal Employment Opportunity Commission; and it outlawed discrimination in private businesses with twenty-five or more employees and in labor unions.

Meanwhile, the violence raged on. In 1963 Medgar Evers, field secretary of the NAACP in Mississippi, was gunned down while organizing a boycott to protest voter discrimination. In June 1964, two white civil rights activists, Andrew Goodman and Michael Schwerner, and an African-American associate, James Chaney, were murdered while promoting voter registration among blacks in Mississippi. In spite of the Civil Rights Act, discrimination in election policies was rampant. In 1965 King led a march from Selma, Alabama, to Montgomery to protest voting restrictions that unfairly disenfranchised African-Americans. More than twenty-five thousand people participated in the march, making it one of the largest civil rights protests of the 1960s. The march had a powerful impact; Congress soon responded by passing the Voting Rights Act of 1965, which prohibited the use of literacy tests and other discriminatory policies to filter out minority voters.

Fragmentation of the Movement

In spite of this progress, the movement began to disintegrate in the mid-1960s. SNCC began to lose patience with the nonviolent approach advocated by King, who had been awarded the Nobel Peace Prize in 1964 for his inspiring leadership. SNCC leader Stokely Carmichael (who later changed his name to Kwame Ture; 1941–1998), later influential in the rise of the militant group the Black Panthers, was vocal in his criticism of King’s pacifist philosophy. He began promoting distrust of whites who supported the movement, urging African-Americans to become more aggressive and self-sufficient. This approach was threatening to many white liberals, and it alienated many less-militant African-Americans as well. Bursts of violence broke out in large cities, the first major flare-up being the six-day Watts riot in South Central Los Angeles, California, in August 1965.

As these new, more militant factions of the civil rights movement grew angrier, the appeal for black separatism became louder. The Black Muslims were among those calling for a reorganization of African-American culture that would eliminate dependence on the white establishment. Large race riots took place in Detroit, Michigan, and Newark, New Jersey, during the summer of 1967. As African-American protest grew increasingly violent, many white liberals shifted their energies to protesting against the Vietnam War.

King was assassinated on April 4, 1968, in Memphis, Tennessee. The murder of King touched off riots in 125 cities over the next week. Parts of Washington, D.C., were in flames for three days. Congress reacted by passing the Civil Rights Act of 1968, the most important part of which was Title VIII, known as the Fair Housing Act. The act outlawed discrimination in the sale and rental of most housing.

By this time, however, the civil rights movement was fragmented beyond recognition. Emphasis during the 1970s was on programs aimed at compensating for the impact of past discrimination by giving special consideration to minority applicants for jobs, university admissions, government contracts, and the like. This approach to reducing racial inequality is called “affirmative action.” Critics of affirmative action tend to characterize it as “reverse discrimination” because it uses race rather than other qualifications to award opportunities.

During the 1980s national politics took a dramatic conservative swing, and there was considerable backlash against some of the civil rights movement’s gains of the 1960s. The administration of President Ronald Reagan (1911–2004) reduced the number of lawyers in the Civil Rights Division of the U.S. Department of Justice from 210 to 57. Reagan went so far as to attempt—unsuccessfully—to disband the U.S. Commission on Civil Rights entirely. The Supreme Court under conservative Chief Justice William Rehnquist (1924–2005) chipped away further at government protections against racial discrimination. In a batch of 1989 rulings, the Court said, among other things, that (1) the Civil Rights Act of 1866 did not protect blacks from racial harassment by employers; (2) the burden of proof in employment discrimination cases was with the employee, not the employer; and (3) programs setting aside a portion of city contracts for minority businesses were unconstitutional unless there was evidence of flagrant discrimination. The Court has continued to struggle with the parameters of affirmative action into the twenty-first century.

Mississippi Burning

One of the most shocking events associated with the civil rights movement was the 1964 murder in Philadelphia, Mississippi, of three young activists who were helping to organize a voter registration drive. The three men were James Chaney, a twenty-one-year-old African-American from Mississippi; Andrew Goodman, a white, twenty-year-old anthropology student from New York; and Michael Schwerner, a white, twenty-four-year-old social worker also from New York.

The trio arrived in Philadelphia on June 20, 1964, soon after finishing a weeklong training session in Ohio on minority voter registration strategies. At about 5:00 p.m. their car was stopped by Neshoba County Deputy Cecil Price. Chaney was arrested for allegedly speeding, and his passengers were detained “for investigation.”

During their time in jail, the three were not allowed to make phone calls, and when their fellow civil rights workers called the jail, the secretary was ordered to lie and say the men were not there. They were finally released at 10:30 p.m. and ordered to leave the county. Police soon found the charred remains of their car. Local authorities and Mississippi Governor Paul Johnson played down the incident, suggesting that Schwerner, Goodman, and Chaney were probably fine and had just gone somewhere else to make trouble. However, after offering a $25,000 reward for information about the men’s whereabouts, the Federal Bureau of Investigation received a tip that led agents to a farm six miles southwest of Philadelphia, where they found the bodies of the three men on August 4, 1964. Autopsies showed that all three had been shot and that Chaney had been severely beaten as well.

Eighteen suspects went to trial in 1967, and seven of them were found guilty, but only of civil rights violations. The event was recapped in several films over the next few decades, most notably the 1988 motion picture Mississippi Burning, starring Willem Dafoe and Gene Hackman, but no further legal action took place. Finally, after years of intensive scrutiny by award-winning investigative reporter Jerry Mitchell of the Jackson Clarion-Ledger, the case was reopened. In 2005 Edgar Ray Killen, a local minister who had been a prime suspect in the case from the start but had been set free in 1967 by a deadlocked jury, was convicted of three counts of manslaughter.

See also Brown v. Board of Education of Topeka

See also Martin Luther King Jr.

See also Rosa Parks

See also Affirmative Action

The National Association for the Advancement of Colored People

The National Association for the Advancement of Colored People (NAACP) was founded in 1909 by a multiracial group of activists based in New York City. Its mission is to ensure the political, educational, social, and economic equality of all Americans by eliminating racial hatred and discrimination.

As the nation’s oldest and largest civil rights organization, the NAACP played a key role in the civil rights movement in the United States, particularly during the 1950s and 1960s. The NAACP’s strategy stressed using the federal courts to attack segregation laws and practices and secure the rights of all Americans, as guaranteed by the U.S. Constitution.

A note about language: At the time of the NAACP’s founding, the words colored and Negro were the accepted terminology for describing nonwhite people of African descent. In the late 1960s, the word black started to be preferred. In the 1980s, the term African-American emerged as the descriptor of choice.

The NAACP and the Civil Rights Movement

The NAACP’s strategy for promoting racial equality, and guaranteeing it in law, was to fight in state and federal courts to strike down segregationist laws. The impetus for attacking racism through the judicial system stemmed from the NAACP’s belief that the U.S. Supreme Court had ruled incorrectly in Plessy v. Ferguson (1896), which legalized segregation by establishing the “separate but equal” principle.

The NAACP’s attorneys, among them future Supreme Court Justice Thurgood Marshall (1908–1993), argued that under segregation, the conditions in which blacks lived were separate but never equal to the higher standards afforded whites. The NAACP’s legal challenges and victories revolved around that idea, culminating in the Supreme Court decision in Brown v. Board of Education of Topeka (1954), which rejected Plessy by declaring the concept of “separate but equal” unconstitutional in public education. The Brown decision inspired the marches and demonstrations of the civil rights movement by motivating a previously intimidated black populace with the possibility of change. The resulting protests and activism helped secure the enactment of the Civil Rights Act of 1964, the Voting Rights Act of 1965, and the Fair Housing Act of 1968.

The NAACP and Other Civil Rights Organizations

While courtroom victories could strike down state-sponsored discrimination, changing a predominantly white nation’s long-held attitudes toward blacks could not be accomplished with the simple strike of a gavel. To civil rights leaders such as the Reverend Martin Luther King Jr. (1929–1968) and his Southern Christian Leadership Conference (SCLC), and groups such as the Congress of Racial Equality (CORE) and Student Nonviolent Coordinating Committee (SNCC, pronounced “snick”), the battle for equal rights also needed to be waged through emotional entreaties. That is, to enlist the energies of sympathetic whites and the masses of black Americans, the civil rights movement needed to emphasize the moral reasons for equality, and by doing so motivate ordinary people to act in force by expressing their heartfelt desire for equal rights and protection under the law.

At times, the NAACP viewed King’s theatrical speeches and his dramatic but nonviolent demonstrations, marches, and acts of defiance as showboating intended, in part, to enhance his own celebrity. King and his similarly passionate followers thought that while the NAACP’s efforts at working through the judicial system were important, the results were too slow in coming. A single legal victory, for instance, was typically preceded by a decade of work, and public adherence to—and the legal enforcement of—the rights secured by such victories often became battles unto themselves. As is often the case with activist causes, the ultimate goals of the various civil rights groups were the same; only the means for getting there differed.

As King himself observed about the civil rights movement’s various approaches, on more than one occasion: “The law cannot make a man love me, but it can keep him from lynching me.” The fight for equality, he said, was “a three-lane road with some emphasizing the way of litigation and mobilizing forces for meaningful legislation, and others emphasizing the way of nonviolent direct action, and still others moving through research and education and building up forces to prepare the Negro for the challenges of a highly industrialized society.”

Reactions to Brown

The Brown decision in 1954 raised the ire of segregationists, which led to an escalation of violence against blacks. However, by showing black Americans that their lot in life could change for the better, “the Negro has seen the exit sign to freedom,” said King. “The whole nation put itself on record then as saying that segregation is wrong.” Speaking about the decision in 1960, NAACP leader Roy Wilkins (1901–1981) noted the irony of his organization’s work being considered controversial: “We ask nothing that is not guaranteed by the American Constitution, that has not been affirmed and reaffirmed in the nation’s noblest expressions of democratic faith from the Declaration of Independence to the United Nations Charter, that is not rooted in our accepted Judeo-Christian ethic.”

After Brown, the NAACP’s profile was raised so high that the group and its members were targeted by extremists and politicians bent on disrupting racial progress. In Alabama, state legal action effectively barred the NAACP from operating for nearly a decade. NAACP activists including Harry T. Moore and Medgar Evers were assassinated at their homes. (In 1995 the latter’s wife, Myrlie Evers-Williams, would become chairman of the NAACP’s board of directors.) Because NAACP activities were often less visible—engaging in legal proceedings, for example—some historians credit King and SNCC with making the civil rights movement the powerful cultural force that it was. At the same time that violence against civil rights activists was building in the South, the NAACP was working behind the scenes in Washington. It successfully lobbied the administration of President John F. Kennedy (1917–1963) for assistance in securing the rights of black Americans. After Kennedy’s assassination in November 1963, President Lyndon B. Johnson (1908–1973) continued his predecessor’s efforts and signed the Civil Rights Act of 1964, which outlawed segregation in public places. The following year, Johnson signed into law the Voting Rights Act of 1965.

The Post–Civil Rights Era

Within days of President Johnson signing the Voting Rights Act, racial riots exploded in the Watts section of Los Angeles, California; in the following years other riots occurred in Cleveland, Ohio; Detroit, Michigan; Newark, New Jersey; Washington, D.C.; and other cities. In April 1968 Martin Luther King Jr. was assassinated in Memphis, Tennessee, by a white man named James Earl Ray. Two months later, Democratic presidential candidate and civil rights advocate Senator Robert F. Kennedy (1925–1968) was shot and killed by a deranged Palestinian immigrant, Sirhan Sirhan. That November, Republican Richard M. Nixon (1913–1994) won the presidency, replacing the Democrat Johnson who, bogged down by the Vietnam War, chose not to pursue a second full term in office.

Having gained legal victories in such areas as voting, housing, and education, NAACP leaders and other civil rights activists had to work to ensure that the new laws were being implemented and enforced. Frustrated at seeing integration efforts stalled by ingrained racism, young black people started to advocate for “Black Power” (a catchphrase of the Black Panther Party) in a movement that dismissed integration, espoused black self-determination, and promoted pride in being black; the movement was often viewed as antiwhite. Clashes between blacks and segregationists, and the police forces sympathetic to the segregationists, continued. When the Supreme Court ruled in Swann v. Charlotte-Mecklenburg Board of Education (1971) that busing children from black neighborhoods to white ones (and vice versa) could be federally mandated in order to achieve integration in public schools, white Americans launched violent protests. The issue of busing became a hot-button topic among whites and conservatives who were resistant to some or all of the changes brought on by the civil rights movement.

The influence and purpose of the NAACP during the Nixon years and after is not as easily defined as it was in the 1950s and 1960s. Because of the various civil rights acts, black Americans were beginning to have representatives in or near positions of power. Reliance on the NAACP was no longer as necessary. By the late 1970s the NAACP had serious financial and institutional problems. At the start of the Reagan era in 1981, the civil rights movement had lost both momentum and governmental support.

A century after its founding, however, the NAACP is still a primary player in challenging racial discrimination. The organization’s recent focus has been on equality cases of broad significance involving such issues as employment, education, housing, criminal law, and voting rights.

Affirmative Action

Affirmative action policies seek to redress the past discrimination and disadvantages inflicted upon minorities and women. Such policies often call for giving certain groups of people special consideration in employment and education in order to encourage diversity and promote equal opportunity.

Reasons for Affirmative Action

Prior to the civil rights movement of the 1950s, discrimination on the basis of race, ethnicity, gender, and religion was legal and rampant. Women and blacks were routinely denied educational opportunities at both public and private institutions of higher education, and both were excluded from entire fields of work. Newspaper job listings were gender specific, and women rarely worked outside the home once they married or became pregnant. Black people were routinely relegated to menial jobs, regardless of their education or talents. At the time the United States was founded, women and nonwhite men were generally forbidden from owning property or exercising any significant measure of self-determination.

Policy makers and social activists in the 1960s realized that although judicial and legal victories had been achieved on behalf of equal rights, changing statutes was not enough. Beliefs and attitudes change gradually, regardless of the rules of law.

The 1954 landmark U.S. Supreme Court case of Brown v. Board of Education of Topeka declared as unconstitutional the segregationist practice of educating black and white children in separate schools. However, changing the instilled attitudes and behaviors of an entire society was not as straightforward, especially in a nation that since its founding two centuries earlier had accepted the subjugation of women and blacks. Ingrained prejudice continued despite the Brown decision and the equality guarantees of the Civil Rights Act of 1964 and the Voting Rights Act of 1965. During the 1960s and 1970s predominantly white-male colleges and workplaces continued to be the norm.

How Affirmative Action Was Instituted

The term affirmative action first appeared in 1961, in an executive order issued by President John F. Kennedy (1917–1963). In Executive Order 10925, Kennedy declared that federal contractors should take “affirmative action” to ensure that job applicants and employees are treated equally, regardless of “race, creed, color, or national origin.”

Following Kennedy’s lead, in 1965 President Lyndon B. Johnson (1908–1973) issued Executive Order 11246, which reiterated the points made by Kennedy but with a stronger, enforceable directive. Two years later, Johnson added protections against gender discrimination. Articulating his support for equality and affirmative action in a speech to students at Washington, D.C.’s all-black Howard University in 1965, Johnson said:

You do not wipe away the scars of centuries by saying: Now you are free to go where you want, and do as you desire, and choose the leaders you please. You do not take a person who, for years, has been hobbled by chains and liberate him, bring him up to the starting line of a race and then say, “you are free to compete with all the others,” and still justly believe that you have been completely fair. Thus it is not enough just to open the gates of opportunity. All our citizens must have the ability to walk through those gates. This is the next and the more profound stage of the battle for civil rights. We seek not just freedom but opportunity. We seek not just legal equity but human ability, not just equality as a right and a theory but equality as a fact and equality as a result.

In 1969 President Richard M. Nixon (1913–1994) expanded affirmative action in the federal government by establishing goals and timetables for construction-related federal contractors to meet certain diversity benchmarks; compliance became mandatory a year later. In his autobiography, published in 1978, Nixon wrote that his support of such policies stemmed from his belief that “a good job is as basic and important a civil right as a good education.”

For an even wider reach, antidiscrimination language pertaining to private sector hiring was written into the Civil Rights Act of 1964. Enforcement was assigned to the newly created Equal Employment Opportunity Commission.

How Affirmative Action Works

Affirmative action policies encouraged employers and educational institutions to set goals and timetables for increasing diversity within their workforces or student bodies through recruitment efforts and preferential hiring. In theory, affirmative action was not supposed to set quotas (that is, mandate the number of minority or female individuals to be hired or enrolled), nor was it to be used for the benefit of unqualified candidates. In its purest form, affirmative action would, in the case of two identically qualified candidates, give preference to the one from the historically disregarded minority.

So-called weak affirmative action efforts were those that encouraged equal opportunity for all by, for example, banning segregation or discrimination. The more controversial efforts were the “hard” policies, which required that a certain number of positions be set aside (as quotas) specifically for minorities or women.

The Arguments For and Against Affirmative Action

Opponents of such hard affirmative action programs generally believed that past injustices belonged to the past; what mattered was providing equal opportunity going forward, without preferences for any race or gender. Otherwise, they argued, a quota system was being established in which educational and work opportunities were awarded to people based on their gender or race, and not on their academic or career skills. The program to “right past wrongs,” a phrase often used in affirmative action debates, instead created “reverse discrimination,” in which white men were on the losing side.

Supporters of affirmative action argued that minority-friendly policies were necessary in order to level the playing field between whites and people of color and between men and women. Years of oppression cannot be erased and equal opportunity cannot be created if for generations some people—because of their race, ethnicity, gender, or religion—have been denied the educational and vocational experiences available to the more privileged class (specifically white Anglo-Saxon Protestant males). Another argument raised by supporters of affirmative action is that institutionalized racism and violence (such as lynching and other Ku Klux Klan activities) are not in the distant past and, in some cases, are still ongoing. Many of the establishments that practiced and supported racist and sexist policies in the 1960s are still in existence.

The main court case to challenge affirmative action policies was Regents of the University of California v. Bakke (1978). Allan Bakke, a white man, sued the university after twice being refused admission to its medical school; he claimed that less-qualified black applicants had been admitted instead because of the school’s quota for minority students and the preferential treatment they received during the admissions process. In a 5 to 4 decision, the U.S. Supreme Court ruled that while it was illegal for the university to set aside a specific number of places (quotas) for certain classes of people in the absence of proof of past discrimination, minority status could nevertheless be used as one factor in admissions in order to achieve a diverse student body.

When the United States fell into a recession in the 1970s, white men (pejoratively dubbed “angry white men”) asserted that they were losing their jobs to minority hires, despite having seniority and better skills. Some accused black Americans of playing the “race card”—blaming racial discrimination—whenever a black person was denied an opportunity or felt victimized. Using such minority groups as Jews and newly arrived Asian immigrants as examples, others asserted that minorities could achieve success on their own, without mandated special treatment. While women and minorities benefited from the way affirmative action made educational and career opportunities available to them, many bristled at any assumption that they had been hired for their race or gender rather than their skills. Another argument in the debate held that if elementary and high schools provided the same quality of education to all, regardless of race or gender, then by the time young people reached college and the workforce they would be competing on a level playing field, and the need for affirmative action would be moot.

The White House and Affirmative Action

Starting with the presidency of Ronald Reagan (1911–2004) in 1981, Republican leaders have challenged affirmative action policies. In the 1990s President Bill Clinton (1946–), a Democrat, asserted that the job of ending discrimination remained unfinished. “Mend it, but don’t end it,” he said of affirmative action in a 1995 speech at the National Archives.

From its outset, affirmative action was envisioned as a temporary policy, needed just until Americans of all races and genders could operate on a so-called level playing field. In 1996 voters in California decided that four decades of affirmative action had evened that field enough. They approved a ballot measure that declared: “The State shall not discriminate against, or grant preferential treatment to, any individual or group on the basis of race, sex, color, ethnicity, or national origin in the operation of public employment, public education, or public contracting.”

Beginning in 2001, a Republican White House again challenged affirmative action policies, specifically any type of quota or preference standard. In 2003 the administration of President George W. Bush (1946–) joined arguments before the U.S. Supreme Court against affirmative action policies used by the University of Michigan. The Bush administration objected to the school’s race-influenced admission policy, comparing it to a “quota system.” An irony of the Bush White House’s opposition to affirmative action was that two of its top cabinet appointees, the successive secretaries of state Colin Powell (1937–) and Condoleezza Rice (1954–), both of whom are African-American, had acknowledged that affirmative action policies gave them access and opportunities they would have been denied a generation earlier. It was also noted that Bush himself owed his Yale University admission to a quota system giving preferential treatment to the children of Yale graduates. As the son and grandson of Yale alumni, Bush was granted admission to the Ivy League school despite his less than stellar academic record because of the institution’s “legacy” system, which sets aside spots in each year’s class for the children of alumni.

Milestones in the History of Affirmative Action

1961: In Executive Order 10925, President John F. Kennedy (1917–1963) declares that federal contractors may not discriminate in their hiring practices against “any employee or applicant for employment because of race, creed, color, or national origin.” He then says that federal contractors need to use “affirmative action” to ensure that all employees and applicants are treated fairly. Kennedy also creates the Committee on Equal Employment Opportunity.

1964: The Civil Rights Act of 1964 is signed into law, prohibiting racial discrimination. The act also establishes the Equal Employment Opportunity Commission.

1965: In Executive Order 11246, President Lyndon B. Johnson (1908–1973) requires that all government contractors and subcontractors take “affirmative action” to expand job opportunities for minorities, and ensure that minority employees have the same chances for promotions and pay raises as white employees.

1967: Johnson amends his 1965 executive order to include affirmative action for women.

1970: In a regulation called Order No. 4, the U.S. Department of Labor under President Richard M. Nixon (1913–1994) authorizes the use of goals and timetables to correct the “underutilization” of minorities by federal contractors. (In 1971, Order No. 4 is revised to include women.) The rationale of Order No. 4 is that simply advertising job openings is not enough. Employers need to show that they are actually hiring minorities.

1973: To discourage the use of simple “quotas” for minority hiring, the Nixon administration issues goals and timetables for affirmative action hiring by state and local governments.

1978: In Regents of the University of California v. Bakke, the U.S. Supreme Court upholds the use of race as one factor in choosing among qualified applicants for admission, but rules unlawful the practice of reserving a set number of seats in each entering class for disadvantaged minority students.

1985: The Reagan administration is unsuccessful in its efforts to repeal the 1965 Executive Order 11246.

1995: President Bill Clinton (1946–) declares his support for affirmative action programs by announcing his administration’s policy of “Mend it, don’t end it.”

1996: With the ballot measure Proposition 209, California abolishes all public sector affirmative action programs.

1997: The University of California ends affirmative action programs at its graduate schools, disallowing the use of race, gender, ethnicity, or national origin as a factor in admissions decisions.

1998: Voters in Washington State pass “Initiative 200” banning affirmative action in higher education, public contracting, and hiring.

2003: In two cases involving the University of Michigan, the U.S. Supreme Court holds, in Grutter v. Bollinger, that the use of race among other factors in the law school’s admissions is constitutional because the program obtains “an educational benefit that flows from student body diversity.” Nevertheless, in Gratz v. Bollinger, the justices reject an undergraduate admissions program that grants points based on race and ethnicity.

Executive Order 11246

On September 24, 1965, President Lyndon B. Johnson (1908–1973) issued an executive order prohibiting employment discrimination based on race, color, religion, and national origin by organizations receiving federal contracts or subcontracts. In 1967 the order was amended to also prohibit discrimination based on gender. Future affirmative action efforts were based on Johnson’s combined orders.

The contractor will not discriminate against any employee or applicant for employment because of race, color, religion, sex, or national origin. The contractor will take affirmative action to ensure that applicants are employed, and that employees are treated during employment, without regard to their race, color, religion, sex, or national origin. Such action shall include, but not be limited to the following: employment, upgrading, demotion, or transfer; recruitment or recruitment advertising; layoff or termination; rates of pay or other forms of compensation; and selection for training, including apprenticeship.

Bibliography

U.S. Department of Labor, Employment Standards Administration, Office of Federal Contract Compliance Programs, “Executive Order 11246,” Part II, Subpart B, Sec. 202(1), http://www.dol.gov/esa/regs/statutes/ofccp/eo11246.htm (accessed April 17, 2007).

The Cuban Missile Crisis

The placement of Soviet nuclear missiles in Cuba brought the United States and the Soviet Union to the brink of war in the fall of 1962. This showdown, which became known as the Cuban missile crisis, marked one of the most dangerous moments of the Cold War, as the world waited nervously to see if either superpower would back down in order to avoid a full-blown military confrontation.

Imperialist Threat versus Communist Aggression

In the wake of the botched 1961 U.S.-backed Bay of Pigs invasion aimed at toppling the Castro regime, tensions had risen between the United States and Cuba. In the spring of 1962 the Soviet Union, under the leadership of Premier Nikita Khrushchev (1894–1971), initiated plans to secretly deploy nuclear missiles in Cuba. A few months later, Khrushchev began to make public the Soviets’ commitment to protecting Cuba against what they perceived as the “imperialist threat” from the United States. That meant supplying additional arms and personnel, as well as indicating that any American attack on Cuba or on Soviet ships going there could result in war.

During August and September of that year, U.S. intelligence detected an increase in the amount of Soviet military aid arriving on the island, as well as other unusual activity. This included the arrival of sophisticated surface-to-air missile installations, bombers capable of carrying nuclear weapons, and thousands of Soviet technical experts. Politics played a key role in the American response. With Congressional elections approaching, the Republicans sensed an opportunity to pick up seats by attacking Democratic President John F. Kennedy’s (1917–1963) response to the Soviet posturing. Kennedy understood that politically, he could not afford to be perceived by the American public as being weak against communism. He therefore responded in September by warning the Soviets that “the gravest consequences would arise” if offensive Soviet weapons were definitively found in Cuba.

Mounting Evidence

On October 14, a U-2 reconnaissance plane returned from a flight over Cuba with pictures of long, canvas-covered objects. Over the next twenty-four hours, American analysts scrutinized the photographs, and finally determined that the Soviets were installing medium-range ballistic missiles and launch pads in Cuba. Located within 100 miles of Florida, these weapons were easily capable of striking a large portion of the United States. President Kennedy was informed of the discovery on October 16, and he quickly called a meeting of his top security advisers. This group later took on the name Executive Committee of the National Security Council, or ExComm.

ExComm conducted a series of meetings at which they considered several different courses of action. These options included a direct military attack on the Cuban missile sites; an all-out invasion of Cuba; a trade-off with the Soviets in which the United States would remove its own missiles positioned in Turkey; and a blockade of Cuba. Kennedy decided to implement a blockade. He also gave the Soviets an ultimatum, delivered in a nationally televised speech, demanding that they remove the missiles and warning that any missile attack on the United States would be met with a U.S. attack on the Soviet Union.

The blockade was scheduled to go into place on October 24. That day, U.S. strategic nuclear forces were placed on their highest status of alert below actual nuclear war, DEFCON 2, as Americans waited anxiously to see how the Soviet Union would respond to this strategy. On the 25th, U.S. Ambassador to the United Nations (UN) Adlai Stevenson (1900–1965) confronted Soviet representative Valerian Zorin with photographic evidence of the buildup at a televised meeting of the UN Security Council. The Soviets denied everything and accused the United States of pushing the world to the edge of a nuclear exchange.

War Narrowly Averted

Tensions ran high over the next several days, as the two sides engaged in secret dialogues. The crisis peaked on October 27, when an American U-2 surveillance plane was shot down over Cuba by a surface-to-air missile. Meanwhile, U.S. intelligence personnel reported that Soviet embassy officials at UN headquarters in New York City were destroying sensitive documents, which is often a sign that war is about to break out.

Behind the scenes, furious negotiations were taking place in an effort to avert a war. A Soviet intelligence agent contacted the Kennedy administration through an ABC news correspondent indicating that the Soviet government would consider removing its missiles from Cuba only if the United States removed its own missiles from Turkey and promised not to invade Cuba. By the end of the next day, Khrushchev had formally agreed to dismantle the missiles and ship them back to the Soviet Union. Kennedy’s public part of the deal was a pledge not to invade Cuba, and an agreement to lift the blockade as soon as removal of the missiles could be verified. Out of the public eye, Kennedy also agreed to remove the American Jupiter missiles from Turkey.

In the aftermath of the crisis, the Democrats ended up gaining seats in the U.S. Senate and losing only two in the House, a much better result than anticipated, largely on the strength of Kennedy’s handling of the situation. The Republicans, who had initially taken the president to task for his inaction on the budding crisis in Cuba, now accused him of manipulating the crisis in order to build support for Democratic Congressional candidates.

While the Cuban missile crisis ultimately ended peacefully, it served as a startling illustration of how quickly Cold War mistrust could escalate into a nuclear confrontation capable of producing unprecedented destruction. The realization of how close the two superpowers had come to nuclear war led to the installation of a “hotline,” a direct communication link between Washington and Moscow that would enable the leaders of the two nations to communicate directly in the event of another crisis, rather than relying on go-betweens to adequately deliver critical messages in volatile situations. The first hotline was ready for action in August 1963. U.S.-Soviet relations improved in a variety of other ways in the months that followed the Cuban missile crisis. In July 1963 the United States, the Soviet Union, and Great Britain signed a treaty prohibiting aboveground testing of nuclear weapons. The United States also began selling wheat to the Soviet Union around that time. Nevertheless, the arms race between the United States and Soviet Union went on unabated for many more years.

See also Fidel Castro

See also The Cold War

See also John F. Kennedy

The Warren Commission

A week after the assassination of President John F. Kennedy (1917–1963) in November 1963, the newly installed president, Lyndon B. Johnson (1908–1973), issued an executive order creating the U.S. Commission to Report upon the Assassination of President John F. Kennedy, a seven-person panel charged with evaluating “all the facts and circumstances surrounding [the] assassination, including the subsequent violent death of the man charged with the assassination.” Johnson named U.S. Supreme Court Chief Justice Earl Warren (1891–1974) to head the panel, and thus it quickly became best known as the Warren Commission.

The Assassination of JFK

The assassination of Kennedy on November 22, 1963, shocked the nation at a time when the United States was enjoying a period of relative peace and prosperity. Many questions swirled about the circumstances of the assassination. On November 24 the speculation intensified when the suspected assassin, Lee Harvey Oswald, while in the custody of the Dallas police, was gunned down by a local nightclub owner, Jack Ruby. Seeking to calm the nation’s collective jangled nerves, Johnson, freshly sworn in as the thirty-sixth president of the United States, quickly moved to investigate the suspicious events surrounding the murder of his predecessor. On November 29, he issued Executive Order 11130, establishing the Warren Commission. Membership on the commission reflected careful attention to political balance. In addition to Chief Justice Warren, the group included U.S. Senators John Sherman Cooper (a Republican from Kentucky) and Richard Russell (a Democrat from Georgia); Representatives Gerald Ford (a Republican from Michigan) and Hale Boggs (a Democrat from Louisiana); former Central Intelligence Agency (CIA) director Allen Dulles; and John J. McCloy, chair of the Council on Foreign Relations and formerly president of the World Bank and assistant secretary of war.

Proceedings of the Warren Commission began on December 3, 1963. Over the next ten months, the commission scrutinized immense amounts of evidence, including the testimony of 552 witnesses and reports from ten different federal agencies, such as the Federal Bureau of Investigation (FBI), CIA, the State Department, and the Secret Service. The hearings were closed to the public unless the individual testifying specifically requested that the public be invited to attend. Only two witnesses made such a request.

Warren Report

On September 24, 1964, the Warren Commission presented President Johnson with an 888-page final report, officially titled Report of the President’s Commission on the Assassination of President John F. Kennedy but almost universally referred to as the Warren Report. The report was made public three days later. Twenty-six additional volumes of testimony and exhibits were later published as well. The commission concluded unanimously that Oswald had acted alone in assassinating Kennedy, and that Ruby had likewise been a lone vigilante when he in turn killed Oswald two days later. While the conclusion of the commission was unanimous, the report did not, contrary to popular belief, completely dismiss the possibility of a conspiracy. It merely stated that given the information available to the commission, no substantial evidence had been uncovered to suggest that Oswald was acting in concert with others.

In its report, the commission subscribed to the “single bullet theory,” concluding that only one bullet had caused the nonfatal injuries to both Kennedy and Texas Governor John Connally (1917–1993), who was sitting directly in front of Kennedy in the car. According to this view, a second bullet killed the president, and a third shot missed the motorcade entirely. The commission found no connection between the assassination and Oswald’s association with communism and the Soviet Union. They also found no connection between Oswald and Ruby.

Conspiracy Theories Abound

It did not take long for skepticism to bloom in some circles. Critics found numerous flaws in the commission’s reasoning, citing photographs and X-rays from the autopsy and film footage shot by eyewitness Abraham Zapruder as evidence contradicting the official conclusions. Conspiracy theories abounded, and have continued to proliferate ever since. Two books published in 1966—Inquest by Edward Jay Epstein and Rush to Judgment by Mark Lane—charged that the commission had failed to dig deeply enough to uncover the truth with any certainty. The same year, a New Orleans district attorney named Jim Garrison unearthed what he believed to be persuasive evidence of a conspiracy, leading to charges against prominent businessman Clay Shaw. A trial ensued, and Shaw was acquitted in 1969. Film director Oliver Stone later replayed the Garrison-Shaw episode in his 1991 hit movie JFK.

Conspiracy theories and doubts about the accuracy of the Warren Commission’s conclusions have continued to circulate and multiply over the years, though no conclusive evidence has ever emerged that Oswald acted in league with anybody else. One popular theory has it that the assassination was masterminded by Cuban leaders, in retaliation for an alleged plot on Fidel Castro’s (1926–) life orchestrated by the Kennedy administration. Another points the finger at organized crime figures, acting in response to the Kennedy administration’s crackdown on Mafia activities.

The Magic Bullet

One of the most controversial conclusions of the Warren Commission is the contention that a single bullet—separate from the one that actually killed the president—entered John F. Kennedy’s (1917–1963) back, exiting through his neck; hit Texas Governor John Connally (1917–1993) in the back, exiting through his chest; passed through Connally’s wrist, and finally lodged in his leg. According to the theory, the bullet later fell out of Connally’s leg and landed on a stretcher at Parkland Hospital, where it was eventually found. To do this, the bullet had to have passed through 15 layers of clothing, 7 layers of skin, about 15 inches of body tissue, and hit the knot of a necktie, taking out 4 inches of rib and shattering a radius bone along the way. The bullet itself was a 1-inch long, copper-jacketed, lead-core, 6.5-millimeter rifle bullet. It was entered as Warren Commission Exhibit 399, more commonly referred to as CE399.

This “single bullet theory”—derisively dubbed the “magic bullet theory” by skeptics—was central to the commission’s conclusion that Lee Harvey Oswald was the lone gunman. The theory is generally credited to Warren Commission Junior Counsel Arlen Specter, who went on to become a Republican U.S. senator from Pennsylvania. Only three spent shells were found at the site from which Oswald is believed to have shot Kennedy, meaning that only three shots could have been fired in order for the Commission’s theory to be correct. According to the Warren Report, the first shot missed the president’s limousine entirely, the second was the “magic” bullet, and the third struck Kennedy in the head and killed him. While the single bullet theory was plausible enough to sway members of the commission, it contains enough improbable elements to raise doubts in the minds of those inclined to suspect a conspiracy.

For one thing, Connally himself and his wife Nellie were certain that he was struck by a second bullet, separate from the one that hit Kennedy in the back. There are also questions about the trajectory of the bullet. A bullet shot from the sixth floor of a building would have had to negotiate some bizarre turns and angles in order to cause all the damage credited to it. There are also problems with the time line: the amount of time it takes to fire three shots from a bolt-action Mannlicher-Carcano rifle—the type of weapon found at the shooter’s location—does not correspond with the timing of the bullets’ arrival at the limousine.

In spite of the myriad questions raised by the single bullet theory, it has never been decisively debunked.

See also John F. Kennedy

See also Earl Warren

The Vietnam War

The eleven-year conflict in Vietnam (1964–1975) was the United States’ longest and costliest war of the twentieth century, and unquestionably one of the most divisive events in modern American history. The war was responsible for the deaths of about 58,000 Americans and more than 3 million Vietnamese. The United States, operating under the “domino theory”—the fear that allowing one small country to fall into communist hands would set off a chain reaction in neighboring nations—entered the fray with the goal of preventing Communist North Vietnam and the U.S.-backed South Vietnam from being reunified under Communist rule. The United States ultimately failed to achieve its objective, as the North Vietnamese managed to subdue the South and unify the country soon after the withdrawal of U.S. forces in 1973. Along the way, the war bred massive, often violent, dissent inside the United States, creating social scars that took a generation to heal.

War’s Colonial Roots

The roots of the Vietnam conflict can be found in the European colonialism of the previous century. Vietnam, which stretches along the eastern edge of the Indochina peninsula just south of China, became a French colony in the mid-nineteenth century. Resistance to French domination began to grow in the early twentieth century, and a budding independence movement began to emerge in the years following World War I, under the leadership of Ho Chi Minh (1890–1969). During World War II, the Japanese occupied Vietnam, and the French were forced to abandon the colony.

With the defeat of Japan, France assumed that it would resume control of Vietnam and the rest of the territory it called French Indochina, which also included the neighboring countries of Cambodia and Laos. Ho Chi Minh and his nationalist organization, called the Viet Minh, opposed French rule, however, and beginning in 1946 they engaged the French in the First Indochina War. The United States, seeking to squelch the spread of communism in the region, channeled millions of dollars into France’s efforts in Vietnam. That war finally ended when Viet Minh military forces trapped several thousand French troops in a fifty-six-day siege of the French fort at Dien Bien Phu in 1954. France’s surrender at Dien Bien Phu led to peace talks in Geneva, Switzerland, at which the French agreed to withdraw all of their troops from Vietnam. The Geneva agreement also divided Vietnam “temporarily” into separate states in the north and south, with Ho Chi Minh and the Communists in control in the north. National elections to unify Vietnam as a single, independent nation were to be held in 1956.

Instability in South Vietnam

President Dwight D. Eisenhower’s (1890–1969) administration was justifiably worried that Ho Chi Minh would prevail in the election over Ngo Dinh Diem (1901–1963), the U.S.-backed leader of the South Vietnamese government. The U.S. government proceeded to undermine the Vietnamese elections. Diem, with Eisenhower in agreement, argued that free elections in the North were impossible. Elections were held in South Vietnam, and Diem was elected by an overwhelming majority in a vote that was certainly rigged. He quickly declared South Vietnam an independent nation. The United States pledged millions of dollars in financial support to prop up Diem’s fledgling government, and established a military presence in South Vietnam to ensure the stability of the border between North and South Vietnam.

By 1957 Communist guerrillas, known as the Vietcong, had begun an underground campaign to gain control of South Vietnam. In 1959 two U.S. soldiers were killed during a Vietcong attack north of Saigon, the South Vietnamese capital, marking the first American deaths of the Vietnam conflict. The following year, North Vietnam openly acknowledged that it was sponsoring efforts to overthrow Diem’s government and force the United States out of Vietnam altogether. Ho Chi Minh and his allies established a political wing of the Vietcong, calling it the National Front for the Liberation of South Vietnam.

In response, the United States ratcheted up its presence in Vietnam. President John F. Kennedy (1917–1963) increased the number of U.S. “advisers” there from eight hundred when he took office to more than sixteen thousand by 1963. The insurgency in the South continued to intensify, fueled not only by North-sponsored propaganda and violence, but also by Diem’s own heavy-handed, repressive methods, which did more to alienate large swaths of the South Vietnamese population than they did to contain the insurgency. In 1963 Diem was assassinated and his government overthrown by elements of the South Vietnamese military. No fewer than ten unstable governments were put in place in South Vietnam over the next eighteen months.

The Gulf of Tonkin

Meanwhile, American casualties began to reach levels too high to ignore. Forty-five Americans were killed in 1963; the total more than doubled the next year to 118. The year 1964 was a turning point for U.S. involvement in Vietnam. In August of that year, it was reported that two U.S. ships in the Gulf of Tonkin off the coast of North Vietnam were fired upon without provocation by North Vietnamese vessels. In fact, as was later revealed, one of the American ships had not been fired on at all and the other had been supporting a South Vietnamese military operation in North Vietnamese territory. Nevertheless, the event gave President Lyndon B. Johnson (1908–1973) a rationale for dramatically escalating U.S. operations in Vietnam.

In response to the events in the Gulf of Tonkin, at least as they were portrayed at the time, President Johnson ordered a large air attack on North Vietnam’s oil facilities and naval bases, and he sought Congressional approval for the power to run a broad military operation in Southeast Asia. Congress quickly gave him that power in the form of the Gulf of Tonkin Resolution, which authorized the president to take whatever measures were necessary to repel attacks on U.S. forces and prevent further communist aggression in Southeast Asia. Initially, Johnson used this “blank check” to authorize small covert operations by South Vietnamese troops in North Vietnam, and he ordered bombing along the Ho Chi Minh Trail, the name commonly used for North Vietnamese supply routes that snaked through Laos. Undeterred, the Vietcong accelerated their attacks on American outposts, prompting Johnson to commit thirty-five hundred U.S. combat troops to Vietnam on March 8, 1965. That commitment grew to eighty thousand by the end of 1965, as General William Westmoreland convinced Johnson that there were already large numbers of North Vietnamese soldiers operating in the South, and it would therefore take a major commitment of U.S. troops to save South Vietnam.

Over the next year, U.S. involvement in Vietnam escalated into a full-scale military campaign involving every branch of the American armed services. Vietnam presented a new kind of challenge for U.S. forces. The conflict was fought mostly in rural areas covered with dense foliage, making it difficult to identify enemy soldiers, much less kill or capture them. One strategy employed by the American military was the use of defoliants, such as Agent Orange, to remove the natural vegetation that provided cover for enemy troops. Agent Orange was later found to cause cancer and other health problems, and has been blamed for damaging the health of thousands of U.S. service members and an untold number of Vietnamese villagers.

Growing Dissent

As the war dragged on, the American public began to tire of the daily news of dead and injured young people from their communities. By the late 1960s the war had become quite unpopular at home and the antiwar movement had broadened its support. Historians generally point to the 1968 Tet offensive—a coordinated assault by the North Vietnamese on nearly every South Vietnamese city or town with a substantial population—as the turning point in the war. The Tet offensive was named after the holiday celebrating the lunar new year. It was an all-out assault designed to end the war in one devastating blow, premised on the idea that many South Vietnamese city dwellers already sympathized with the Vietcong and would help rather than hinder the attackers when they arrived. In all, eighty-four thousand troops attacked seventy-four towns and cities, catching the American and South Vietnamese armies more or less by surprise.

From a military standpoint, the Tet offensive was a colossal failure. The Vietcong forces were decimated and made no significant strategic gains. In the war of public opinion, however, it was a gigantic success. The attack took a heavy toll on the morale of U.S. forces, and an even heavier toll on the will of the American public to carry on with the seemingly endless effort to keep South Vietnam out of Communist hands—a goal that far fewer Americans considered important than had a few years earlier. The strength of the Tet offensive cast doubt on the Johnson administration’s constant assertions that victory was just around the corner.

Nixon’s Exit Strategy

Dragged down by an unpopular war, the financial cost of which had stymied all of his ambitious domestic plans, Johnson announced that he would not seek reelection. By 1969 there were more than half a million U.S. troops fighting in Southeast Asia. Johnson’s presidential successor, Richard M. Nixon (1913–1994), took a different approach, initiating a policy he called “Vietnamization,” which meant gradually turning over more and more of the day-to-day fighting to the South Vietnamese army. Nixon began to slowly and systematically withdraw U.S. troops from the ground war, while simultaneously expanding the use of air attacks. He launched the most intensive bombing campaign in U.S. military history. Meanwhile, he sought to negotiate a settlement that would end the war in a way that would save face for America.

At home, opposition to the war continued to grow, fueled in part by new reports of atrocities committed by U.S. troops. The most highly publicized was the massacre of hundreds of unarmed civilians, mostly women and children, by U.S. soldiers at the tiny South Vietnamese village of My Lai in March 1968 (though the first reports of this massacre did not reach the media for more than a year). When Nixon sent troops into Cambodia in 1970 to root out Vietcong guerrillas, the opposition at home reached new heights. In January 1973, Nixon and his key adviser Henry Kissinger (1923–) finally managed to negotiate a settlement that allowed the United States to complete its withdrawal from Vietnam without conceding defeat. The government of South Vietnam strongly opposed the American pullout, arguing that it was doomed without the support of U.S. troops. That prediction was correct. In 1975—after Nixon had already been brought down by the Watergate scandal—North Vietnamese forces plowed through the South, capturing the capital city of Saigon in April of that year.

The domino theory that had been the basis for America’s involvement in Vietnam to begin with turned out to be false. Rather than steamrolling across the region, communism ultimately petered out in many countries. The unified Vietnam nationalized some businesses and instituted a number of socialist reforms, but within a decade elements of capitalism began to creep into the nation’s economy. By the mid-1990s, the Vietnamese economy was decidedly mixed, with private ownership of businesses on the rise, and in 1995 full diplomatic relations between the United States and Vietnam were established.

Vietnam Veterans Against the War

In April 1967 a handful of disenchanted American service personnel who had returned to the United States after completing their tours of duty in Vietnam formed an antiwar organization called Vietnam Veterans Against the War (VVAW). After marching together in the Spring Mobilization to End the War demonstration along with over four hundred thousand other protesters, the group decided to assemble like-minded Vietnam veterans into a formal organization. As an organization of people who had seen the horrors of the war firsthand, VVAW added a degree of credibility to the antiwar movement.

In its earliest stages, VVAW’s posture was composed and dignified, as its members sought to distinguish themselves from some of the movement’s more radical elements. Wearing suits and ties, they lobbied legislators on military spending issues. VVAW chapters quickly began to spring up around the country, particularly in the Midwest and on the West Coast. Membership grew slowly but steadily as President Richard M. Nixon (1913–1994) maintained the United States’ involvement in the war during the late 1960s. Events beginning in 1970, such as the deployment of U.S. combat troops in Cambodia and the shooting of students at Kent State University in Ohio by National Guardsmen during an antiwar rally, helped accelerate the organization’s growth. Gradually, VVAW became more aggressive in its tactics, as organization leaders began associating with more radical groups, such as the Black Panthers. Members disrupted meetings and staged guerrilla theater events. Revelations about the My Lai massacre and other atrocities committed by U.S. forces sparked additional interest in VVAW on the part of returning soldiers disturbed by what they had witnessed in Vietnam. At one major VVAW protest in April 1971, hundreds of members marched to Arlington National Cemetery, while others gathered on the steps of the Capitol Building in Washington, D.C., to dramatically throw down the medals and ribbons they had been awarded. Congress convened hearings on the war. One of the speakers at the televised hearings was U.S. Navy veteran and Silver Star recipient John Kerry, a VVAW member who would later become a U.S. senator and the 2004 Democratic presidential nominee.

In addition to their activities aimed at ending the war, VVAW also served as a support group for veterans returning from overseas. The organization formed “rap groups,” which gave veterans the opportunity to talk about their experiences in Vietnam and the problems they were facing readjusting to civilian life. By the end of 1971, membership in VVAW had begun to decline, in spite of a handful of high-profile events orchestrated by the group. More than thirty years later, VVAW was in the news once again. The organization leaped to Kerry’s defense during the 2004 presidential campaign when he was attacked by a group called Swift Boat Veterans for Truth, whose members challenged Kerry’s account of the events that led to his wartime honors.

Legislation, Court Cases, and Trials

The GI Bill

The GI Bill of Rights, usually known simply as the GI Bill, refers to two pieces of legislation from two different eras—the Servicemen’s Readjustment Act of 1944, which offered World War II veterans a comprehensive package of benefits, most notably financial assistance for college and mortgage subsidies; and the Montgomery GI Bill, passed in the 1980s to attract more volunteers into the military in the post-Vietnam era.

As the end of World War II approached, many lawmakers remembered what had happened at the end of World War I. When millions of veterans returned from service overseas in 1918, the United States entered a severe recession that cast many of these servicemen into unemployment or homelessness or both. With about twice as many veterans expected to return after World War II, there was great concern that the impact of their reintegration into the American economy would be even more disruptive than before.

Servicemen’s Readjustment Act

While World War II was still in progress, a number of groups began meeting to consider how to avert the economic problems associated with masses of returning veterans. President Franklin Roosevelt (1882–1945) organized the National Resources Planning Board Conference on Postwar Readjustment of Civilian and Military Personnel, which held its first meeting in July 1942. The following year, the conference issued a report calling for the government to provide veterans with at least a year of education at any level. Meanwhile, another group, the Armed Forces Committee on Postwar Educational Opportunities for Service Personnel—known as the Osborn Committee after its chairman, General Frederick Osborn—was focusing on similar approaches. In the fall of 1943 the American Legion, an organization of U.S. wartime veterans, got involved in the discussion, and began crafting a comprehensive proposal that covered a wide range of needs, including health care, unemployment compensation, education and job training, and home and farm loans.

The American Legion’s proposal became the Servicemen’s Readjustment Act, though the Legion’s public relations department quickly gave it a catchy nickname, the GI Bill of Rights. One of the bill’s most vocal advocates was newspaper magnate William Randolph Hearst (1863–1951). With sponsorship from Hearst’s nationwide media empire, public support for the bill mounted quickly. Both houses of Congress passed the bill unanimously in the spring of 1944, and on June 22 of that year, a little more than two weeks after the decisive D-day invasion at Normandy, Roosevelt signed the GI Bill into law.

The GI Bill offered veterans an impressive array of benefits. Two of the benefits that had a major impact on postwar American society were educational assistance and mortgage subsidies. Veterans were eligible to receive $500 per year for college tuition and other costs, more than adequate for a university education at the time. During 1947, at the peak of GI Bill usage, nearly half of all students on American college campuses were veterans. From the time the Servicemen’s Readjustment Act was enacted in 1944 to its sunset in 1956, about 7.8 million out of the 15 million eligible veterans received education and job training with GI Bill assistance. The education and training component of the program cost the federal government a total of $14.5 billion over the twelve years it lasted. In spite of early concerns that college campuses would quickly be overrun with veterans, the program was highly successful. It brought a financial windfall for colleges and universities. It also enabled many people to attend college who otherwise would not have had the opportunity, raising the nation’s median income and, in turn, tax revenue.

A New Middle Class

The Servicemen’s Readjustment Act’s mortgage subsidy provision likewise had a transforming effect on American society. About 20 percent of the single-family homes built in the United States in the two decades after World War II were financed with GI Bill assistance. This supercharged demand for new housing stimulated the economy, built wealth among the nation’s multiplying middle-class families, and led directly to the development of the country’s suburban areas.

The Montgomery GI Bill, in contrast to the Servicemen’s Readjustment Act, was enacted during a time of peace as a way to make military service more attractive to potential recruits. Unlike World War II veterans, military personnel who served in Vietnam were not generally greeted as heroes upon their return from war. Because the war had become so unpopular, the number of young adults willing to volunteer for the armed forces began to decline once the draft ended in 1973. In response to the sagging number of volunteers, Congressman G. V. (Sonny) Montgomery, a Mississippi Democrat who chaired the House Veterans Affairs Committee, proposed in 1984 a new version of the GI Bill designed to promote military service, even when the United States was not at war. President Ronald Reagan (1911–2004) signed the Montgomery GI Bill into law later that year. Under this plan, participants could choose to have $100 deducted from their pay every month during their first year of service. In return, the government would provide up to $400 per month toward educational expenses for up to three years.

Quonset Huts

The GI Bill had a democratizing effect on U.S. college campuses. With the influx of millions of veterans into the higher education system after World War II, colleges and universities were suddenly no longer seen as gathering places for children of the nation’s elite. While concerns about colleges being swamped with veterans were not borne out in any damaging way, there was the issue of where to house and instruct this new, oversized generation of budding scholars. The answer on many campuses was the use of prefabricated structures such as Quonset huts.

A Quonset hut is a lightweight, prefabricated building made of corrugated steel with an arched top. The structures got their name from the place where they were manufactured, the Davisville Naval Construction Battalion Center at Quonset Point in Rhode Island. Quonset huts were first used by the U.S. Navy in 1941. The navy needed a type of all-purpose structure that could be shipped easily over long distances and assembled on-site by unskilled workers. The navy awarded the contract to make the first Quonset huts to the George A. Fuller construction company; it took only two months from the day the contract was signed for the company to make their first hut.

The inside of a Quonset hut was open, so it could be configured for any of a variety of uses, from barracks to medical facilities to bakeries. During World War II, between 150,000 and 170,000 Quonset huts were manufactured. When the war was over, the government made surplus huts available to the general public for the very reasonable price of $1,000 per hut. Quonsets proved to be an ideal solution to many building challenges. They saw use as grocery stores, restaurants, churches, and, most important, as cheap shelter during the postwar housing crunch. Quonset huts put in place during the post–World War II years to house GI Bill students can still be found on many college campuses across the United States.

The Employment Act of 1946

The Employment Act of 1946 represented the federal government’s attempt to head off a post–World War II depression by taking an active role in promoting full employment during the nation’s transition back to a peacetime economy. The new law created the Council of Economic Advisers to help the president accomplish his administration’s employment goals, and established the Joint Economic Committee in Congress to analyze relevant public policy strategies and proposals.

As World War II wound down, the American public and policy makers began to consider the economic challenges that lay ahead. Eleven million men were about to return from military service, and would have to be reintegrated somehow into the nation’s civilian economy. Meanwhile, the American people had become accustomed to an economy stimulated by wartime production. A slowdown would remind them too much of the Great Depression that had ended just a few years earlier. Some economists were predicting massive unemployment, skeptical about the economy’s ability to absorb so many returning servicemen.

Keynesian Theory

Both major political parties made full employment a cornerstone of their platforms going into the 1944 election season. President Franklin Roosevelt (1882–1945), in his annual message to Congress in January 1944, talked about a new Economic Bill of Rights, which would include the “right to a useful and remunerative job.” Roosevelt’s thinking was strongly influenced by the work of the British economist John Maynard Keynes (1883–1946), whose groundbreaking 1936 book The General Theory of Employment, Interest and Money argued forcefully that governments could pull their nations out of economic depression by actively stimulating the economy through public works spending and other investment, even if it meant running a budget deficit. He rejected the traditional view that capitalism worked best when the government refrained from interfering and let the system make its own adjustments. The full mobilization of U.S. production capacity during World War II had been one of the first real-world applications of what became known as Keynesian theory.

On January 22, 1945, Senator James E. Murray (1876–1961) of Montana, chairman of the influential War Contracts Subcommittee of the Military Affairs Committee, introduced a bill calling for full employment. Murray’s bill promoted the principle that everyone who was “able to work and seeking work” had the right to a job, and it was the federal government’s responsibility to see that this right was upheld.

No Guarantees

The bill passed the Senate easily in nearly its original form, which directed the president to provide full employment when necessary through a program of public investment and expenditures on works programs. In the House of Representatives, however, the bill met with resistance from conservatives, who saw in it the seeds of socialism. They negotiated away some of what they regarded as the more offensive provisions, making sure the bill did not “guarantee” anybody a job or mandate full employment. Congress also got rid of any specific references to public works or other spending on government-created jobs, instead substituting vague language about using “all practicable means … to promote maximum employment, production, and purchasing power.” The version of the law that eventually passed both houses did not mandate any particular action on the part of the federal government to stave off an economic slump. It did direct Congress to set up a Council of Economic Advisers, to consist of three qualified economists, which would help the president produce an annual report on the state of the economy. It also created a joint congressional committee whose assignment was to review that report and make its own recommendations. Most important, the bill articulated the federal government’s commitment to playing a role in maintaining economic stability. It passed both houses of Congress easily, and was signed into law by President Harry Truman (1884–1972) on February 20, 1946.

As it happened, the postwar depression many economists had predicted did not materialize. In fact, in some ways the opposite of what was predicted took place. Instead of surplus production, there were shortages of many goods. Instead of an economic slowdown there was inflation. There was no gigantic wave of unemployment; instead, the U.S. economy entered one of its longest booms in history. Nevertheless, the logic of Keynesian theory became mainstream, and in the many brief recessions that have occurred since passage of the Employment Act of 1946, the federal government has generally reacted with some combination of increasing public works expenditures, lowering interest rates, and cutting taxes, as advocated by Keynes. The Keynesian model did, however, lose some of its luster in the 1970s when the country experienced an unexpected bout of “stagflation,” a previously rare combination of inflation and economic stagnation. This resulted in many conservatives, including President Ronald Reagan (1911–2004), embracing a competing theory known as “supply-side” economics.

Guns, Butter, and the Council of Economic Advisers

The Employment Act of 1946 created the Council of Economic Advisers, a three-member panel that advises the president on economic policy matters. The first chairman of the Council was Edwin G. Nourse, vice president of the Brookings Institution, a prominent Washington, D.C.–based think tank. A few years after its creation, the Council of Economic Advisers had its first quarrel over the classic “guns versus butter” debate. Nourse believed that the government had to make a choice between guns and butter, meaning that the government could invest public resources either in a strong defense or in generous domestic programs, but not both. Another member, Leon Keyserling, disagreed, arguing that in an expanding economy, the country could afford both a strong military and a high standard of living. Their disagreement came to a head in 1949. Keyserling gained the upper hand in the debate by garnering the support of two influential Truman advisers, Clark Clifford (1906–1998) and Dean Acheson (1893–1971). Nourse resigned as chairman of the council, departing with warnings about the hazards of deficit spending. He was succeeded as chairman by none other than Keyserling.

Keyserling’s notion that one could have one’s guns and eat one’s butter too held sway in both parties for the next two decades, before the advent of stagflation in the 1970s cast doubt on the infallibility of the Keynesian approach to economic policy. By the 1980s a ballooning federal budget deficit became a hot-button political issue. Both Democrats and Republicans began to champion balanced budgets and proclaim the evil of deficit spending, though neither party had much success at crafting budgets whose combined spending on guns and butter did not exceed revenue.

Brown v. Board of Education

In its 1954 decision in Brown v. Board of Education of Topeka, Kansas, the U.S. Supreme Court held that school segregation was unconstitutional, effectively ending the “separate but equal” doctrine that had been established in the 1896 case Plessy v. Ferguson.

Background

Brown v. Board of Education did not come about suddenly. It was the result of a long-term effort spearheaded by the National Association for the Advancement of Colored People (NAACP) beginning as early as the 1930s to attack school segregation across the South. Leading the charge was NAACP lawyer and future U.S. Supreme Court Justice Thurgood Marshall (1908–1993). The NAACP coordinated separate lawsuits in Kansas, South Carolina, Virginia, and Delaware, as well as the District of Columbia. The suits charged that forcing children to attend segregated schools violated their right to equal protection under the law as guaranteed by the Fourteenth Amendment to the Constitution.

Oliver Brown was a railroad employee who lived with his family in Topeka, Kansas. The Browns lived near a major railroad switchyard, which the Brown children had to cross every day in order to attend an all-black school. Brown was unhappy that because of their race his children were not allowed to attend another school that was much closer to home.

When his daughter Linda was entering third grade in the fall of 1950, Brown took her to the nearby whites-only school to attempt to enroll her. Not surprisingly, the principal refused to allow her to attend. Brown, who had no history of civil rights activism up to that time, decided to take action, and approached the head of the local branch of the NAACP for help. NAACP lawyers representing Brown filed a suit in the U.S. District Court on March 22, 1951. The District Court decided against Brown, and he filed an appeal.

Meanwhile, parallel suits were being filed in other states. The U.S. Supreme Court decided in 1952 to consolidate the suits into a single case. After hearing initial arguments in June of that year, the Supreme Court put the case on hold, requesting that both sides reargue their cases the following year with specific regard to the original intent and history of the Fourteenth Amendment as it pertained to educational institutions. The NAACP’s Marshall argued before the Court that racially segregated schools were not and could never be made “equal,” suggesting that segregation could be upheld only if the Court were “to find that for some reason Negroes are inferior to all other human beings.”

Unanimous Decision

Marshall’s argument struck a chord with the justices. They decided unanimously against segregation, erasing the half-century-old “separate but equal” doctrine that had been established in Plessy v. Ferguson. Writing for the Court, Earl Warren (1891–1974), who had just replaced Fred Vinson (1890–1953) as chief justice, opined that “in the field of public education the doctrine of ‘separate but equal’ has no place.” While the Court’s ruling meant schools could no longer legally keep the races separate, the decision did not offer a remedy to the situation. In 1955 the Supreme Court issued a supplemental opinion, known as Brown II, which considered the challenges of desegregating the nation’s thousands of segregated school districts. Instead of ordering all schools to instantly desegregate, the Court called on local districts to implement their own desegregation plans in good faith “with all deliberate speed.”

The impact of Brown v. Board of Education was explosive. The Supreme Court had ruled segregation unconstitutional, but attitudes about race do not change overnight. Many school districts across the nation—not just in the South—initially refused to abide by the new reality brought about by Brown. This resistance led to numerous delays in implementing Brown; in fact, even a decade after the decision was handed down, fewer than 1 percent of public schools in the South had been desegregated. In many cities, riots erupted when African-American students tried to enroll in previously all-white schools as the Court had ordered. The standoff reached a boiling point in 1957 in Little Rock, Arkansas, when President Dwight D. Eisenhower (1890–1969) was forced to call in the military to overcome Arkansas governor Orval Faubus’s blockade of the entrance to Central High School. It took at least another decade for school desegregation to truly penetrate a substantial portion of the Deep South.

Brown v. Board of Education marked a monumental turning point for the civil rights movement. Never again would racial segregation in any institution, be it a school district, a business, or other public facilities, be taken for granted. One by one, segregation in other kinds of institutions—from libraries to public restrooms—was shot down by a series of landmark Supreme Court decisions over the years that followed. Demographic shifts in the nation’s major cities made school desegregation increasingly difficult to accomplish in the decades that followed, as white families fled to the suburbs and urban areas themselves grew more segregated. Nevertheless, the principle that the Constitution supports intentional efforts to eradicate segregation has remained a powerful force in social policy.

See also The Civil Rights Movement

See also Thurgood Marshall

The Exclusionary Rule

The exclusionary rule dictates that any evidence obtained by police or other government agents through an illegal search in violation of the Fourth Amendment to the Constitution is inadmissible in court during a criminal trial. The U.S. Supreme Court affirmed the exclusionary rule with regard to federal courts in 1914—though the central issues were raised as early as 1886—but not until 1961 did the Court extend the rule to apply to courts and law enforcement officials at all levels, from federal down to local.

The main purpose of the exclusionary rule is to deter police misconduct. Before the exclusionary rule came into practice, police had little reason to pay attention to the Fourth Amendment, which protects people against unreasonable searches and seizures. Even if the evidence was gathered illegally, it could still be used in criminal trials. The method through which the evidence was obtained did not matter, as long as the evidence itself was relevant to the case. The defendant had no way of stopping the government from using this evidence; he or she could seek a remedy only through other channels, such as a lawsuit against the officers or agents for damages caused by their misconduct.

Weeks v. United States

The Supreme Court addressed this situation in 1914 with its ruling in the case of Weeks v. United States. In that case, a federal agent had conducted a warrantless search of the home of Fremont Weeks, who was suspected of violating gambling laws. The evidence found in the search helped convict Weeks, who then appealed the verdict on the grounds that the Fourth Amendment prohibited the use of evidence obtained via a warrantless search. The Supreme Court agreed with Weeks. His conviction was overturned, and the exclusionary rule was born.

Because the Supreme Court’s decision in Weeks applied only to federal courts, however, state courts were free to continue considering evidence that had been obtained illegally. Moreover, there remained a loophole for federal courts as well. Through what was called the “silver platter doctrine,” a federal court could still legally use illegally secured evidence if the evidence had been obtained by a state or local agent and then subsequently handed over to federal officials on a “silver platter.” So as long as it was not a federal officer who violated the Fourth Amendment, the evidence was still admissible in court.

The Supreme Court revisited the issue in the early 1950s. In Rochin v. California (1952), the Court held that evidence was inadmissible if the manner in which it was obtained was an egregious violation of the Fourth Amendment. In this case, the defendant’s stomach had been pumped in order to retrieve evidence that he had used illegal drugs. This ruling made clear that if officials’ actions were offensive enough, they would be punished by having the evidence thrown out. Even after this ruling, however, states could still use evidence obtained illegally in all but the most wanton instances, and the silver platter doctrine still allowed federal courts to use evidence obtained illegally by state agents regardless of whether the search had violated Fourth Amendment protections. The Supreme Court finally did away with the silver platter doctrine in 1960 with its ruling in the case Elkins v. United States, though a vestige of the doctrine pertaining to evidence seized by foreign officials remained intact.

Mapp v. Ohio

It was not until its 1961 decision in Mapp v. Ohio that the Supreme Court effectively extended the exclusionary rule to apply to all state criminal prosecutions. In that case, Cleveland police suspected that Dollree Mapp was harboring the perpetrator of a firebombing. They arrived at her home and demanded entry, which she refused. The police forced their way in, handcuffed her, and proceeded to search the home without a warrant. They did not find the firebombing suspect, but they did find a pornography collection that allegedly violated obscenity laws. Mapp was arrested, convicted, and sentenced to seven years in prison. After the Ohio Supreme Court upheld her conviction, she appealed to the U.S. Supreme Court. In overturning the conviction, the Supreme Court established that the exclusionary rule applied to criminal proceedings at the state level.

In most cases, the exclusionary rule is invoked by defendants in a hearing before the presiding judge, at which they request to have evidence suppressed on the grounds that it was obtained through an illegal search. Within the exclusionary rule resides what is called the “fruit of the poisonous tree” doctrine, which requires that any evidence obtained on the basis of information or other evidence that was obtained through an illegal search is also inadmissible. In other words, violating the Fourth Amendment in a search “poisons” any evidence that follows from anything discovered in the original breach.

More recently, the exclusionary rule has been diluted and its applications narrowed. In 1984 the Supreme Court established a “good faith” exception to the rule, meaning that evidence may still be admissible if officers acted in the reasonable, good-faith belief that their search was lawful—for example, by relying on a search warrant that later proved to be defective. Subsequent decisions have carved out a variety of other exceptions, as the balance between individuals’ right to privacy and the investigative needs of law enforcement continues to evolve.

The Clean Air Acts

As public awareness of environmental issues rose sharply in the 1960s, a series of laws was passed that promoted air pollution research, regulated vehicle and factory emissions, and created standards for air quality. Current U.S. policy regarding efforts to control air pollution is grounded in the Clean Air Act of 1970, as updated through major amendments in 1977 and 1990. Nevertheless, Congress has been passing laws aimed at protecting air quality since the 1950s, and local efforts at tackling air pollution go back as far as the late nineteenth century.

Some of the nation’s earliest efforts to control air pollution include laws passed in 1881 in Chicago, Illinois, and Cincinnati, Ohio, to curb industrial smoke and soot. Other cities followed suit, and such laws were soon common in major industrial hubs. The first statewide air pollution control program was launched in Oregon in 1952. The federal government became involved in nationwide air quality control for the first time three years later, when Congress passed the Air Pollution Control Act of 1955. The act did little to improve air quality directly, but it granted $5 million a year for five years to the U.S. Public Health Service to conduct research on the problem. Congress extended the program for another four years in 1960.

Clean Air Act of 1963

The first version of the Clean Air Act was enacted in 1963. This act provided a permanent stream of federal funding—initially $95 million over three years—for pollution research and support for states to set up their own pollution control agencies. The Clean Air Act of 1963 also included language recognizing the hazards of exhaust fumes from automobiles, and called for the development of auto emissions standards for the first time. In addition, the act encouraged research on new technologies for removing sulfur from high-sulfur fuels, whose use was degrading air quality, and assigned the federal government responsibility for addressing interstate pollution problems.

While passage of the Clean Air Act was an important step, guidelines developed by the U.S. Department of Health, Education, and Welfare (HEW) pertaining to motor vehicle emissions were still merely advisory; state and local agencies were not obligated to enforce them. In 1964 HEW published a report called Steps toward Clean Air, which made a strong case for mandatory emissions standards. A new Senate Subcommittee on Air and Water Pollution was created, with Democratic Senator Edmund Muskie (1914–1996) of Maine as its chair. With public awareness of the problem growing quickly, the first amendments to the Clean Air Act were passed in 1965, in the form of the Motor Vehicle Air Pollution Control Act. This act established nationwide motor vehicle emissions standards, to be administered by HEW; the Environmental Protection Agency (EPA) would not come into existence for another five years. Reactions to the Motor Vehicle Air Pollution Control Act were mixed. On the one hand, auto companies were predictably horrified, because they would for the first time be required to adapt their technology to meet environmental mandates. Consumer advocate Ralph Nader (1934–), on the other hand, criticized the new law as being too lenient.

Another batch of amendments to the Clean Air Act was passed in 1967. This legislation, called the Air Quality Act, divided the nation into Air Quality Control Regions for monitoring purposes. It also set a timetable for states to develop “state implementation plans” for meeting new standards for emissions from stationary sources, such as factories and fuel-processing facilities.

Clean Air Act of 1970

The Clean Air Act of 1970 represented a complete rewriting of the air pollution control laws that had been passed during the previous decade. It dramatically changed the nation’s approach to addressing the problem of air pollution, and established the backbone of the air pollution control systems that have remained in place ever since. Championed by Senator Muskie and President Richard M. Nixon (1913–1994), the Clean Air Act of 1970 shifted primary responsibility for air quality control from the states to the federal government, though states were charged with monitoring and enforcing compliance.

There were four key components to the regulatory structure established by the 1970 act. First, it created National Ambient Air Quality Standards for six major categories of pollutants. Second, the newly created EPA was to set New Source Performance Standards for determining how much pollution new plants would be allowed to produce. Third, it instituted new motor vehicle emissions standards. Finally, it required states to produce implementation plans, to be approved by the EPA, outlining how they would go about meeting all these new federal guidelines.

The next round of substantial amendments to the Clean Air Act took place in 1977. One main purpose of the 1977 amendments was to address the widespread failure of industries, including automobile manufacturers, to meet the deadlines set out in the 1970 act. These deadlines were extended, but other, more stringent standards were put into place as well. One new feature was the addition of a program for prevention of significant deterioration of air that was already clean, such as in some national parks.

No substantial changes in the Clean Air Act were made during the 1980s, as the administration of President Ronald Reagan (1911–2004) opposed any strengthening of environmental regulations that might negatively impact the growth of American industry. In 1990, however, a new version of the Clean Air Act was passed. The Clean Air Act of 1990 addressed some new topics that had come to the fore in the years since the last round of amendments, including acid rain and ozone layer depletion. In 2003 Republicans in the House and Senate introduced the Clear Skies Act, which would weaken some of the Clean Air Act’s standards. The Clear Skies Act failed to garner enough support to pass through Congress; even without new legislation, however, some of its features were implemented administratively through the EPA.

Sex Discrimination in Employment: Title VII

While the Civil Rights Act of 1964 is best known for its impact on racial discrimination, Title VII of the act also included language prohibiting employment discrimination on the basis of gender. This legislation marked a critical step forward in the movement toward equal opportunity for women in the United States.

For generations, men had traditionally been the primary breadwinners in American families, while women were in charge of the home and children. World War II changed the employment landscape for women, however, as they moved into the workplace to perform jobs formerly held by the men who were now overseas fighting the war. During the war, the National War Labor Board recommended that women be paid the same wages as men. However, equal compensation was a completely voluntary policy, and most employers did not comply.

Equal Pay Act

During the early 1960s President John F. Kennedy (1917–1963) and his associates had worked hard to produce a sweeping bill that would outlaw once and for all various forms of employment discrimination. One result was the Equal Pay Act of 1963, which amended the Fair Labor Standards Act of 1938. The Equal Pay Act essentially made it illegal to pay women less than men for doing the same work.

The Civil Rights Act of 1964 added a number of protections against gender-based employment discrimination, and the presence of sex as a protected class in Title VII of the act played an interesting role in the law’s passage. As originally drafted, Title VII prohibited employment discrimination on the basis of race, color, religion, and national origin. It did not cover gender discrimination.

In February 1964 Congressman Howard Smith, the powerful Virginia Democrat who chaired the House Rules Committee, offered an amendment to the bill that added sex as a prohibited basis for employment discrimination. Many observers were baffled, as Smith had been a longtime opponent of civil rights legislation. Some supporters of the bill believed he was trying to sabotage it by adding language about sex discrimination in order to smash apart the bipartisan consensus required to pass the bill. The bill’s sponsors opposed Smith’s amendment, thinking the bill should focus exclusively on matters of race. Labor unions, many of which had a history of discrimination against women, were not happy about the amendment either.

Possible Poison Pill

Smith may not have had an ulterior motive. While he was no friend to the civil rights movement historically, he had a longstanding alliance with feminist leader Alice Paul (1884–1977), and had always been a supporter of the Equal Rights Amendment. He recognized the Civil Rights Act as a potential vehicle for advancing some of the women’s rights he believed in. Regardless of Smith’s motives, it is clear that many other Southern Democrats viewed Smith’s amendment as an opportunity to scuttle the bill; most of them supported adoption of the amendment. The amendment was adopted on a vote of 164 to 133.

Most of the Democrats from the South who had supported Smith’s amendment ended up voting against the full bill, which would seem to confirm that they had hoped the sex discrimination provision would be a “poison pill.” The bill passed anyway, and the Civil Rights Act, with Title VII expanded to include gender discrimination, was enacted on July 2, 1964. Title VII empowered the Equal Employment Opportunity Commission to enforce the act’s prohibitions against employment discrimination.

In the years since passage of the Civil Rights Act, Title VII has been expanded a few times. In 1986 the U.S. Supreme Court ruled in the case Meritor Savings Bank v. Vinson that women had the right to protection from a “hostile work environment.” Other expansions protected women from discrimination based on pregnancy and broadened the definition of sexual harassment in the workplace.

Griswold v. Connecticut

The 1965 U.S. Supreme Court case Griswold v. Connecticut affirmed married couples’ right to privacy in the bedroom. Specifically, the case determined that a Connecticut law preventing married couples from using birth control was unconstitutional. As a result, all remaining state laws prohibiting the use of contraceptives by married couples were struck down. The ruling also set the stage for other landmark sexual privacy rulings over the following decade, including Eisenstadt v. Baird (1972), which upheld the right of unmarried people to use birth control, and Roe v. Wade (1973), which legalized abortion nationwide.

Background

Connecticut passed a law in 1879 making it illegal to use any kind of birth control drug or device, regardless of whether the use was by a married couple. In addition it became illegal to provide medical advice or information having to do with birth control. A number of other states passed similar laws around that time. By the middle of the twentieth century, the law was hopelessly out of date and rarely enforced. It nevertheless remained on the books in spite of its unpopularity. In 1942 the Planned Parenthood League of Connecticut, a group involved in public education about birth control, attempted to challenge the law in the U.S. Supreme Court. In that case, the appellant was a doctor, and the Court ruled that he did not have standing to sue, as he himself was not harmed by his inability to dispense, or advise patients about, birth control.

Another attempt to bring down the law was made in 1961, when a group of women brought suit. This time, the Court refused to decide the case (Poe v. Ullman) on the grounds that it was never enforced, calling it “dead words” and “harmless empty shadows.” Not all members of the Court felt this way, however. In his dissenting opinion, Justice John Marshall Harlan (1899–1971) wrote that the law represented an “unjustifiable invasion of privacy,” and should therefore be struck down.

In November 1961, four months after the Supreme Court’s nondecision in the Poe case, Estelle Griswold, executive director of the Planned Parenthood League of Connecticut, and Dr. Charles Lee Buxton, chairman of Yale University’s obstetrics department, opened a birth control clinic in New Haven, Connecticut. They contended that by declaring the law dead, the Supreme Court’s language in the Poe case made it legal for doctors in Connecticut to prescribe birth control. However, nine days later, their clinic was closed and the two were arrested.

Conviction Leads to Test Case

Both Griswold and Buxton were convicted at trial in the Sixth Connecticut Circuit Court, in spite of defense attorney Catherine Roraback’s argument that the Connecticut birth control law violated her clients’ constitutionally protected right to freedom of speech. They were each fined $100. The convictions were upheld in both the Appellate Division of the Sixth Connecticut Circuit Court and the State Supreme Court of Errors. In both cases, the courts held that it was the legislature’s job, not theirs, to change a bad law. This set the stage for the case’s arrival on the U.S. Supreme Court docket.

Oral arguments before the Supreme Court began on March 29, 1965. Attorney Thomas Emerson, a Yale law professor, argued on behalf of Griswold and Buxton that Connecticut’s birth control law deprived his clients of their First Amendment right to free speech, as well as their Fourteenth Amendment right to liberty, which could not be abridged without “due process of law,” and their right to privacy as guaranteed by the Ninth Amendment. Emerson also asserted that the Connecticut law was based on a moral judgment—the notion that the use of contraceptives is immoral even within a marital relationship—that did not “conform to current community standards.”

Thomas Clark, the attorney for the State of Connecticut, was repeatedly called on to explain the purpose of the law. He maintained that the law was put into place in order to deter sex outside of marriage. He found it difficult, however, to explain (1) why then the law should apply to married couples; and (2) why the law was necessary when there were already laws on the books prohibiting fornication and adultery.

Supreme Court Reversal

The Supreme Court voted 7–2 to reverse the convictions of Griswold and Buxton, striking down the 1879 Connecticut law in the process. In his majority opinion, Justice William O. Douglas (1898–1980) wrote that enforcing laws such as the Connecticut birth control ban represented a gross violation of the right to privacy, presumably guaranteed by the Ninth Amendment, which reads: “The enumeration in the Constitution, of certain rights, shall not be construed to deny or disparage others retained by the people.” In other words, just because the Constitution does not specifically mention the right to use birth control in the privacy of one’s own home does not mean the government may search a married couple’s bedroom for evidence of contraceptives whenever it sees fit.

The Griswold case marked a significant change in the way the Ninth Amendment had historically been interpreted. Prior to Griswold, it had usually been interpreted to mean that any right not specifically granted to the federal government by the Constitution fell by default into the domain of state government. The interpretation articulated by Douglas was more literal, reserving such rights “to the people” as per the actual language of the amendment.

This expanded interpretation of the right to privacy laid the groundwork for two important challenges over the next several years to state laws restricting people’s reproductive behavior. In 1972 the Supreme Court ruled in the case Eisenstadt v. Baird that single people had the right to buy and use contraceptives. Writing for the majority, Justice William Brennan (1906–1997) pointed to the line of reasoning first outlined in Griswold as the basis for the decision. If the Ninth Amendment protected the privacy of married people, he noted, then it should also apply to nonmarried people, because it is actually the individuals making up the marriage with whom the right resides. Brennan opined that being married is not a prerequisite to freedom from “unwarranted governmental intrusion into matters so fundamentally affecting a person as the decision whether to bear or beget a child.”

The following year, similar reasoning was used to extend this right of privacy to include a woman’s right to choose to terminate a pregnancy in the controversial case Roe v. Wade.

Planned Parenthood of Connecticut

Planned Parenthood of Connecticut was founded in 1923 by Mrs. Thomas N. Hepburn—mother of the actress Katharine Hepburn—and a few of her friends. After attending a speaking engagement in Hartford, Connecticut, that year by birth control advocate and Planned Parenthood pioneer Margaret Sanger (1879–1966), the group decided to launch the Connecticut Birth Control League. One of their main objectives was to get their state’s law banning the use of and dissemination of information about birth control overturned. Connecticut’s law was one of several state laws collectively known as Comstock Laws, named after Anthony Comstock (1844–1915), who had gotten similar laws passed in several states in the 1870s.

In spite of the group’s efforts and the growth of their organization, their nine attempts to get the state legislature to change the law between 1923 and 1931 all failed. Nevertheless, the Birth Control League opened the state’s first birth control center in Hartford in 1935, and by the end of that decade had opened clinics in nine other cities as well. The clinic in Waterbury, Connecticut, was raided by police six months after it opened, and two doctors and a nurse were arrested. Their defense attorneys argued that the state’s Comstock Law was unconstitutional, but the Connecticut Supreme Court upheld the law in its 1940 ruling. The league continued to attack the law through litigation, but was unsuccessful until Griswold v. Connecticut presented an ideal test case with which to directly challenge the law.

By the time the Griswold case was decided, Planned Parenthood of Connecticut was serving about 300 women a year. In 2004 the organization provided medical services to more than 58,000 individuals, and provided information to over 16,000 people through its public education and outreach programs.

New York Times v. Sullivan

In its ruling on the landmark case New York Times v. Sullivan, the U.S. Supreme Court changed the rules regarding what constituted libel in statements made by the media about public figures. Prior to Sullivan, standards for libel cases were determined at the state level. False statements about public figures were not generally considered to be free speech protected by the First Amendment to the Constitution, but each state was free to interpret libel laws as its leaders saw fit. This case limited states’ authority to award libel damages, putting in place a national standard requiring the presence of “actual malice” for determining libel cases that involved public figures. In granting new protections to the press, Sullivan fundamentally changed the way the U.S. media dealt with controversial issues. Because they no longer had to fear a costly libel suit for inadvertently getting a fact wrong about a public figure, newspapers and other media outlets were able to go after perceived wrongdoers more aggressively than in the past.

Background

The events leading to Sullivan took place in the context of the civil rights movement. In 1960 Martin Luther King Jr. (1929–1968) and other civil rights leaders were engaged in a series of antisegregation protests in Montgomery, Alabama. Officials in Montgomery reacted strongly to the protests, and acted aggressively to thwart them. In March of that year, a group calling itself the Committee to Defend Martin Luther King and the Struggle for Freedom in the South took out a full-page advertisement in the New York Times that asked readers to contribute money to support their civil rights efforts. The ad, running under the headline “Heed Their Rising Voices” and signed by sixty-four prominent political, religious, and artistic leaders, was published in the March 29 edition of the Times, of which six hundred thousand copies were printed.

The ad claimed that state and local officials in Alabama had responded with a “wave of terror” to peaceful demonstrations by thousands of African-American students in the South. Events supporting this charge were described, but the ad did not mention any particular public official by name.

When Montgomery city commissioner L. B. Sullivan, who supervised the city’s police department, heard about the ad, he was outraged. On April 19 he filed a libel suit against the Times, claiming that the ad’s reference to “Southern violators of the Constitution” in Montgomery had unjustly defamed him. His suit asked for $500,000 in damages. During the trial, Sullivan was able to establish that the ad contained a number of inaccurate statements about the events that had taken place. The judge instructed the jury that under Alabama law, it did not matter whether Sullivan had suffered any financial loss, and that if the statements were found to be libelous, then malice was presumed to be present. Therefore, the jury really had to decide only whether the ad actually concerned Sullivan, which was not clear because he was not specifically mentioned in it.

Circuit Court Ruling

On November 3, the circuit court in Montgomery ruled in favor of Sullivan, and awarded him the full $500,000. The Alabama Supreme Court upheld this decision in 1962, applying an extremely broad definition of libel. The court wrote: “Where the words published tend to injure a person libeled by them in his reputation, profession, trade or business, or charge him with an indictable offense, or tends to bring the individual into public contempt [they] are libelous per se.… We hold that the matter complained of is, under the above doctrine, libelous per se.”

The Times appealed the Alabama Supreme Court’s ruling, and the case moved up to the U.S. Supreme Court. In unanimously overturning the Alabama court’s decision, the Supreme Court on March 9, 1964, established a totally new standard for libel in cases concerning public officials. In his opinion, Justice William J. Brennan (1906–1997) wrote: “We hold that the rule of law applied by the Alabama courts is constitutionally deficient for failure to provide the safeguards for freedom of speech and of the press that are required by the First [Amendment] in a libel action brought by a public official against critics of his official conduct.” Brennan went on to write that in order to succeed in such a libel case, the public official must prove that the statement was made with “actual malice,” defining actual malice as “knowledge that it was false or with reckless disregard of whether it was false or not.” Clearly, Sullivan had not proven that the Times acted with actual malice as so defined.

This new libel standard, applicable to every state, placed a much heavier burden on public officials seeking to sue for libel. The Supreme Court’s position was that the press must be free to criticize the actions of officials related to controversial matters—such as civil rights—of great importance to the public, without fear of facing a suit just for getting a few minor facts wrong.

A Victory for Freedom of Press

The Supreme Court’s decision in New York Times v. Sullivan marked a major shift in the law’s attitude toward the press. For the first time, the law was willing to look the other way on certain types of falsehoods published in the press, on the grounds that the importance of the free exchange of ideas far outweighed the damage caused by inevitable, inadvertent errors. Absent a showing of “actual malice,” public officials cannot recover damages from those who publish false statements about them.

While the Court’s decision was unanimous, not all of the justices agreed entirely with Brennan’s thoughts on the case. Justices Hugo Black (1886–1971) and William O. Douglas (1898–1980), in separate concurring opinions, questioned whether the press should ever be held liable for defaming public officials. Their interpretation of the freedom of the press provisions in the First Amendment led to the conclusion that the press must be absolutely immune from liability for criticizing how public officials perform their duties.

In the years following the Sullivan decision, a sequence of rulings in other cases helped resolve such lingering questions as who qualifies as a public official, what constitutes official conduct as opposed to private conduct, and where the line should be drawn regarding the right of public figures to keep personal information private. Throughout this line of cases, the Court made sure the new libel rules were applied effectively, and issued rulings that gave the press even more latitude. In Rosenblatt v. Baer (1966), the Court extended the Sullivan rule to public officials of lesser stature, and the following year, in Associated Press v. Walker, the rule was further stretched to cover not just public officials but public figures outside the realm of government service. Over the years, the position on libel staked out by Sullivan has come under criticism from both directions. Public officials have derided the standard for making it too difficult to recover damages for libel, while the news media have argued that the ruling did not go far enough, leaving them subject to long, costly litigation even though they usually win.

See also William Brennan

The Food Stamp Act of 1964

The Food Stamp Act of 1964 created a permanent food stamp program in the United States that assists poor families with food purchases, while also supporting the nation’s farmers by boosting consumption of agricultural products. The Food Stamp Act was a central piece of President Lyndon B. Johnson’s (1908–1973) social agenda known as the “Great Society.” The stated goals of the act were to “strengthen the agricultural economy; to help achieve a fuller and more effective use of food abundances; [and] to provide for improved levels of nutrition among low-income households through a cooperative Federal-State program of food assistance to be operated through normal channels of trade.”

Food Stamp Origins

The origins of the food stamp program were in the Great Depression, as the federal government sought to counter overproduction by raising the amount of agricultural goods Americans consumed. The U.S. Department of Agriculture (USDA) saw the program as a way to address two crucial problems—low demand for farm products and increasing hunger among the poor and unemployed—with a single government initiative. The first food stamp program was launched in Rochester, New York, in 1939. It spread to about fifteen hundred other counties before the economic upswing triggered by World War II made the program less necessary.

Over the next twenty years, various advocates proposed reviving the food stamp program, but the federal government made no move to do so. The idea finally took hold in 1960. While campaigning in parts of the poverty-stricken Appalachian Mountains that year, John F. Kennedy (1917–1963) witnessed firsthand the prevalence of severe hunger in the region. After he was elected president, Kennedy instructed the USDA to establish food stamp pilot programs. The first of these test programs was set up in McDowell County, West Virginia, in May 1961.

The idea behind food stamps is to create a mechanism—namely the stamps—for transferring surplus food grown by farmers to the people who need the food but cannot afford it. Rather than subsidizing farmers directly, food stamps provide a way to increase demand for food at stores, which in turn translates into increased purchases of crops from farmers. The food stamp pilot programs were very successful in achieving this objective.

When Johnson inherited the presidency after the assassination of Kennedy, he prepared an ambitious agenda of social programs he referred to collectively as the Great Society. Johnson’s Great Society initiative encompassed a range of programs designed to solve the problem of poverty. He aimed to improve access to health care, job training, education, and a variety of other services crucial to disadvantaged families. Johnson made food stamps one of his Great Society programs. The Food Stamp Act, which would make food stamps a permanent, nationwide program, was introduced into Congress with strong bipartisan support, as well as the backing of the USDA, the National Farmers Union, and a host of poverty groups. It was enacted on August 31, 1964.

Federal-State Partnership

Like the Depression-era version of the program, the new food stamp program had two constituencies: poor people and farmers. The program was a collaborative effort between federal and state governments. The USDA distributed the stamps through state welfare offices, and recipients could then spend them almost like cash at their local grocery stores (certain items, such as alcohol, were excluded). The federal government was responsible for reimbursing stores for the value of the stamps, while state agencies handled eligibility determination and distribution. The cost of administering the program was shared between the two levels of government.

In the years following passage of the Food Stamp Act, the program evolved. For one thing, poverty was becoming more of an urban problem. The USDA adapted to the changing demographics of hunger by transforming food stamps from a way of disposing of the nation’s agricultural surplus while feeding the hungry to more of a straightforward welfare program. Eligibility requirements were loosened, and benefits increased. The Food Stamp Reform Bill of 1970 solidified and standardized these changes in eligibility rules and nutritional standards. Additional legislation in 1973 made food stamps an entitlement program, meaning states must offer it to anybody who is eligible. As the 1970s continued, however, the food stamp program, and many other assistance programs, came under increased criticism for being lax in their detection of fraud and abuse. The Food Stamp Act of 1977 tightened eligibility standards and established more rigid guidelines for administering the program.

Criticism of assistance programs grew more insistent during the 1980s, as the administration of President Ronald Reagan (1911–2004) sought to reduce federal spending on assistance for the poor. Concerns were raised that easy access to benefits was functioning as a disincentive to work and was attracting illegal immigrants. This growing hostility culminated in the 1996 passage of the Personal Responsibility and Work Opportunity Reconciliation Act (PRWORA), sometimes referred to as the Welfare Reform Act. While food stamps remained an entitlement program, PRWORA ushered in an assortment of policies designed to shrink enrollment in public assistance programs, shifting the burden of aiding families in crisis from government agencies to private charities and food banks.

See also Lyndon B. Johnson

Food Stamp Fraud

Since the food stamp program was first implemented, policy makers have always been concerned about fraud and abuse. In the decades since the Food Stamp Act was passed, Congress has tweaked the rules on a number of occasions to make it harder for people to cheat the program. In reality, only a tiny percentage of food stamp recipients have engaged in fraud or abuse. However, when it comes to entitlement programs, lawmakers and the media have always tended to play up such activities when they are uncovered.

There are two basic kinds of fraud that people engage in with regard to the food stamp program: providing false information when applying for the program in order to cheat on the eligibility restrictions; and trafficking in food stamp coupons for financial gain. The first type of cheating is relatively straightforward to address by requiring people to document their information more thoroughly when applying for the program. However, some advocates who work on behalf of the poor are against making it more burdensome to apply for food stamps. They argue that many people who need financial assistance have limited education and poor literacy skills, and that making the application process more complicated will prevent some eligible families from receiving assistance that they desperately need. According to this view, because the incidence of fraud is fairly low, it is better to allow a few cases of fraud to go undetected than to create unnecessary barriers to enrollment for those who legitimately qualify.

The other type of abuse is more difficult to address. Some stores illegally allow people to use food stamps for nonfood purchases, in violation of program rules. These stores usually give the customer only partial value for their food stamps. For example, they might allow somebody to use $5 in food stamps to cover $3 toward the purchase of laundry detergent or magazines. According to the federal government’s Food and Nutrition Service (FNS), this sort of practice is much more common in small stores than in large ones. While only about 15 percent of food stamps are redeemed in small grocery stores, those stores accounted for about $190 million in food stamp fraud in 2005.

One recent development that is helping to eliminate this type of abuse is the replacement of paper food stamp coupons with an electronic benefits transfer (EBT) system, which works a lot like bank-issued debit cards. EBT has a lot of advantages. Besides helping to cut back on fraud, EBT also reduces the problem of food stamp theft, because a personal identification number is needed to use an EBT card. According to the FNS, about $241 million in food stamp benefits were stolen in 2005.

Heart of Atlanta Motel v. United States

The U.S. Supreme Court’s decision in Heart of Atlanta Motel v. United States—the first major test of the Civil Rights Act of 1964—advanced the cause of the civil rights movement by affirming that the federal government’s power to regulate interstate commerce could be used to combat racial discrimination. This landmark case upheld the constitutionality of Title II of the act, which guaranteed minorities full access to places of public accommodation.

Civil Rights Act Challenged

The Heart of Atlanta Motel was a 216-room facility in downtown Atlanta, Georgia. Prior to passage of the Civil Rights Act of 1964, the Heart of Atlanta consistently refused to rent rooms to African-Americans. The motel’s owners hoped to continue their whites-only policy, so they filed a lawsuit in the U.S. District Court for the Northern District of Georgia, in which they contended that Title II of the act violated their constitutional rights. Heart of Atlanta also filed for an injunction to prevent the government from enforcing the provision dealing with public accommodations.

In the suit, lawyers for the Heart of Atlanta made several arguments. They claimed above all that Congress had overstepped its power under the Commerce Clause—which gives Congress the ability to regulate interstate commerce—by seeking to regulate local private businesses such as their motel. They also argued that the act violated the Fifth Amendment, which prohibits the taking of liberty and property without due process of law, by depriving the motel’s operators of their right to choose customers as they saw fit.

The federal government countersued, asking that the act be enforced and the motel be forced to stop discriminating against African-Americans. Lawyers for the government contended that refusing to provide accommodations to African-Americans interfered with interstate commerce by making it harder for them to travel on business. Noting that some three-quarters of the motel’s clients were from out of state, the government argued that this situation was clearly covered by the Commerce Clause.

Commerce Clause

The District Court ruled against the Heart of Atlanta, and ordered the operators to stop denying service to African-Americans solely on the basis of their race. The motel appealed the decision, and the U.S. Supreme Court agreed to hear the case. In December 1964 the Supreme Court ruled unanimously against the Heart of Atlanta, upholding the constitutionality of Title II of the Civil Rights Act. In his opinion, Justice Tom C. Clark (1899–1977) wrote, based on transcripts of Congressional debate over the Civil Rights Act, that the discrimination African-Americans faced when they tried to find accommodations clearly “had the effect of discouraging travel on the part of a substantial portion of the Negro community.” The evidence, he continued, was “overwhelming … that discrimination by hotels and motels impedes interstate travel” which in turn impedes interstate commerce. He noted that the authority of Congress under the Commerce Clause of the Constitution to intervene in order to keep interstate commerce flowing freely had been long established.

Clark further argued that the Commerce Clause allowed Congress to regulate not only interstate commerce but also commerce within a state if that commercial activity had a harmful effect on interstate commerce. However “local” the Heart of Atlanta’s discriminatory policies seemed, they were therefore fair game for federal regulation if they stopped African-Americans from traveling for purposes of interstate commerce.

In the aftermath of the Supreme Court’s decision in Heart of Atlanta Motel v. United States, the Commerce Clause became an important weapon for the federal government to wield in its efforts to eliminate racial discrimination nationwide. In another case decided the same day, Katzenbach v. McClung, the Court gave a similar rationale in deciding against a small restaurant that did not serve African-Americans. According to the Court, because the restaurant purchased some of its food and supplies from out of state, it was engaging in interstate commerce, and was therefore subject to the Commerce Clause. That most of its customers were local was deemed irrelevant.

See also The Civil Rights Movement

Medicare and Medicaid

As part of his Great Society package of domestic programs, President Lyndon B. Johnson (1908–1973) signed into law two important government health-care programs in 1965: Medicare, which provides health insurance coverage for the elderly and people with disabilities; and Medicaid, which provides coverage for the poor.

Origins

The seeds of government-sponsored health insurance in America were planted during the Great Depression. Historically in the United States, people have obtained health insurance through their employers. During the Depression, when a large percentage of Americans were unemployed, President Franklin Roosevelt’s (1882–1945) administration developed New Deal programs to help people cover their medical costs. The nonprofit Blue Cross and Blue Shield programs were created in the 1930s as the health insurers of last resort. During World War II, many employers improved their employee insurance plans as a worker retention strategy, because fringe benefits were exempt from wartime wage freezes. Nevertheless, those without insurance through an employer or labor union, such as retirees, were often out of luck.

The problems of the uninsured became more acute after World War II, as medical breakthroughs and advanced technology drove up the cost of health care. Health-care costs more than doubled during the 1950s. By 1960 it was estimated that the average senior citizen in the United States was spending 15 percent of his or her income on health care. That year, Congress passed the Kerr-Mills Act, named for Senator Robert Kerr (D-OK) and Congressman Wilbur Mills (D-AR), which provided federal grants for state-run programs that subsidized health-care expenditures for elderly persons who could not afford them on their own. Few states ended up participating in the Kerr-Mills program because of its unfavorable cost-sharing system, but it did spark the dialogue that would lead to more comprehensive reform a few years later.

In 1961 Congressman Cecil R. King (D-CA) and Senator Clinton Anderson (D-NM) introduced a proposal for a program, which they called Medicare, that would provide hospitalization coverage for Social Security recipients. The program would be funded through an additional Social Security tax, and included an annual deductible that participants would have to pay before benefits began. Doctors, represented by the American Medical Association (AMA), lobbied fiercely against the proposal, arguing against any government interference in patient care. Medical groups mounted a public relations campaign designed to spread anxiety over “socialized medicine” among ordinary Americans.

Social Security Act Amendments of 1965

The King-Anderson proposal stalled in Congress, as civil rights legislation dominated debate in 1963 and 1964. Johnson’s landslide victory in the 1964 presidential election—coupled with Democratic control of both houses of Congress—removed many obstacles from his ambitious agenda of social reforms, collectively known as the Great Society. As he began his first full term in office, Johnson called for quick action on the issue of health care for the poor and the elderly. The AMA saw the writing on the wall, and rather than oppose the creation of government programs, they worked with lawmakers to craft a compromise. Medicare and Medicaid were both passed by Congress in 1965 as amendments to the Social Security Act, and Johnson signed the legislation into law on July 30, 1965. Former president Harry Truman (1884–1972) and his wife, Bess, were given the first Medicare cards.

The Medicare program, created by Title XVIII of the Social Security Act, was divided into two separate components. Medicare Part A automatically covered hospitalization for people who were age sixty-five or older and eligible for Social Security. It also covered those eligible for railroad retirement benefits. People with disabilities under age sixty-five who received Social Security were also eligible for the program. Medicare Part B was a voluntary program in which eligible individuals could have a monthly premium deducted from their Social Security payment to receive coverage for 80 percent of their doctors’ fees and medical supplies, after paying a $50 deductible.

Title XIX, a separate amendment to the Social Security Act, created the Medicaid program. Medicaid was built on the model first established with the Kerr-Mills program. Costs and administrative responsibilities would be shared by state and federal governments. In general, people receiving welfare or other public assistance were eligible for Medicaid, but each state was free to set up its own program with its own specific eligibility criteria within certain parameters. California was the first state to develop a program, launching Medi-Cal in 1966. Nearly every state had a Medicaid program within a couple of years.

As health-care costs continued to increase nationwide, Medicare and Medicaid were altered several times over the years that followed. Funding limits were put in place, and were adjusted on a number of different occasions. Gaps in coverage and loopholes that increase costs to either patients or the government have been an ongoing issue. One strategy implemented in recent years to hold down costs has been to require or encourage Medicaid participants to enroll in health maintenance organization plans. Health-care spending currently makes up a large and growing percentage of every state’s budget, prompting louder and more frequent calls for an overhaul of the nation’s health-care finance system.

See also Lyndon B. Johnson

The Immigration and Nationality Act of 1965

The Immigration and Nationality Act of 1965 made significant changes to the immigration policies of the United States by repealing the national origins quota system, which had restricted the number of people who could legally move to the United States from any one country. Under the new first-come, first-served system, immigrant applications were given preference if the immigrants had family in the United States or skills needed within the American workforce.

What Is an Immigrant?

Foreign-born people who come to the United States with the intent to live and work are required to have immigration visas, which allow them to become “legal resident aliens” or, if they so choose, naturalized citizens (generally after five years). Such immigration visas are typically referred to as “green cards,” due to the color of the card once used. Tourists, students, and other categories of foreigners seeking entry to the United States for short-term visits are considered “nonimmigrants” and, if they are from a nation for which the United States requires a visa, receive a visa with restrictions as to their activities (e.g., many cannot legally work) and length-of-stay. Such persons are expected to at some point leave the United States. Those who do not are considered “illegal aliens” and are subject to deportation.

Background: The National Origins Quota System

Beginning in the 1920s, U.S. immigration policy admitted foreigners seeking to settle in the United States based on the nation they came from. The system placed no quota restrictions on immigrants from the Western Hemisphere, limited immigration from the Eastern Hemisphere, and prohibited all immigration from Asian nations. A temporary version of a national origins quota system was enacted in 1921, and a permanent policy was announced in 1924.

For the countries upon which the United States placed restrictions, the national origins quota system limited annual immigration to 2 percent of the number of each foreign-born group living in the United States in 1890. For instance, if 100,000 people born in the United Kingdom were living in the United States in 1890, then 2,000 British immigrants would be granted permanent residency visas each year. The use of the 1890 census is significant because it purposely excluded the large wave of immigrants who arrived from southern and eastern Europe between 1890 and 1920. Under the new law, all of the nations under the quota system would be allowed a combined total of no more than 154,000 immigrants each year. Italy’s quota, for example, reduced migration from that country from roughly 42,000 immigrants annually to just 3,845 people per year. (Another barrier: applicants who were disabled, ill, poor, illiterate, or otherwise considered undesirable could and would be denied visas.)

There were no quota limitations on refugees, religious ministers, spouses of U.S. citizens, temporary visitors, and immigrants from the North and South American continents. Not only did the United States not want to alienate its neighbors, it also needed the workers from Mexico and other points south who labored on American farms.

Because of the quota system and a worldwide economic depression in the 1930s, many nations never even met their quotas. At the same time, however, Jews and other persecuted people trying to escape Nazi Germany and other totalitarian regimes were denied refuge in the United States.

1950s Legislation: The McCarran-Walter Bill

The Immigration and Nationality Act of 1952, originally known as the McCarran-Walter Bill, repealed the long-held anti-Asian immigration policies but reaffirmed the national origins quota system. Under the act, the U.S. immigration system continued to set aside the vast majority of permanent residency visas for people from just three western European nations: Ireland, the United Kingdom, and Germany. People wishing to immigrate to the United States from such eastern and southern European nations as Poland, Italy, and Greece often waited years for their numbers to be called. According to a February 1965 session of the Senate Subcommittee on Immigration and Naturalization, Italy at that time had an annual allowed quota of 5,666 immigrants, yet the waiting list for admission into the United States tallied just short of 250,000 people.

Believing the 1952 act was simply perpetuating past wrongs, President Harry Truman (1884–1972) vetoed the bill when it landed on his desk. In his veto message, he wrote: “These are only a few examples of the absurdity, the cruelty of carrying over into this year of 1952 the isolationist limitations of our 1924 law. In no other realm of our national life are we so hampered and stultified by the dead hand of the past, as we are in this field of immigration.” Congress, however, overrode Truman’s veto.

Passage of the 1965 Immigration and Nationality Act

The Immigration and Nationality Act of 1965 was signed by President Lyndon B. Johnson (1908–1973) on October 3; it had as its backdrop the civil rights movement, which aspired to expand rights and freedoms to all people. The new statute discarded considerations of nationality, race, and ethnicity that had dictated the quota system of U.S. immigration policy for forty years. The 1965 policy granted permission to immigrate based on, in the following order of preference, the reunification of families, the acquisition by the United States of needed workforce skills, and, in the fewest instances, the moral obligation to provide assistance to refugees. This new plan, much of which remains in effect in the twenty-first century, is described as a “preference system.”

The 1965 act—also called the Hart-Celler Immigration Bill—capped the previously unrestricted immigration from Western Hemisphere nations to an annual maximum of 120,000, and placed a limit on immigration from Eastern Hemisphere nations at 170,000, but with no more than 20,000 persons per country. Overall, the ceiling on annual worldwide immigration increased from 150,000 to 290,000.

With national origin no longer a consideration, the United States would now fill each hemisphere’s quota by choosing on a first-come, first-served basis from pools of prospective immigrants, based on a seven-category preference system for relatives of U.S. citizens and permanent resident aliens (President Johnson was particularly eager to help families separated by Fidel Castro’s [1926–] revolutionary takeover of Cuba) and for people with work skills needed in the United States. According to one report, more than three-quarters of each hemisphere’s quota went to people with family ties to American citizens. Roughly 10 percent was reserved for people who possessed knowledge, experience, or skills that would be useful to the U.S. economy. The remaining admissions went to refugees, such as those escaping racial, religious, or political persecution.

The Impact of the 1965 Legislation

By opening immigration possibilities to the entire world, the 1965 act, viewed by many as a symbolic effort in line with other civil rights objectives of the time, unintentionally led to a huge wave of new immigration. Upon signing the legislation, President Johnson stated, “This bill that we will sign today is not a revolutionary bill. It does not affect the lives of millions. It will not reshape the structure of our daily lives.”

Dean Rusk (1909–1994), Johnson’s secretary of state, estimated that a total of eight thousand people would immigrate to the United States from India within the first five years of the new policy. In a February 1965 Senate subcommittee hearing he stated: “I don’t think we have a particular picture of a world situation where everybody is just straining to move to the United States.” In fact, more than twenty-seven thousand people came from India during that time frame. From 1965 to 1993, the total number was just short of six hundred thousand.

Later Changes

In the mid-1970s amendments were enacted and the policy was revised again. The hemisphere distinctions were removed and replaced by a single worldwide limit of 290,000 immigrants allowed into the United States per year. By this time, the majority of immigrants into the United States were coming from Asia and Latin America, not Europe. (Between 1970 and 1992 more than one million Indochinese immigrants came to the United States, most of them refugees from Vietnam, Cambodia, and Laos.)

In the 1974 Lau v. Nichols case, brought on behalf of immigrant Chinese children, the U.S. Supreme Court ruled that public schools have an obligation to provide bilingual educational services to students whose learning is impeded by their difficulty in understanding or speaking English. The Court agreed with the plaintiff’s argument that such services were required by the clause in the Civil Rights Act of 1964 which stated, “No person in the United States shall, on the ground of race, color, or national origin, be excluded from participation in, be denied the benefits of, or be subjected to discrimination under any program or activity receiving Federal financial assistance.”

An immigration-related statute, the Refugee Act of 1980, broadened the definition of what the U.S. government would consider to be a refugee. Whereas a refugee was once considered mainly someone who was fleeing persecution (typically from a Communist or Middle Eastern country), the new act opened the definition to be in line with the more generous criteria used by the United Nations. This act set an annual refugee admissions number at 50,000 and, separately, reduced the annual worldwide ceiling for immigrants to 270,000.

With the Immigration Reform and Control Act of 1986, Congress attempted to cope with the growing problem of illegal immigration. The act provided amnesty to certain individuals who were in the country illegally and at the same time increased the penalties for employers who hired undocumented aliens.

A revision in 1990 raised the overall annual global immigration limit to nearly seven hundred thousand, maintained family reunification as an important criterion, increased employment-based immigration (including allowances for temporary workers), and recognized a new category of “diversity immigrants,” the term used to describe people from underrepresented countries.

Over the course of nearly a century, the immigration policies of the United States have evolved from a system based on national quotas—determined by ethnic and racial preferences—to one where entry for permanent residency is reserved for immigrants with desired job skills (employer sponsored) or immigrants with immediate legal family in the United States (family sponsored). Other changes in immigration policy have aimed at detecting and deporting criminals, terrorists, and undocumented aliens.

Immigration at the Start of the Twenty-First Century

In 2004 the largest legal immigrant groups to the United States were from Mexico, India, and the Philippines. (Complicated formulas exist to prevent one nation from monopolizing all of the open slots.) Since few people in the United States can trace their lineage through the centuries to Native American tribes that flourished on what is now U.S. soil, America is a nation built by immigrants. Some immigration opponents, however, argue that times have changed. They feel that with the nation’s population exceeding the three hundred million mark in 2006, the United States no longer needs or has the space or financing to accommodate vast migrations of foreigners. Immigration proponents argue that the United States has a moral obligation to keep its borders open and promote diversity. The debate shows no sign of going away. From 2000 to 2005, nearly one million immigrants were granted legal permanent residence in the United States each year (though the majority were already living in the United States). By the middle of the first decade of the 2000s, an additional 8.7 million people were believed to be in the country illegally.

Excerpts from President Lyndon B. Johnson’s Remarks upon Signing the Immigration and Nationality Act of 1965

With Ellis Island in the background and the Statue of Liberty looming above, on October 3, 1965, President Lyndon B. Johnson (1908–1973) addressed a group of several hundred guests who had crossed to the island by boat for the ceremony.

This bill that we will sign today is not a revolutionary bill. It does not affect the lives of millions. It will not reshape the structure of our daily lives, or really add importantly to either our wealth or our power. Yet it is still one of the most important acts of this Congress and of this administration. For it does repair a very deep and painful flaw in the fabric of American justice. It corrects a cruel and enduring wrong in the conduct of the American Nation.… This bill says simply that from this day forth those wishing to immigrate to America shall be admitted on the basis of their skills and their close relationship to those already here. This is a simple test, and it is a fair test. Those who can contribute most to this country—to its growth, to its strength, to its spirit—will be the first that are admitted to this land. The fairness of this standard is so self-evident that we may well wonder that it has not always been applied. Yet the fact is that for over four decades the immigration policy of the United States has been twisted and has been distorted by the harsh injustice of the national origins quota system. Under that system the ability of new immigrants to come to America depended upon the country of their birth. Only 3 countries were allowed to supply 70 percent of all the immigrants. Families were kept apart because a husband or a wife or a child had been born in the wrong place. Men of needed skill and talent were denied entrance because they came from southern or eastern Europe or from one of the developing continents. This system violated the basic principle of American democracy—the principle that values and rewards each man on the basis of his merit as a man. It has been un-American in the highest sense, because it has been untrue to the faith that brought thousands to these shores even before we were a country. Today, with my signature, this system is abolished. We can now believe that it will never again shadow the gate to the American Nation with the twin barriers of prejudice and privilege.…

Bibliography

Public Papers of the Presidents of the United States: Lyndon B. Johnson, 1965. Volume II, Entry 546, 1037–1040. Lyndon Baines Johnson Library and Museum. http://www.lbjlib.utexas.edu/Johnson/archives.hom/speeches.hom/651003.asp (accessed April 14, 2007).

Miranda v. Arizona

Miranda v. Arizona is the U.S. Supreme Court case that established the requirement that criminal suspects be informed of their constitutional rights to refrain from making statements to the police until they have had the advice of counsel and to have legal representation during questioning. Thus Miranda created the need for police officers everywhere to utter the phrase immortalized in countless television cop shows: “You have the right to remain silent.…” Prior to Miranda, the law pertaining to the treatment of suspects in custody varied from state to state. Miranda clarified on a national level exactly what information about their constitutional rights, primarily the Fifth Amendment right not to incriminate oneself, suspects must be given in order for any statements produced by their questioning to be admissible in court.

Arrest of Ernesto Miranda

On March 3, 1963, an eighteen-year-old Phoenix, Arizona, woman was grabbed on the way home from her movie theater job, forced into a car, driven into the desert, and raped. Her attacker then drove her back into the city and dropped her off not far from her home. When she reported the attack to police, the woman’s account of the event was somewhat jumbled and contradictory. She described the assailant as a Mexican in his late twenties with glasses, driving either a Ford or Chevrolet from the early 1950s.

A week later, the woman and her brother-in-law spotted a 1953 Packard that she believed to be the car driven by the rapist. They reported that the vehicle’s license plate number was DFL-312. That plate turned out to be registered to a late-model Oldsmobile, but a similar number, DFL-317, belonged to a Packard registered to a woman, Twila N. Hoffman, whose boyfriend, Ernesto Miranda, fit the woman’s description of her attacker.

Miranda had a substantial criminal history, and had served a year in jail for attempted rape. Police placed Miranda in a lineup with three other Mexicans of similar physical type—though none of them wore glasses. The victim indicated that of the men in the lineup, Miranda looked the most like her attacker, but she was unable to provide a positive identification.

Two detectives, Carroll Cooley and Wilfred Young, took Miranda into another room and began interrogating him. As is common during questioning, the detectives misled their suspect, telling him he had been positively identified, and they asked him to make a statement. Miranda signed a written confession two hours later. There was no evidence that he had been coerced or abused in any significant way.

Questionable Police Practices

Because he was unable to afford private counsel, Miranda was given a court-appointed lawyer, Alvin Moore. While the case against Miranda was strong given the signed confession, Moore was disturbed by how the confession had been obtained. At trial in the Arizona state court, the prosecution presented only four witnesses: the victim, her sister, and the two detectives who had obtained Miranda’s confession. Moore presented a spirited defense on behalf of Miranda, pointing out a number of inconsistencies in the victim’s version of the event. Moore made his most important argument, however, during cross-examination of Detective Cooley. During this questioning, Cooley acknowledged that Miranda had never been informed of his right to the advice of an attorney before making a statement, and that informing suspects of this right was not part of the police department’s standard procedure when questioning suspects.

Based on this information, Moore asked Judge Yale McFate to throw out Miranda’s confession as tainted evidence. McFate rejected Moore’s request, and the jury was allowed to hear the confession. On June 27, 1963, Miranda was convicted of the rape and kidnapping and sentenced to two concurrent terms of twenty to thirty years in prison.

Moore’s arguments, however, had set off a chain of events that would dramatically change the way law enforcement officials go about their work. Miranda appealed to the Supreme Court of Arizona on the basis of Moore’s argument that he had not been informed of his right to have a lawyer present, but his conviction was upheld. After that defeat, he appealed to the U.S. Supreme Court, which agreed to hear the case in 1965. The Miranda case, along with three other cases involving similar issues, was argued before the Supreme Court between February 28 and March 2, 1966. In each of the four cases that made up the Miranda review, the suspect had not been notified of his rights, leading to a confession that resulted in a conviction.

Warren Court Ruling

On June 13, 1966, the Supreme Court voted 5 to 4 in Miranda’s favor. Writing for the majority, Chief Justice Earl Warren (1891–1974) established clear-cut guidelines for police behavior during an interrogation. He wrote:

Prior to any questioning, the person must be warned that he has a right to remain silent, that any statement he does make may be used as evidence against him, and that he has a right to the presence of an attorney, either retained or appointed.

Warren and the other justices in the majority—Hugo Black (1886–1971), William Brennan (1906–1997), William O. Douglas (1898–1980), and Abe Fortas (1910–1982)—believed that the inherently intimidating atmosphere of a police interrogation must be counterbalanced by strong safeguards that the suspect’s rights were upheld. To arguments that the new Miranda rules would hamper law enforcement efforts, Warren answered that similar rules were already in place at the Federal Bureau of Investigation, and that they did not seem to interfere with that agency’s ability to fight crime.

The Miranda ruling instantly changed the way suspects were treated in the United States. With the implementation of the “Miranda rules,” police officers all over the country began carrying “Miranda cards,” from which they would read detainees their rights verbatim before questioning them. Conservatives denounced the decision, fearing that large numbers of criminals would be cut loose on technicalities after the failure of police to properly “Mirandize” them, as the procedure came to be known.

The fears of those who opposed the Miranda decision do not seem to have been borne out. The change did not appear to affect the willingness of those arrested to give statements to police, and while individual cases have been thrown out when confessions were found inadmissible because of a failure to Mirandize, the law has generally not hindered prosecutors. In practice, the Miranda case appears to have had a markedly positive impact on police behavior, as incidents of abusive interrogation practices have decreased. While coercion continues to take place, Miranda led to greater standardization and professionalization of police practices and an increased awareness on the part of police of the rights of the accused. Attempts have been made over the years to chip away at the rights established by Miranda v. Arizona, but to date those efforts have not succeeded in any substantial way.

Ernesto Miranda received a new trial after his conviction was overturned by the Supreme Court, but he was convicted a second time on the strength of new evidence and sentenced to twenty to thirty years in jail. He was released on parole in 1972. Four years later, Miranda was stabbed to death in a bar fight. However, his name lives on in the words associated with it, invoked nearly every time somebody in America is arrested.

The Freedom of Information Act

The Freedom of Information Act (FOIA), first passed in 1966, provides citizen access to information held by agencies that are part of the executive branch of the federal government of the United States. The act requires that unreleased documents in the custody of the executive branch be made available to any individual on request. The law does, however, carve out nine exemptions to this mandate, under which the requested information may be withheld.

Before passage of FOIA, people seeking government documents had to state the purpose of their request, and government officials had a great deal of latitude in deciding whether the benefits of disclosing the requested material outweighed the value of keeping it secret. Such requests were frequently denied, with little justification given.

Roots of the Act

The movement to require the federal government to be more forthcoming with government documents was driven by journalists. At the dawn of the Cold War, an atmosphere of secrecy had descended upon Washington, D.C., as scare stories of communists hiding in every dark corner of every government office spread throughout federal agencies. Meanwhile, the federal government had grown dramatically since the 1930s, with the creation of many new agencies resulting in a sprawling, difficult-to-navigate bureaucracy.

In 1955 John Moss, a Democratic congressman from Sacramento, California, and chair of the Special Subcommittee on Government Information, led the first of a series of hearings on the issue of excessive government secrecy. His efforts were backed by an intensive lobbying campaign by newspaper editors who were fed up with the government’s frequent refusal to release information crucial to thorough coverage of the key issues and events of the day. With a Republican president, Dwight D. Eisenhower (1890–1969), in the White House, however, there was little chance that Moss’s proposed legislation to open government records to the public could move forward.

When the Democrats won back the White House at the beginning of the 1960s, Republicans became more receptive to the idea of opening access to government information. Congressman Donald Rumsfeld (1932–), a Republican member of Moss’s subcommittee, signed on as a cosponsor, an irony in light of Rumsfeld’s role in promoting secrecy less than a decade later as President Gerald R. Ford’s (1913–2006) chief of staff and secretary of defense, and even more so as President George W. Bush’s (1946–) defense secretary during the War on Terror. With Rumsfeld aboard, Moss’s freedom of information bill was finally ready to make headway in Congress.

Passage of FOIA

President Lyndon B. Johnson (1908–1973) was adamantly opposed to the bill at first, and spent considerable energy stalling its progress in 1965. In another interesting twist, one of the leading voices against passage of the bill was Johnson’s press secretary Bill Moyers, who later went on to become a leading muckraking journalist dependent on the availability of government documents. By the spring of 1966 the Senate had passed a version of Moss’s bill, and Moyers and other White House staff began to sense that passage in the House might be inevitable. Even Moyers began speaking in favor of the legislation.

The House of Representatives passed the bill unanimously on June 20, 1966, and sent it to Johnson for his signature. By that time, only one federal agency, the Department of Health, Education, and Welfare, was still recommending a veto. Johnson signed FOIA into law on July 4, 1966, albeit reluctantly. He was not happy about the prospect of open public access to his government’s documents.

FOIA mandated that federal government documents be made available to any person who requests them; nevertheless, the act included nine exemptions to that requirement. Those exemptions are: (1) material that is classified in the interest of national defense; (2) internal guides that discuss enforcement strategies; (3) material whose disclosure is prohibited by other laws; (4) confidential or privileged commercial or financial information; (5) information protected by privileges such as attorney-client or work product; (6) information whose release would constitute an unwarranted invasion of personal privacy; (7) information compiled for law enforcement purposes that might cause harm if released; (8) information related to oversight of financial institutions; and (9) geophysical and geological information about oil wells.

FOIA has been amended a number of times since 1966. The most significant amendments took place in 1974 and 1996. The 1974 amendments, coming in the aftermath of the Watergate scandal, sought to tighten the requirements of the law, force greater agency compliance, and narrow the exemptions. President Ford vetoed this legislation, but Congress overrode his veto. The Electronic Freedom of Information Act of 1996 essentially updated the act to reflect the changes in information technology that had taken place since the act was first established.

How to File an FOIA Request

By law, anybody can request information for any reason from the federal government under the Freedom of Information Act (FOIA). For relatively small requests made for noncommercial purposes, there is usually no fee associated with filing a FOIA request. There are three general categories of requesters defined by the law. The first category includes members of the news media, educational institutions, and noncommercial scientific institutions. Usually this type of requester has to pay a standard document copying charge, but the fee is often waived for individual journalists and scholars if their goal is to disseminate information to the public. The second category consists of nonprofits, public interest groups, and individuals seeking information for personal use. This category must usually pay a fee for both document reproduction and the time it takes to perform the search. The last category is those requesting information for commercial use in order to make a profit. This group is usually assessed additional fees.

The first thing to do when seeking information is to figure out which department the information resides in. While there is no central office in the federal government that coordinates FOIA requests, the U.S. Department of Justice serves as the source of information about FOIA and the different government departments and agencies.

The next step is to make sure that the information you are looking for is not already available on the department’s Web site. The department Web site will usually have information posted about where to send a FOIA request. It is usually a good idea to send the request to both the department’s FOIA office and to the specific office within the department that has the information you hope to obtain.

The last step is simply to write a letter to the agency describing in as much detail as possible the information you are looking for, making sure to mention in the letter that it is a formal FOIA request. Most departments’ FOIA Web pages contain sample letters that you can use as a model when making your own FOIA request. Once the right agency has received the request, and if the request is complete and correct, the agency has twenty working days to respond with a determination as to whether it will grant the request. If the request is denied, the agency must give the reason within those twenty days. If the request is approved, the agency is then supposed to provide the information promptly, though there is no specific time frame within which it must act.
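
For readers who prefer to prepare such letters programmatically, the short Python sketch below assembles a basic request from the elements just described: a clear statement that the letter is a formal FOIA request, a detailed description of the records sought, and a note about fees. It is purely illustrative; the agency name, address, record description, and dollar figure are hypothetical placeholders, and only the citation to the act itself (5 U.S.C. 552) comes from the law.

    from datetime import date

    def draft_foia_request(agency, address, records_sought, requester):
        """Return the text of a basic FOIA request letter (illustrative only)."""
        # The elements below mirror the steps described above: identify the
        # letter as a formal FOIA request, describe the records sought in as
        # much detail as possible, and address fees up front.
        return (
            f"{date.today():%B %d, %Y}\n\n"
            f"FOIA Officer\n{agency}\n{address}\n\n"
            "Re: Freedom of Information Act Request\n\n"
            "Dear FOIA Officer:\n\n"
            "Under the Freedom of Information Act, 5 U.S.C. 552, I request "
            f"copies of the following records: {records_sought}.\n\n"
            "I am willing to pay reasonable duplication fees up to $25; please "
            "contact me before incurring costs above that amount. If any part "
            "of this request is denied, please cite the specific exemption "
            "relied upon and describe the appeal procedures available to me.\n\n"
            f"Sincerely,\n{requester}\n"
        )

    if __name__ == "__main__":
        # All values below are hypothetical placeholders; substitute the
        # address listed on the relevant agency's FOIA Web page.
        print(draft_foia_request(
            agency="Example Federal Agency",
            address="123 Example Street NW, Washington, D.C. 20500",
            records_sought="all memoranda concerning Program X issued in 1966",
            requester="Jane Q. Public",
        ))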

The Demonstration Cities and Metropolitan Development Act

The Model Cities program—originally called Demonstration Cities—was an ambitious attempt from the mid-1960s to the mid-1970s to revitalize selected urban neighborhoods through a comprehensive system of social programs and planning efforts. The program coordinated public and private resources and promoted participation by residents in efforts to rehabilitate their own communities. The Model Cities program was a cornerstone of President Lyndon B. Johnson’s (1908–1973) Great Society initiative, a set of programs and policies aimed at eliminating poverty and racial injustice.

Early Roots

During the late 1950s and early 1960s, the civil rights movement raised awareness among Americans of racial discrimination in the South and a litany of social problems plaguing poor African-Americans in northern cities. During this period a number of foundation-funded programs were instituted in various cities to address the social ills associated with poverty and discrimination. One of the first was Mobilization for Youth, an antidelinquency program launched in the late 1950s on New York’s Lower East Side. Within a few years, the Ford Foundation had established an initiative known as the Gray Areas program. This program, which absorbed Mobilization for Youth as well as similar programs that had developed elsewhere, provided grants to neighborhood improvement agencies in Boston, Massachusetts; New Haven, Connecticut; Philadelphia, Pennsylvania; and Washington, D.C. As these privately financed programs took hold, the federal government decided to take action as well. In 1964 the Economic Opportunity Act created the Community Action Program, which combined community participation with antipoverty programs.

By the mid-1960s a number of high-profile individuals representing the private and public sectors as well as academia began advocating for the creation of a national-scale approach to urban problems. National Institutes of Health psychiatrist Leonard Duhl, Tufts University dean Antonia Chayes, and Detroit mayor Jerome Cavanagh, among others, began pushing an idea for an overarching program that would address cities’ problems on the social, economic, political, and physical levels simultaneously. Johnson convened a series of presidential task forces, which he asked to come up with solutions to the pressing urban problems of the day. One of the first recommendations to come out of the task forces was for the creation of a cabinet-level Department of Housing and Urban Development (HUD), which was established in 1965. Next the task forces took up the idea of Demonstration Cities. As initially conceived, the program would cover only a handful of cities, but it quickly grew into something much broader. The final report of the task forces quickly transformed into a bill requesting $2.3 billion to provide select cities with comprehensive assistance in several areas, including education, housing, and social services. Johnson directed leaders of the newly formed HUD to make passage of the Demonstration Cities project one of their top priorities in 1966.

The Demonstration Cities and Metropolitan Development Act was signed into law on November 3, 1966. When riots began to erupt in cities across the United States the next summer, the program’s name was quickly changed to Model Cities, because the word demonstration was too closely associated with rioting. The first Model Cities grants were awarded in November 1967. Totaling over $300 million, they went to sixty-three cities and towns across the country. Another $200 million was handed out in a second round of grants in September 1968, following another summer of violence in America’s cities.

Decline under Nixon

In 1968 the Democrats lost control of the White House and lost ground in Congress. With that change in the political climate in Washington, along with waning support from suburban and rural Americans growing weary of scenes of urban violence on the television, the Model Cities approach fell out of favor. Initially, HUD secretary George Romney persuaded President Richard Nixon (1913–1994) to leave Model Cities and a handful of other urban programs intact. With his reelection in 1972, however, Nixon began to reexamine the urban aid programs he had inherited from Johnson. Funding for the Model Cities program was suspended in 1973, and soon afterward the remnants of the program were folded into the Community Development Block Grant program, which was created by the Housing and Community Development Act of 1974.

It is difficult to assess the overall impact of the Model Cities program. In some senses, it was a success; supporters point to a generation of citizens who became engaged in the political process through the program’s resident-participation requirement. Critics tend to point out that there was little improvement in most social problems; urban poverty certainly did not go away. And while the Community Development Block Grant program that succeeded it lacks the comprehensive approach of Model Cities, it has fostered many highly successful community organizations that have had a positive and lasting effect in many cities.

See also Lyndon B. Johnson

The Presidential Succession Amendment

The Twenty-fifth Amendment to the Constitution, ratified on February 10, 1967, established the procedure for replacing the president or vice president of the United States in the event that either office is unoccupied. First proposed following the assassination of John F. Kennedy (1917–1963), the amendment has been used several times since its passage.

The term presidential succession refers to procedures for transferring presidential authority to another individual through means other than the normal electoral process that takes place every four years. That includes situations in which a sitting president dies while in office as well as when the president resigns, is removed through impeachment, or is unable to perform the duties of the office because of health or other reasons. The procedures for presidential succession are defined in three different parts of the U.S. Constitution—Article II, Section 1, Clause 6, and the Twentieth and Twenty-fifth Amendments—and in the presidential succession law passed by Congress in 1947.

Need for Clear Procedures

During the twentieth century, the importance of a well-defined presidential succession system became obvious, as five vice presidents ascended to the presidency during the first three-quarters of the century, four of them because of presidential deaths and one as a result of a resignation. More than one-third of the presidents between 1841 and 1975 either died in office, became disabled, or, in the case of one, Richard M. Nixon (1913–1994), resigned.

The basic principles of presidential succession were laid out in Article II of the Constitution, which states that the vice president shall assume the duties of the president if the president dies or is removed from office, or, if the president becomes disabled, until the disability is no longer present. Article II also directs Congress to figure out a plan for situations in which neither the president nor vice president is able to serve.

The Twentieth Amendment, ratified in January 1933, added to the formula that if the president-elect dies or is unable to take office, then the vice president–elect will be sworn in as president. Coincidentally, just twenty-three days after the amendment was ratified, an attempt was made to assassinate President-Elect Franklin Roosevelt (1882–1945). Had it succeeded, Vice President–Elect John Nance Garner (1868–1967) would have been sworn in as president in March. After World War II, Congress passed the Presidential Succession Act of 1947, which put in place the current order of succession after the vice president. If neither the president nor the vice president is able to serve, next in line are the Speaker of the House of Representatives, the president pro tempore of the Senate, and then the cabinet officers, starting with the secretary of state, followed by the secretary of the treasury, secretary of defense, and attorney general.

Kennedy Assassination

In spite of these laws, however, some procedural issues still remained unclear at the time Kennedy was assassinated. The Twenty-fifth Amendment was proposed to resolve some of these questions. Section 1 of the amendment simply reaffirmed the longstanding precedent of the vice president taking over as president upon the death or resignation of the president. Section 2 established a new procedure for selecting a new vice president if that office becomes vacant. This is precisely the situation Lyndon B. Johnson (1908–1973) faced when he assumed the presidency after Kennedy’s death.

Sections 3 and 4 of the amendment address presidential disability. Prior to the passage of the Twenty-fifth Amendment, the Constitution did not cover situations in which a president becomes temporarily disabled. Section 3 establishes procedures for situations in which the president communicates that he “is unable to discharge the powers and duties of the office,” in which case the vice president temporarily assumes those powers and duties until the president is once again able to serve. Section 4 addresses situations in which the vice president and a majority of cabinet members determine that the president is incapable of performing. In such a case, the vice president becomes acting president until four days after the time the president declares himself able to resume his duties. The vice president and cabinet members may dispute the president’s assertion of capability, in which case the matter is resolved by Congress.

Twenty-fifth Amendment Invoked

The Presidential Succession Amendment saw plenty of action in the 1970s. President Richard M. Nixon (1913–1994) used the procedure outlined in Section 2 to nominate Gerald R. Ford (1913–2006) as vice president following the resignation of Vice President Spiro Agnew (1918–1996) in October 1973. When Nixon resigned from office the following year, Ford succeeded him immediately and was sworn in the same day as per the amendment. Ford then used the procedures in the amendment to nominate Nelson Rockefeller (1908–1979) as vice president.

When President Ronald Reagan (1911–2004) underwent surgery for cancer in July 1985, he handed over power to Vice President George H. W. Bush (1924–) for eight hours, though it is unclear whether Reagan formally invoked the Twenty-fifth Amendment. Similarly, President George W. Bush (1946–) signed presidential authority over to Vice President Dick Cheney (1941–) when he was preparing to undergo a colonoscopy. Cheney served as acting president from 7:09 to 9:24 a.m. on June 29, 2002.

Red Lion Broadcasting Co. v. Federal Communications Commission

The U.S. Supreme Court’s decision in the landmark case Red Lion Broadcasting Co. v. Federal Communications Commission (1969) upheld as constitutional the Federal Communications Commission’s (FCC) “fairness doctrine.” The fairness doctrine was a policy aimed at making radio and television broadcasters present a fair and balanced account of public issues, in part by requiring them to provide airtime for a response by those who had been personally attacked on the air in the context of political debate or editorializing. The key question of the case was whether the fairness doctrine violated broadcasters’ constitutionally protected right to free speech by telling them what content they had to air. The Supreme Court ruled that the doctrine did not violate that right, and that companies that broadcast on the limited number of available frequencies, which theoretically belong to the public, are obligated to present a variety of viewpoints on the subjects their programming touches upon.

Background

When commercial radio came into existence in the 1920s, the airwaves were unregulated. The available broadcast frequencies were up for grabs, their allocation left up to the business community. The result was a dysfunctional system that did not serve listeners well. To make sense of this chaotic situation, the federal government created the Federal Radio Commission (FRC) in 1927; its duties were taken over by the Federal Communications Commission (FCC) in 1934. The FRC quickly asserted that radio stations were obligated to serve the public interest, and set about creating a broad range of regulations aimed at bringing order to the young broadcasting industry. As part of its congressionally mandated mission to protect the public interest, the commission stated in 1929 that the public interest “requires ample play for the free and fair competition of opposing views,” applicable to “all discussions of issues of importance to the public.” By the end of the 1940s this commitment had been distilled into the so-called fairness doctrine, which required broadcasters to offer free airtime for response to anybody who represented views different from those being put forth via a station’s programming.

The facts of the Red Lion case revolve around a November 27, 1964, broadcast by the Pennsylvania radio station WGCB. The show in question was a fifteen-minute program by the Reverend Billy James Hargis, part of an ongoing series called “Christian Crusade.” In the broadcast, Hargis took issue with the beliefs of Fred J. Cook, who had written a book titled Barry Goldwater: Extremist on the Right. Hargis, a supporter of the Republican politician Goldwater, launched a scathing personal attack on Cook, calling him a communist and stating that he had been fired from his newspaper job for fabricating a story.

Equal Time for Reply

Cook heard the show, and quickly demanded free airtime for a rebuttal as per the fairness doctrine that had been official policy of the FCC for more than a decade. WGCB refused to honor Cook’s request, however, and the matter was referred to the FCC. The FCC ruled that Hargis’s broadcast was indeed a personal attack on Cook, and that WGCB, as a licensed broadcaster, was therefore obligated to provide Cook with free airtime to respond to the attack.

The station was unhappy with the FCC’s ruling, and in 1967 it brought the case before the U.S. Court of Appeals for the District of Columbia Circuit. Its argument hinged upon the notion that the FCC was violating the station’s First Amendment free speech rights by dictating how it must allocate its airtime. While the case was under consideration by the court of appeals, the FCC took the opportunity to clarify the parameters of the fairness doctrine with regard to personal attacks and political editorials. The commission came up with a new definition of “personal attack.” Under the newly articulated policy, a personal attack has taken place “when, during the presentation of views on a controversial issue of public importance, an attack is made upon the honesty, character, integrity or like personal qualities of an identified person or group.” The FCC also outlined specific remedies for when such an attack occurs. The policy provided that the broadcaster shall notify the person or group of the broadcast, provide them with a transcript or tape of the attack, and offer a reasonable opportunity to respond on the air.

The court of appeals upheld the FCC’s Red Lion ruling, but at almost the same time the U.S. Court of Appeals for the Seventh Circuit ruled the amended fairness doctrine unconstitutional in a separate case, United States v. Radio Television News Directors Association. With these two contradictory rulings on the books, the U.S. Supreme Court came into the picture, hearing arguments in the two cases on April 2 and 3, 1969.

Erosion of the Fairness Doctrine

The Supreme Court ruled unanimously on June 9, 1969, that the fairness doctrine was consistent with the First Amendment, and actually enhanced rather than infringed upon the freedom of speech protected by the Constitution. The Court’s position was that a license to broadcast over the airwaves did not give a radio or television station the right to monopolize its licensed frequency with its own opinions on important issues, and that it was perfectly justifiable to impose regulations to ensure that broadcasters fulfill their public interest obligation to present diverse viewpoints on controversial matters.

Red Lion marked the high point for the fairness doctrine. Portions of the doctrine were chipped away through a variety of later decisions, including Miami Herald Publishing Co. v. Tornillo (1974), in which the Supreme Court ruled that the right to reply did not extend to print media. The doctrine was dealt its deathblow in 1987, when President Ronald Reagan (1911–2004), a staunch opponent of most forms of industry regulation, vetoed legislation that would have written the fairness doctrine into law. The FCC abandoned the doctrine altogether shortly afterward.

Bibliography

Books

Abernathy, Ralph. And the Walls Came Tumbling Down. New York: Harper and Row, 1989.

Ambrose, Stephen E. Eisenhower. 2 vols. New York: Simon and Schuster, 1983–1984.

Clark, Hunter R. Justice Brennan: The Great Conciliator. Secaucus, N.J.: Carol Publishing Group, 1995.

Collier, Peter, and David Horowitz. The Kennedys: An American Drama. New York: Summit Books, 1984.

Dallek, Robert. Flawed Giant: Lyndon Johnson and His Times, 1961–1973. New York: Oxford University Press, 1998.

Davis, Flora. Moving the Mountain: The Women’s Movement in America since 1960. New York: Simon and Schuster, 1991.

Halberstam, David. The Best and the Brightest. New York: Random House, 1972.

Herring, George C. America’s Longest War: The United States and Vietnam, 1950–1975. New York: Knopf, 1986.

LaFeber, Walter. America, Russia, and the Cold War, 1945–1996. 8th ed. New York: McGraw-Hill, 1996.

McCoy, Donald R. The Presidency of Harry S. Truman. Lawrence: University Press of Kansas, 1984.

Painter, David S. The Cold War: An International History. New York: Routledge, 1999.

Quirk, Robert E. Fidel Castro. New York: Norton, 1993.

Report of the President’s Commission on the Assassination of President John F. Kennedy. Washington, D.C.: Government Printing Office, 1964.

Salmond, John A. My Mind Set on Freedom: A History of the Civil Rights Movement, 1954–1968. Chicago: Ivan R. Dee, 1997.

Schrecker, Ellen. Many Are the Crimes: McCarthyism in America. Boston: Little, Brown, 1998.

Schwartz, Bernard, ed. The Warren Court: A Retrospective. New York: Oxford University Press, 1996.

Sundquist, James L. Politics and Policy: The Eisenhower, Kennedy, and Johnson Years. Washington, D.C.: Brookings Institution, 1968.

Thompson, Robert Smith. The Missiles of October: The Declassified Story of John F. Kennedy and the Cuban Missile Crisis. New York: Simon and Schuster, 1992.

Tushnet, Mark V. Making Constitutional Law: Thurgood Marshall and the Supreme Court, 1961–1991. New York: Oxford University Press, 1997.

Unger, Irwin. The Best of Intentions: The Triumphs and Failures of the Great Society under Kennedy, Johnson, and Nixon. New York: Doubleday, 1996.

Young, Andrew. An Easy Burden: The Civil Rights Movement and the Transformation of America. New York: HarperCollins, 1996.

Periodicals

Calabresi, Guido. “The Exclusionary Rule.” Harvard Journal of Law and Public Policy 26 (2003): 111.

MacDonald, Maurice. “Food Stamps: An Analytical History.” Social Service Review 51 (December 1977): 642–658.

Web Sites

American Meteorological Society. “A Look at U.S. Air Pollution Laws and Their Amendments.” http://www.ametsoc.org/sloan/cleanair/cleanairlegisl.html (accessed April 18, 2007).

U.S. Department of Veterans Affairs. “GI Bill Website.” http://www.gibill.va.gov/GI_Bill_Info/benefits.htm (accessed April 19, 2007).
