The Watergate Era (1968–1979)

How They Were Governed

The Drug Enforcement Administration

The Drug Enforcement Administration (DEA) is the lead federal agency in charge of enforcing the nation’s narcotics and controlled substances laws, both domestically and abroad.

Origins of the DEA and the Controlled Substances Act

The DEA was established in 1973 by President Richard Nixon (1913–1994) as a means of consolidating and coordinating the federal government’s drug control efforts. While illicit drug use in the 1970s had not yet reached the levels it would attain in later decades, the problem was considered significant enough for Nixon to declare “an all-out global war on the drug menace.” According to Nixon, “Certainly, the cold-blooded underworld networks that funnel narcotics from suppliers all over the world are no respecters of the bureaucratic dividing lines that now complicate our anti-drug efforts.”

At the time Nixon was making his arguments, the Department of Justice’s Bureau of Narcotics and Dangerous Drugs (BNDD) was responsible for enforcing federal drug laws, as were the Treasury Department’s U.S. Customs Service and several other Department of Justice divisions (including the Office of National Narcotics Intelligence). Advocates for creating the DEA maintained that a single agency, housed within the Department of Justice, would put an end to interagency competition, better involve the Federal Bureau of Investigation (FBI), make a single agency accountable, and provide a focal point for coordinating federal drug enforcement efforts with those of state, local, and foreign police authorities.

Prior to the inception of the DEA, Congress passed the Controlled Substances Act (CSA), Title II of the Comprehensive Drug Abuse Prevention and Control Act of 1970, which established a single system of control for both narcotic and psychotropic drugs. The act also established five schedules that classify controlled substances according to how dangerous they are, their potential for abuse and addiction, and whether they possess legitimate medical value. Though amended on several occasions, the CSA’s legal framework for drug classification remains in effect today, with the most dangerous drugs classified as Schedule I substances.

The Rise of Cocaine Use

When drug use began to escalate among Americans in the mid-1970s, President Gerald Ford (1913–2006) asked his vice president, Nelson Rockefeller (1908–1979), to assess the extent of drug abuse in the United States and to make recommendations for handling it. In his resulting report, Rockefeller maintained that “all drugs are not equally dangerous. Enforcement efforts should therefore concentrate on drugs which have a high addiction potential.” The report described marijuana as a minor problem and declared that “cocaine is not physically addictive … and usually does not result in serious social consequences such as crime, hospital emergency room admissions, or death.” The report recommended that “priority in both supply and demand reduction” should focus on the more dangerous drugs of heroin, amphetamines, and mixed barbiturates. Several years later even the chief drug policy adviser to President Jimmy Carter (1924–) considered cocaine to be “probably the most benign of the illicit drugs currently in widespread use. At least as strong a case should be made for legalizing it as for legalizing marijuana.”

As a result of Rockefeller’s assertions, and the prevailing belief that cocaine was not addictive (an assumption that by the 1980s would be proved false), the DEA shifted its focus away from marijuana and cocaine and toward heroin and other opiates, most of which came from Mexico. The DEA’s lack of emphasis on marijuana and cocaine laid the foundation for the rise, by the 1980s, of Colombia’s powerful Medellín and Cali drug cartels. In 1972, 5.4 million Americans admitted to having used cocaine at least once. By the early 1980s that figure had risen to 22 million.

Drug Use Rates since the 1970s

According to the annual Monitoring the Future drug use survey conducted by the U.S. National Institutes of Health (NIH), from the late 1970s to the early 1990s rates of marijuana use remained steady, but the use of other illicit drugs declined appreciably among young people. The NIH credits the improvements to “changes in attitudes about drug use, beliefs about the risks of drug use, and peer norms against drug use,” which in turn decreased consumer demand for illicit narcotics.

While advocating the importance of social policies and educational efforts to encourage young people not to use drugs, the report also warns against taking declines in drug use for granted. “Relapse is always possible and, indeed, just such a ‘relapse’ in the longer term epidemic occurred during the early to mid-1990s, as the country let down its guard on many fronts. The drug problem is not an enemy that can be vanquished, as in a war. It is more a recurring and relapsing problem … that requires an ongoing, dynamic response from our society.”

In the decades since its founding, the DEA has pursued such narcotics as crack cocaine, Ecstasy (MDMA), and methamphetamine, as well as legal pharmaceuticals (such as the painkiller OxyContin) that are misused and abused. The DEA reports that in 1972 it employed 2,775 workers, including 1,470 special agents. By 2006 those numbers had increased to 10,891 total employees and 5,320 special agents. By the turn of the twenty-first century the DEA had offices throughout the United States and in more than fifty foreign countries.

U.S. Department of Energy

The U.S. Department of Energy (DOE) is responsible for energy policy and security in the United States. The department was established in 1977 by the Department of Energy Organization Act, which was signed by President Jimmy Carter (1924–) and consolidated many energy-related governmental functions into one umbrella organization. The secretary of the DOE is a member of the president’s cabinet.

Background

The origins of the DOE can be traced to World War II and the federal government’s secret “Manhattan Project” to develop the atomic bomb. After the war the Atomic Energy Commission (AEC) was created to maintain civilian government control over the nation’s atomic energy research and development, such as designing and producing nuclear weapons and reactors for naval propulsion.

In the mid-1970s the AEC was replaced by two new agencies: the Nuclear Regulatory Commission (NRC), charged with regulating the nuclear power industry, and the Energy Research and Development Administration, which managed the programs for nuclear weapons, naval reactors, and energy development. But the energy crisis of the 1970s—due to the Arab oil embargo of 1973–74 and later, during the presidency of Jimmy Carter, unrest in Iran—led the federal government to better coordinate its myriad energy efforts by consolidating them into a single department.

DOE Mission

The DOE’s overall mission focuses on energy, national security, and technology. Its responsibilities include producing and dismantling the nation’s nuclear weapons; managing the U.S. Navy’s nuclear reactors; promoting the development of fossil-fuel and renewable energy sources; sponsoring weapons and energy research; and running weapons cleanup programs. The DOE is also in charge of the U.S. Strategic Petroleum Reserve, which is the nation’s backup oil supply.

During the late 1970s the DOE emphasized energy development, conservation, and regulation. When the oil shortages ended, the United States deemphasized its efforts to find alternative energy sources, and during the 1980s nuclear weapons research, development, and production took priority. Since the end of the cold war in the early 1990s, the DOE has focused on the environmental cleanups of nuclear weapons complexes, nonproliferation and stewardship of the nuclear stockpile, energy efficiency and conservation, and technology transfer and industrial competitiveness.

The Federal Emergency Management Agency

The Federal Emergency Management Agency (FEMA) is an agency within the federal government that is responsible for emergency planning and preparedness, as well as recovery assistance and coordination following natural or human-caused disasters. Depending on the situation, FEMA either provides direct assistance to those in need or works with various federal, state, local, and nonprofit agencies to coordinate responses to emergency situations.

FEMA’s Founding

In response to a request from the National Governors Association, FEMA was established by executive order on March 31, 1979, by President Jimmy Carter (1924–) as a means of consolidating the nation’s emergency-related assistance programs. FEMA reported directly to the White House, and in 1993 its director was elevated to cabinet rank. In March 2003 FEMA lost its cabinet stature when the agency was subsumed into the Department of Homeland Security, which was created by President George W. Bush (1946–) in reaction to the terrorist attacks of September 11, 2001.

The concept that the federal government should directly assist the citizenry in times of crisis stems from legislation called the Congressional Act of 1803, which provided aid to a New Hampshire town devastated by a huge fire. Subsequently, the federal government stepped in with assistance following hurricanes, floods, earthquakes, and similar natural disasters. But the efforts were often disorganized, with various federal agencies taking the lead at different times. The creation of FEMA coordinated all of these activities under one roof.

Among other agencies, FEMA absorbed the Federal Insurance Administration, the National Fire Prevention and Control Administration, the National Weather Service Community Preparedness Program, the Federal Disaster Assistance Administration, and the Defense Department’s Defense Civil Preparedness Agency.

Disaster Responsiveness

Soon after its founding, FEMA had to manage the federal government’s emergency response to the hazardous waste contamination of Love Canal, New York; a Cuban refugee crisis in Miami; and the reactor accident at the Three Mile Island nuclear power plant near Harrisburg, Pennsylvania. In 1992 the devastation wreaked upon South Florida by Hurricane Andrew was another high-profile disaster requiring FEMA’s involvement.

FEMA was one of the many relief and recovery agencies to respond to the terrorist attacks of September 11, 2001. But nearly four years later, when Hurricane Katrina struck New Orleans and the Gulf Coast states of Mississippi and Alabama in August 2005, FEMA was revealed to be wholly unprepared for the task it was assigned. Experienced staff members who had been appointed by President Bill Clinton (1946–) had been replaced by less-qualified appointees named by President George W. Bush (1946–). The most notorious of these appointments was a new director, Michael Brown, an attorney and Republican fundraiser with no emergency management experience. Blame for FEMA’s lack of preparedness in the aftermath of the storm was placed, at different times, on President Bush, Director Brown, and the usurpation of FEMA’s authority by the newly created Department of Homeland Security.

The FEMA Bureaucracy

In defending its failures after Katrina, FEMA officials emphasized that the agency is not a “first responder.” Instead, FEMA expects state and local emergency teams to handle most disasters, with the federal government principally serving as the rescuer of last resort. If requests cannot be fulfilled by state or local agencies, FEMA can then respond, but only at the direct request of a state’s governor and on the recommendation of that FEMA region’s director. Those requests are then passed to the director of FEMA, who makes recommendations to the secretary of the Department of Homeland Security, who then consults with the president. Even when FEMA is granted permission to assist, the agency defers to, rather than directs, the state government involved.

According to the agency, in 2007 FEMA employed roughly twenty-six hundred people as part of its permanent staff and had access to more than four thousand reservists who could be deployed in an emergency. First responders at the local level, including firefighters, police officers, and emergency managers, often work with FEMA but are not employed by it.

Important Figures of the Day

Daniel Patrick Moynihan

Daniel Patrick Moynihan (1927–2003) was a social scientist, senator, college professor, and ambassador to the United Nations and to India. He is the only person to have served in the administrations of four consecutive presidents, Democratic as well as Republican.

Early Years

Moynihan was born in Tulsa, Oklahoma, the eldest of the three children of John Henry Moynihan, a journalist, and Margaret Ann Moynihan. By the time Moynihan was ten, his father had abandoned the family, and from then on he lived in relative poverty as his mother supported her three children by working as a nurse. Living in the slums of Manhattan, the Moynihans moved yearly because, as Moynihan later explained, the first month’s rent was free if the tenant signed a one-year lease. Moynihan did well in school, graduating at the top of his high school class in Harlem in 1943. He then worked as a longshoreman on the Hudson River docks until a friend persuaded him to take the entrance examination for the City College of New York. “I swaggered into the test room with my longshoreman’s loading hook sticking out of my back pocket. I wasn’t going to be mistaken for any sissy college kid. But I passed the test and decided to go to City—and that was the beginning of a lot of things in my life,” he said.

Education and Early Career

After a year in college, he enlisted in the Navy, which sent him to Tufts College for officer training and then to an assignment as a gunnery officer. He was seventeen. In 1947, after his discharge from the military, Moynihan enrolled in Tufts on the GI Bill and graduated cum laude a year later. He received a master’s degree from the Fletcher School of International Law and Diplomacy at Tufts in 1949. Moynihan then won a Fulbright scholarship to the London School of Economics. He enjoyed England so much that he stayed two more years, supporting himself by working at the U.S. Air Force base in Middlesex. He returned to the United States in 1953 and worked for Robert F. Wagner Jr. (1910–1991), the Democratic candidate for mayor of New York City. That effort led to work on the gubernatorial campaign of W. Averell Harriman (1891–1986) and a job as special assistant to Harriman after he was elected governor of New York. In 1960 Moynihan worked on the presidential campaign of Democratic candidate John F. Kennedy (1917–1963) and became an assistant to Secretary of Labor Arthur Goldberg (1908–1990) in Kennedy’s administration. While serving in the Kennedy administration he also completed a doctoral degree, awarded by Tufts University in 1961.

In March 1963 Moynihan became assistant secretary of labor for policy planning and research and collaborated that same year with sociologist Nathan Glazer (1923–) on the study Beyond the Melting Pot: The Negroes, Puerto Ricans, Jews, Italians, and Irish of New York City. Kennedy asked Moynihan to draft the government’s first antipoverty legislation; after Kennedy’s assassination, Moynihan stayed on with the project under President Lyndon Johnson (1908–1973). A year later Moynihan and others presented a draft of the Economic Opportunity Act of 1964, which led to the governmental programs known as the War on Poverty. Moynihan joined Johnson’s 1964 presidential campaign, writing his speeches and helping to develop the Democratic platform.

The Moynihan Report

In 1965 Moynihan’s department produced “The Negro Family: The Case for National Action.” Commonly known as the Moynihan Report, it urged the federal government to adopt a national policy for the reconstruction of the black family. The report focused on the way government policies affected the stability of urban black families, noting that black children—boys in particular—who grew up in fatherless homes did not do well, particularly if their problems were complicated by racial prejudice and poverty. “From the wild Irish slums of the nineteenth-century Eastern seaboard, to the riot-torn suburbs of Los Angeles, there is one unmistakable lesson in American history: A community that allows large numbers of young men to grow up in broken families, dominated by women, never acquiring any stable relationship to male authority, never acquiring any set of rational expectations about the future—that community asks for and gets chaos. Crime, violence, unrest, disorder…are not only to be expected, they are very near to inevitable,” explained Moynihan in a 1966 article in America magazine. The Moynihan Report, which was supposed to be confidential, generated controversy at the time; however, many policy analysts have since come to see Moynihan’s observations as prescient.

Later Career

After leaving the Johnson administration, Moynihan served for three years as director of the Joint Center for Urban Studies of Harvard University and the Massachusetts Institute of Technology. He then went to work for Republican President Richard Nixon (1913–1994) as an urban-affairs adviser. Moynihan returned to teaching at Harvard in 1970, but three years later accepted Nixon’s appointment as U.S. ambassador to India, where he served for two years. President Gerald Ford (1913–2006) then appointed him U.S. Permanent Representative to the United Nations.

In 1976 Moynihan was elected to the U.S. Senate from New York. A Democrat, he served four terms and became known for speaking his own mind. For example, he criticized the health-care plans proposed by Democratic President Bill Clinton (1946–) and surprised left-wing members of his party by supporting legislation to ban partial-birth abortion. In 2000 he decided not to run for re-election and retired to teach and write books. He wrote or edited eighteen books during his career and died following surgery in March 2003.

Richard M. Nixon

Richard Milhous Nixon (1913–1994), the thirty-seventh president of the United States, was the only president to resign from the office. His career is overshadowed by the Watergate scandal, which ended his political career. However, his presidency is also remembered for several important foreign policy decisions, including approval of a plan to bomb targets in Cambodia and Laos, which were neutral during the Vietnam War; reestablishment of relations with the People’s Republic of China; and treaty negotiations to end the arms race with the Soviet Union.

Early Years and Education

Nixon, the son of a citrus farmer, was born in Yorba Linda, California, and taught himself to read before he went to first grade. When he was nine, the family moved to Whittier, California, where they regularly attended meetings of the Religious Society of Friends. Also known as Quakers, the Society of Friends is a pacifist religious sect that stresses the importance of a personal relationship with God and the equality of all people. Nixon’s younger brother Arthur died of meningitis when Nixon was twelve, an experience that affected him profoundly. When Nixon graduated from high school he turned down a scholarship from Harvard because his parents could not afford the other expenses necessary to send him there. He saved money by attending Whittier College, graduating summa cum laude with a degree in government and history in 1934. Nixon then enrolled at Duke University Law School on a scholarship and graduated third in his class in 1937.

In 1942, with a recommendation from a former law school professor, Nixon went to work for the Office of Price Administration in Washington, D.C. When the Navy issued a call for lawyers during World War II, Nixon signed on and served in the Pacific, although he never saw combat. He resigned his commission as lieutenant commander in 1945 to run for Congress.

From Congress to the White House

Nixon first gained public attention during his work on the House Committee on Un-American Activities, where he helped build the case against Alger Hiss (1904–1996), a former high-ranking State Department official accused of spying for the Soviet Union; Hiss was ultimately convicted of perjury. Nixon repeatedly used the fear that Communists had infiltrated the government—it was the era of the Red Scare—to his political advantage, often claiming that his opponents had Communist sympathies. After two terms in the House, he was elected to the Senate. In 1952 he was chosen by Dwight D. Eisenhower (1890–1969) as his vice presidential running mate. They won and served two terms in the White House.

Nixon ran for president against John F. Kennedy (1917–1963), a Democratic senator from Massachusetts, in 1960. In the first-ever televised debate Nixon, who was recovering from an illness and declined makeup to lighten his skin and cover the stubble on his face, stood out in stark contrast to a rested, clean-shaven, and tanned Kennedy. Kennedy also appeared to many viewers to be confident and competent, allaying fears that he was too young and did not have as much experience as Nixon. Many historians have considered the televised debate to be the deciding moment in the close race: Nixon lost the election.

In 1962 Nixon returned to California, where he ran for governor and lost to Edmund G. “Pat” Brown Sr. (1905–1996). He thought the media had favored his opponent in their reporting, so the morning after the election he announced his “last press conference” and told the press, “[You] won’t have Dick Nixon to kick around anymore.”

In 1968 he surprised many by becoming a candidate for president again. With Maryland Governor Spiro Agnew (1918–1996) as his running mate, Nixon defeated the incumbent vice president, Democrat Hubert H. Humphrey (1911–1978), and the former Alabama governor, George C. Wallace (1919–1998), who ran as a third-party candidate.

President Nixon

Once in office Nixon initiated a program of “Vietnamization” that was designed to gradually withdraw U.S. troops from Vietnam. Their role was to be filled by South Vietnamese troops, bolstered by U.S. military supplies and aid. However, he also increased U.S.-led air attacks on North Vietnam and ordered secret bombing campaigns of North Vietnamese supply areas in politically neutral Laos and Cambodia. In 1970 those attacks led to public protest, which prompted Nixon to order government agencies to collect intelligence information on antiwar organizations and individuals who criticized his Vietnam tactics.

In February 1972 Nixon stunned his supporters, as well as the world, by reopening communications with the Communist leadership of the People’s Republic of China and making an official visit. Tension between China and the Soviet Union had grown over the previous decade, especially along their mutual border. Nixon and his national security adviser, Henry Kissinger (1923–), decided to improve relations with China to gain a strategic advantage over the Soviet Union. The “China card,” as it became known, increased pressure on the Soviets to improve relations with the West. In May Nixon became the first U.S. president to visit Moscow, where he and Soviet leader Leonid Brezhnev (1906–1982) signed the Strategic Arms Limitation Talks (SALT I) agreement, a treaty to halt the nuclear arms race. Nixon’s visit also led to joint scientific and space ventures and bilateral trade accords.

The Watergate Scandal

Nixon was reelected in 1972, defeating George S. McGovern (1922–), a Democratic senator from South Dakota. His second term was marked by the scandal that became known as Watergate, which involved illegal activities by Nixon and his aides during his re-election campaign. Five men, hired by Nixon’s campaign organization, the Committee to Reelect the President, were arrested for burglary on June 17, 1972, at the Watergate, an office-apartment-hotel complex in Washington, D.C., where they had been attempting to wiretap the headquarters of the Democratic National Committee. Shortly after the break-in, Nixon asked White House counsel John Dean (1938–) to oversee a cover-up of the administration’s involvement and authorized secret payments to the Watergate burglars to discourage them from talking. In February 1973 a special Senate committee, chaired by Democratic Senator Sam Ervin (1896–1985) of North Carolina, was set up to investigate the situation. By April several White House aides had resigned, and Dean had been fired. In May a special prosecutor for Watergate-related matters was appointed.

During the Watergate committee’s televised hearings, Dean linked Nixon to the cover-up, and staff members attested to illegal activities by both the Nixon administration and his campaign staff. On July 13, 1973, a former White House aide, Alexander Butterfield (1926–), revealed to committee investigators that Nixon routinely taped conversations in the Oval Office, in the White House cabinet room, in his office in the Executive Office Building, and on four of his personal telephones. Five days later Nixon had the taping system disconnected. The special prosecutor subpoenaed the tapes, but Nixon refused to surrender them, citing executive privilege. On July 24, 1974, the Supreme Court ordered Nixon to turn the tapes over to the special prosecutor. Three days later the House Judiciary Committee passed the first of three articles of impeachment against Nixon. On August 9 he resigned, and Vice President Gerald R. Ford (1913–2006) took office.

Citizen Nixon

One month after Nixon left office, Ford pardoned him for his role in the Watergate scandal. In retirement Nixon wrote books about foreign policy and his experiences in public life and traveled to Asia, Africa, Europe, and the Middle East. He went to the Soviet Union in 1986 to meet with Mikhail Gorbachev (1931–), then its leader, and was later credited for helping to bring Soviet leaders and the administration of Ronald Reagan (1911–2004) to their arms limitation agreement.

See also Watergate

See also United States v. Nixon

Warren Burger

Warren Earl Burger (1907–1995) was the fifteenth chief justice of the United States, serving from 1969 to 1986. Under his leadership the court delivered landmark decisions on school desegregation, obscenity, abortion, and religious freedom. It also hastened the resignation of President Richard Nixon (1913–1994) in 1974, when the court ordered Nixon to turn over audiotapes of White House conversations to the Watergate special prosecutor.

Early Years

After high school, Burger was offered a scholarship to Princeton University, but he declined it because it did not cover enough of his educational expenses. Instead, he took a job at an insurance company and attended classes at the University of Minnesota at night. After two years he transferred to St. Paul College of Law (later renamed William Mitchell College of Law) and graduated magna cum laude in 1931.

Burger joined a leading law firm and during the 1930s became active in Republican Party politics. He helped to organize the Minnesota Young Republicans and managed the successful campaign for governor of another Minnesota lawyer, Harold Stassen (1907–2001). While serving as floor manager for Stassen at the 1948 Republican National Convention, Burger was first introduced to Nixon, then a freshman congressman.

In 1953 Burger was appointed assistant attorney general of the United States by President Dwight D. Eisenhower (1890–1969). Before long Burger gained public notice for his successful prosecution of Greek shipowner Stavros Niarchos (1909–1996) and others for illegally buying surplus U.S. war vessels. Burger resigned this position in 1956, planning to return to his law practice in St. Paul, but Eisenhower appointed him instead to the U.S. Court of Appeals for the District of Columbia. “I never have had a passion to be a judge,” Burger once admitted to friends, but he accepted the position in part because he thought the East Coast climate was better for his wife’s health.

During the thirteen years Burger served on the appeals court, he was critical of a number of decisions regarding criminal procedure that were made by the Supreme Court under the leadership of Chief Justice Earl Warren (1891–1974). He attracted Nixon’s attention in 1967 when U.S. News and World Report reprinted a speech in which Burger insisted the U.S. criminal justice system needed to be changed because, he said, it was slanted toward the criminal. Nixon later told Burger that, while a candidate for president, he had used two points from the article in a campaign speech. Nixon was no fan of the Warren Court and was critical of it during his 1968 campaign. He thought justices should be strict constructionists—that is, jurists who believed in applying the text of the Constitution or law as written, inferring no additional meaning. Such a judicial philosophy opposes more interpretive approaches—for example, those that attempt to understand the original intentions of the authors of the Constitution or those that consider the social context of the decision. In 1969 Nixon nominated Burger to be chief justice when Warren retired. The honor both surprised and mystified Burger. “I hardly knew Nixon,” he admitted.

The Burger Court

Part of Burger’s influence on the court was administrative. For example, he added staff librarians, clerks, and administrative assistants, and he upgraded the court’s law library and technology. He established the Supreme Court Fellows Program, through which individuals from various professional and academic backgrounds work and study at the Supreme Court to see firsthand how the federal judiciary operates and how it relates to the other branches of government. Burger also helped develop the Federal Judicial Center, which is the education and research agency for the federal courts, and the National Center for State Courts, which provides research, education, and publications for lower courts. Just as important, he successfully lobbied Congress to limit the court’s case docket, for he believed the court was overloaded.

During his seventeen years as chief justice Burger wrote 265 majority opinions in addition to dissenting or concurring opinions. As a Nixon appointee he was expected to follow a strict constructionist approach to constitutional law and guide the court in that direction. Social conservatives had been critical of the Warren court for its judicial activism—using the court’s authority to extend or effect laws—and expected Burger to reverse this trend. However, the Burger court implemented a pragmatic, case-by-case approach rather than one adhering strictly to a single judicial philosophy.

Many of the cases decided by the Burger Court became landmarks. It upheld the 1966 Miranda v. Arizona decision, which held that individuals under arrest must be informed of their right to an attorney and their right to remain silent, and that nothing they say can be admitted in court or used against them unless they have waived those rights. Burger wrote the unanimous opinion for Swann v. Charlotte-Mecklenburg Board of Education (1971), which upheld the use of busing or redistricting to integrate public schools, and for Wisconsin v. Yoder (1972), which defended freedom of religion and refused to force Amish parents to send their children to public schools. In Lemon v. Kurtzman (1971), which established the basic standard that courts should use to ensure the separation of church and state, Burger devised a three-part constitutional test: The government’s action had to have a legitimate secular purpose; the action could not have the primary effect of either advancing or inhibiting religion; and it could not result in what Burger called “excessive government entanglement” with religion. He delivered the defining opinion in Miller v. California (1973), in which he said that local “contemporary community standards” must decide what is considered obscene for each community. He also voted with the majority in Roe v. Wade (1973), which said that most laws against abortion violated a constitutional right to privacy under the Due Process Clause of the Fourteenth Amendment.

Perhaps most memorably, in United States v. Nixon (1974), Burger showed a loyalty to constitutional law instead of to the president who had appointed him. On behalf of a unanimous court, he delivered the decision that ruled against President Nixon, who had insisted that executive privilege gave him the right to maintain the confidentiality of audiotaped recordings of White House conversations. A special prosecutor who was investigating the Watergate scandal wanted the tapes for evidence of criminal activity among Nixon and his associates, and the president was ordered to turn over the recordings “forthwith.” Nixon admitted that he was “disappointed” by the decision, but said he would comply. He resigned sixteen days later. Burger retired from the Supreme Court in 1986.

See also United States v. Nixon

See also Watergate

Gloria Steinem

Gloria Steinem (1934–) was one of the most important leaders of the women’s movement in the late twentieth century. She cofounded Ms. magazine in 1972 and helped establish numerous feminist organizations beginning in the 1960s. A prominent media figure and public speaker, Steinem tirelessly lobbied for equality for women and for political action on the social and legal issues that most affect women’s lives.

Early Life

Steinem is the granddaughter of suffragist Pauline Perlmutter Steinem (1863–1940), who once addressed Congress about women’s right to vote. Steinem’s early years were spent traveling the country in a mobile home while her father bought and sold antiques. Because the family never stayed long in one place, she was home schooled by her mother, who had a teaching certificate. When Steinem graduated from Smith College in 1956 with a degree in government, she embarked on a fellowship for two years of study in India.

Career as a Journalist

In 1960 Steinem moved to New York City, where she worked as a freelance journalist writing for newspapers and magazines. An assignment from Show magazine was her first major investigative-reporting job. Armed with a diary, she applied incognito for a position as a “bunny,” or waitress, at the New York Playboy Club. After working there for three weeks, she produced a lengthy article that exposed the degrading treatment and poor wages of the young women hired to work as waitresses in a luxury club for wealthy men. Initially the story resulted in Steinem’s not being taken seriously as a journalist, but it did help to improve the working conditions for the women at the club. In 1984 the story was turned into a film for television, A Bunny’s Tale. Years later, after watching a rerun of the show, Steinem said she finally stopped regretting writing the article and “began to take pleasure in the connections it made with women who might not have picked up a feminist book or magazine, but who responded to the rare sight of realistic working conditions and a group of women who supported each other.”

In 1968 Steinem became a founding member of New York magazine, where she worked as a political columnist until 1972. She covered a meeting of a feminist group and listened as women shared their experiences of having abortions, which at that time were in almost all cases illegal. “I heard women standing up and saying in public and taking seriously something that only affected women. Up until then, I’d never heard that. If it didn’t also affect men, it couldn’t be serious,” she said. For some time Steinem had been politically active, working in political, civil-rights, and peace campaigns, but the women’s event, she said, “was the beginning of the unraveling.” Steinem turned her focus to women’s causes.

Steinem, who is white, began speaking across the country with Dorothy Pitman Hughes (1938–), an African-American childcare activist, and later Florynce Kennedy (1916–2000), an African-American lawyer, and Margaret Sloan (1947–2004), an African-American civil-rights activist. “It was a time when even one feminist speaker was a novelty, and interracial teams of feminists seemed to be unheard of since the days of Sojourner Truth,” said Steinem. The activists found dialogue with their audiences after their talks to be the most important. Women spoke honestly about their experiences, and both men and women in attendance heard their stories. “Most of all,” said Steinem, “women in those audiences discovered they were neither crazy nor alone. And so did we.”

Ms. Magazine

In 1971 Steinem helped start Ms. magazine, which began as an insert in New York magazine. When the magazine was launched in 1972, most magazines geared to women had articles about marriage, fashion, and beauty. Ms. had articles that challenged prevailing attitudes about women and their roles, examined stereotypes in child rearing, and questioned sexist language. It was intended, Steinem said, to be “a friend that comes into your house and is something that really tells the truth about your life and will help change life instead of just offering an escape.” Ms. was the first mainstream national magazine to be completely run and managed by women. The three hundred thousand copies of the premiere issue sold out in eight days and elicited twenty-six thousand subscription orders.

A Legacy of Organizations

In 1971 Steinem, along with Shirley Chisholm (1924–2005), Bella Abzug (1920–1998), Myrlie Evers (1933–), and Betty Friedan (1921–2006), formed the National Women’s Political Caucus, an organization dedicated to increasing women’s participation in politics. Steinem helped found other organizations, including Voters for Choice, the Women’s Action Alliance, the Coalition of Labor Union Women, and the Women’s Media Center. She also participated in the creation of Take Our Daughters to Work Day in 1993, an effort to expose young women to the realities of the workplace and to help them determine future career goals.

Over the course of her career Steinem produced several books, including Outrageous Acts and Everyday Rebellions (1983) and Revolution from Within: A Book of Self-Esteem (1992). In 1993 she was inducted into the National Women’s Hall of Fame.

On September 3, 2000, when she was sixty-six, she surprised many observers by marrying David Bale (1941–2003), a South African pilot and entrepreneur, who had once been banned from South Africa because of his antiapartheid work. Steinem herself was astonished by the public reaction to her marriage: “What surprised me was that no one saw how much marriage has changed since the ’60s, when I would have had to give up most of my civil rights.” In the ceremony, the word partners was used instead of husband and wife. Bale died three years later from brain lymphoma.

Gerald R. Ford

Gerald Rudolph Ford (1913–2006) was the thirty-eighth president of the United States and the first person to become president without having been elected either vice president or president. Ford had aspired to be speaker of the House; instead, as scandals caused the removal of first the vice president and then the president from office, Ford assumed the executive office in August 1974.

Early Years and Career

Ford was born Leslie Lynch King Jr. in Omaha, Nebraska. Two weeks later his mother separated from her husband and moved to Grand Rapids, Michigan, to live with her parents. She divorced King, and two years later married Gerald R. Ford, a paint salesman. The Fords began calling her son Gerald Jr., but his name was not legally changed until 1935. When he was thirteen Ford learned that Gerald Sr. was not his biological father, and when he was seventeen he finally met King, who casually stopped by on a road trip to Detroit.

Ford attended the University of Michigan, where he played center on the football team and was voted Most Valuable Player during the 1934 season. In 1935 he played in the East-West college game in San Francisco and the Chicago Charities College All-Star game against the Chicago Bears of the National Football League. He graduated with a bachelor’s degree in economics and political science in 1935. The Detroit Lions and the Green Bay Packers offered him professional football contracts, but Ford declined both in order to take a coaching job at Yale, where he hoped to attend law school. Initially denied admission to Yale Law School because of his full-time coaching jobs, Ford was accepted in the spring of 1938. He worked on the presidential campaign of Republican Wendell Willkie (1892–1944) in 1940.

Ford returned to Grand Rapids after his graduation from law school in 1941 and opened a practice with Philip A. Buchen, a friend from his time at the University of Michigan. Ford also became active in local politics and helped launch a reform group of like-minded Republicans, called the Home Front, who were opposed to a local political boss.

Military Career

When the United States entered World War II, Ford enlisted in the Navy and was sent first to the naval academy in Annapolis, Maryland, as a physical training instructor and later to Chapel Hill, North Carolina, as an athletic training officer. He served on the carrier USS Monterey as a gunnery officer and athletic officer, later becoming an assistant navigator. During Ford’s tour of duty the ship was involved in operations in the Pacific theater, including the Battle of Makin in November 1943 and the Battle of the Philippine Sea in June 1944.

A Run for Congress

Encouraged by his associates in the Home Front; by his stepfather, who was county Republican Party chair; and by Senator Arthur Vandenberg (1884–1951), Ford ran for the House of Representatives in 1948 and won. He was reelected twelve times. During his first term he, along with former Navy lieutenant commander Richard Nixon (1913–1994) and a number of other young House Republicans, opposed monthly bonuses for war veterans, which they deemed too costly. In 1951 Ford was appointed to the House Appropriations Committee, where he served for many years. By 1961 he had become the ranking Republican on the Defense Appropriations Subcommittee.

The Warren Commission

On November 29, 1963, one week after the assassination of President John F. Kennedy (1917–1963), Ford was appointed by President Lyndon Johnson (1908–1973) to a commission to investigate Kennedy’s murder. In their report the seven members of the commission—which became known as the Warren Commission after its chair, Chief Justice Earl Warren (1891–1974)—concluded that Lee Harvey Oswald (1939–1963) acted alone in killing Kennedy and that there was no evidence of a conspiracy in the assassination. They proposed the strengthening of Secret Service protection for the president and legislation that would make it a federal offense to kill either the president or vice president.

Rise to the Presidency

In January 1965 Ford was elected House minority leader, the top position among Republicans in the House of Representatives. He held this position until 1973, when he was appointed by Nixon to replace Vice President Spiro Agnew (1918–1996). Agnew had resigned in a scandal that emanated from his earlier years as governor of Maryland. Ford was the first vice president nominated under the Twenty-fifth Amendment to the Constitution, which clarified procedures for filling vacancies in the offices of vice president and president. When Nixon resigned the following year as a result of the Watergate scandal, Ford became president.

The Chief Executive

On August 9, 1974, Ford took the oath of office, acknowledging, “I have not sought this enormous responsibility, but I will not shirk it.” In a reference to Watergate, he said, “My fellow Americans, our long national nightmare is over. Our Constitution works; our great Republic is a government of laws and not of men.… As we bind up the internal wounds of Watergate, more painful and more poisonous than those of foreign wars, let us restore the golden rule to our political process, and let brotherly love purge our hearts of suspicion and of hate.” He selected former New York governor Nelson Rockefeller (1908–1979) to be vice president.

Less than a month later Ford granted a full pardon to Nixon, which resulted in public outcry and accusations of a prearranged deal. Ford maintained that no arrangement had been made and that the pardon was the right thing to do for the country. He feared that ongoing legal proceedings against Nixon would intensify partisan rancor, impede progress on other issues, and damage U.S. credibility abroad. Ford wanted the Watergate era to end. In announcing Nixon’s pardon, he declared “My conscience tells me that only I, as President, have the constitutional power to firmly shut and seal this book. My conscience tells me it is my duty, not merely to proclaim domestic tranquility but to use every means that I have to insure it.”

During his administration Ford issued limited amnesty for men who had evaded the draft during the Vietnam War. Under Ford’s plan, those who had resisted service were to be pardoned in exchange for two years of civilian service. In 1975 he announced the Vietnam War was “finished as far as America is concerned” and ordered the evacuation of U.S. personnel and high-risk South Vietnamese officials from Saigon. Soon afterward, in April 1975, Saigon fell to Communist forces.

Ford signed the Federal Election Campaign Act Amendments of 1974, which placed limits on political campaign contributions; the Privacy Act of 1974, which prohibited the unauthorized release by federal agencies of information about individual citizens; and the Government in the Sunshine Act, which required that meetings of government agencies be open to public view. Ford vetoed the Freedom of Information Act Amendments, which clarified procedures for public access to information from government agencies, but Congress overrode his veto and the bill became law.

In 1975 two assassination attempts were made on Ford’s life, the first by Lynette “Squeaky” Fromme (1948–), a follower of cult leader Charles Manson (1934–), and the second, only a few weeks later, by Sara Jane Moore (1930–), a member of left-wing radical groups. He was not injured.

Ford was defeated by Jimmy Carter (1924–) in the 1976 presidential election. Many believe he lost because he had pardoned Nixon. However, Carter acknowledged Ford’s role in getting the country past the Watergate scandal when he thanked Ford in his inaugural address “for all he has done to heal our land.” In 2001 the John F. Kennedy Library Foundation presented Ford its Profile in Courage Award for his pardon of Nixon, which “ended the national trauma of Watergate,” according to Caroline Kennedy Schlossberg (1957–), the president’s daughter. In doing so, she said, “he placed his love of country ahead of his own political future.”

Jimmy Carter

James Earl “Jimmy” Carter Jr. (1924–), the thirty-ninth president of the United States and former governor of Georgia, is a human-rights activist and recipient of the Nobel Peace Prize for his accomplishments after leaving the White House. When he was elected president in 1976, he became the first chief executive from the Deep South since the election of Zachary Taylor (1784–1850) in 1848.

Early Years and Career

Carter was born in Plains, Georgia, into a strongly religious family devoted to evangelical Christianity and Democratic politics. He graduated from the local public school in 1941 and attended Georgia Southwestern College and Georgia Institute of Technology before being accepted to the U.S. Naval Academy. Ranked in the top 10 percent of his class, he received a bachelor’s degree in engineering in 1946, then went to submarine officer training school. He later served in both Atlantic and Pacific fleets and ultimately achieved the rank of lieutenant. Under Admiral Hyman Rickover (1900–1986) Carter was assigned to the precommissioning crew of the USS Seawolf, one of the nation’s first nuclear submarines.

From State Senator to Governor

In 1953 Carter resigned his naval commission and took over the family peanut farm after the death of his father, who had served in the Georgia legislature in the year before his death. Carter began serving on local governing boards that oversaw education and hospital management and decided to run for a seat in the state senate. He lost the primary by 139 votes, but was declared the winner after a recount that exposed corruption and racial discrimination in election administration. He subsequently won the seat and was sworn in as a Georgia state senator in 1963.

Three years later Carter joined the race for the governor’s office. He lost the Democratic nomination to Lester Maddox (1915–2003), a restaurant owner and segregationist, who later won the general election. In response to this political disappointment, Carter turned to religious works. He served as a missionary in Pennsylvania and Massachusetts, taught Sunday school in Plains, and gave talks about Christianity.

In 1970 Carter again ran for governor, and a contentious debate on racial segregation raged throughout the campaign. Because he refused to condemn such segregationists as Governor George C. Wallace (1919–1998) of Alabama during his campaign appearances, voters inferred that Carter approved of policies that maintained separation of the races. Carter won the election. At his inauguration, the newly elected governor startled his constituents and garnered national attention when he declared, “[The] time for racial discrimination is over.… No poor, rural, weak, or black person should ever have to bear the additional burden of being deprived of the opportunity of an education, a job, or simple justice.” Undeterred by criticism, Carter appointed blacks and women to state government positions and had portraits of Martin Luther King Jr. (1929–1968) and other notable black Georgians hung in the state capitol. Time magazine put him on its May 31, 1971, cover with the headline, “Dixie Whistles a Different Tune.” Carter created biracial groups to manage racial tension, promoted prison reform, and developed new programs in health care and education. He instituted a governmental reorganization plan that combined some three hundred state agencies into twenty-two.

Two years into his term Carter decided to run for president. While he had a network of Democratic Party leaders throughout the country, he still was not widely known to the vast majority of voters. With the words “Hi! My name is Jimmy Carter, and I’m going to be your next president” and his characteristic toothy grin, he achieved recognition. During the campaign he focused on his character, presenting himself as an honest man and not a Washington insider. He chose Minnesota Senator Walter F. Mondale (1928–) as his vice presidential running mate, and together they defeated incumbent President Gerald Ford (1913–2006) in the 1976 election. On Inauguration Day, in keeping with his folksy, approachable manner, Carter startled everyone by getting out of the presidential limousine and walking with his wife and daughter along Pennsylvania Avenue to the White House.

The Presidency

The day after his inauguration Carter extended conditional amnesty to Vietnam draft resisters who had either not registered or had fled the country. The pardon, however, excluded deserters and soldiers who received dishonorable discharges. Response was mixed, mostly negative from veterans’ groups, but, as Carter said later, “I thought the best thing to do was to pardon them and get the Vietnam War behind us.” During his presidency Carter increased the number of women, blacks, and Hispanics appointed to government positions. He taught Sunday school classes in Washington, D.C., and sent his daughter to public school.

As an administrator he divided the Department of Health, Education, and Welfare into the Department of Health and Human Services and the Department of Education, and he created the Department of Energy. When a nuclear reactor at the Three Mile Island power plant in Pennsylvania experienced a partial meltdown, Carter personally toured the damaged plant in an effort to calm public anxiety. One month later, however, some sixty-five thousand demonstrators marched in Washington, demanding all U.S. nuclear plants be shut down. Carter held firm, saying that such a policy was “out of the question.”

Many of Carter’s most significant accomplishments were in foreign policy. In 1978 he invited Israeli Prime Minister Menachem Begin (1913–1992) and Egyptian President Anwar Sadat (1918–1981) to meet with him at Camp David, the presidential retreat in Maryland, where he brokered a peace accord between the two countries. A year later he watched as they signed the Egypt-Israel Peace Treaty at the White House. He instituted full diplomatic relations with the People’s Republic of China, building on the work begun during the presidency of Richard M. Nixon (1913–1994), and obtained ratification of the Panama Canal treaty, which established a timetable for transferring the canal to Panamanian sovereignty. He also completed negotiation of the second Strategic Arms Limitation Treaty (SALT II), which he signed with Soviet leader Leonid Brezhnev (1906–1982); however, the Soviets’ invasion of Afghanistan in 1979 forced Carter to ask the Senate to table the treaty.

The Iran Hostage Crisis

Carter’s foreign policy achievements are often overshadowed by a crisis that developed in 1979 after he allowed Mohammad Reza Pahlavi (1919–1980), the shah of Iran, to enter the United States for medical treatment. The shah had fled Iran after a year of public unrest led by Muslim cleric Ayatollah Ruhollah Khomeini (1902–1989) in response to alleged human rights abuses during Pahlavi’s regime. Angered by Carter’s perceived support of the shah, militant Iranian students overtook the U.S. embassy in Tehran on November 4, 1979, seizing sixty-three hostages and demanding that the shah be returned to Iran for trial. Carter froze all Iranian financial assets in the United States. A few days later the Iranians freed thirteen African-American and female hostages, but continued to hold the others. On April 24, 1980, a U.S. military mission to rescue the hostages was aborted after helicopter failures; during the withdrawal a helicopter collided with a transport plane, killing eight servicemen and injuring three. Days before, Secretary of State Cyrus Vance (1917–2002) had resigned over the decision to proceed with the mission. On July 11 the Iranians released one more hostage, who had developed a serious medical condition. On July 27 the shah died. Two months later the Iranian government indicated it was willing to discuss releasing the hostages.

Meanwhile, former California governor Ronald Reagan (1911–2004) was running against Carter for president. On November 2, the Iranian parliament issued a statement saying the hostages would not be released before the election. Two days later Reagan won a landslide victory. Minutes after Reagan was inaugurated in January 1981, the Iranians released the hostages. On behalf of Reagan, Carter flew to Germany to meet the freed hostages. Although the ordeal had ended, many critics blamed Carter’s perceived inaction and lack of resolve for both prolonging the hostages’ suffering and ending his own political career.

Human Rights Activism

After his presidency Carter established the Carter Center in Atlanta, Georgia, with the goal of advancing human rights and alleviating human suffering internationally. It offers a variety of programs to resolve conflicts and fight diseases in many parts of the world. He and his wife also became involved in Habitat for Humanity, an interfaith organization that helps people of limited means build their own homes. Each year the couple devotes one week to construction projects in the United States and elsewhere. In 2002 Carter was awarded the Nobel Peace Prize for “his decades of untiring efforts to find peaceful solutions to international conflict.”

See also The Iran Hostage Crisis

Political Parties, Platforms, and Key Issues

Polling and Public Opinion

Public opinion polling is used to assess the views of a specific segment of the population—or of the entire nation—by asking questions of a representative sample of people. Marketers use polling to sell services and products; politicians use it to understand the will of the people and to sell themselves and their policies to the electorate at large.

How Polls Are Conducted

In the earliest days of public opinion polling—believed to date to the 1824 presidential election between Andrew Jackson (1767–1845) and John Quincy Adams (1767–1848)—the survey method was the “straw poll”: asking a group of people gathered in a particular place to indicate their views with a show of hands, then counting and comparing the votes. Straw polls are not scientific, and they are generally not representative of overall public opinion.

Later polling methods came to include mailings, in which one or more questions are sent to a selection of addresses and the results are tallied from the responses received. While these surveys can be sent to thousands of people at once, it is difficult to determine whether the results represent a true sampling of opinion. One potential problem, for example, is that the only people who respond may be those willing to spend the time and pay the postage.

Public opinion polls can also be conducted face to face (either on the street or at people’s homes), by telephone, or, since the late 1990s, via e-mail or the Internet.

The Gallup Methodology

Although there are many outlets through which to conduct an opinion poll, only those that survey what has come to be called “a scientific sample” of the population are considered accurate. In the 1930s a sociologist named George Gallup (1901–1984) devised a polling method in which a small but scientifically selected sample of individuals could accurately represent the population at large. For example, if it is known that 51 percent of the adult population in the United States is female and 49 percent is male, rather than survey every man and woman in the country, Gallup could survey just 1,000 people, making sure that 510 of the participants were female and 490 male. The results of Gallup’s small survey could then be extrapolated to represent the thinking of the citizenry at large.

Polls based on such scientific samples are subject to a “sampling error,” or estimated level of imprecision; for example, a result may be considered accurate to within plus or minus three percentage points. What no polling method can control is the “bias” effect: there may be people whom polling simply cannot reach, such as individuals who will not accept phone calls from strangers. Another variable, “response bias,” acknowledges that some respondents do not tell pollsters their true beliefs—perhaps out of embarrassment, fear, or impatience, or because the questions were manipulative, written in a way meant to elicit a certain response.
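
The arithmetic behind a sampling-error figure can be made concrete. The short Python sketch below (an illustration of the standard margin-of-error formula, not any polling organization’s actual software) shows why a sample of about 1,000 respondents is typically reported as accurate to within plus or minus three percentage points, and why shrinking that margin is costly: quadrupling the sample only halves the error.

    import math

    def margin_of_error(sample_size, proportion=0.5, z=1.96):
        # 95 percent margin of error for a simple random sample.
        # proportion=0.5 is the worst (widest) case; z=1.96 is the
        # critical value for 95 percent confidence.
        return z * math.sqrt(proportion * (1 - proportion) / sample_size)

    print(f"{margin_of_error(1000):.1%}")  # 3.1% -- about plus or minus 3 points
    print(f"{margin_of_error(4000):.1%}")  # 1.5% -- four times the cost, half the error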

George Gallup founded the American Institute of Public Opinion in 1935. Since then, his polling methodology has dominated marketing and political polling. Polling organizations similar to Gallup’s include the Roper Poll, the Harris Poll, and the National Opinion Research Center.

Political Polling

Abraham Lincoln (1809–1865) once said, “What I want to get done is what the people desire to have done, and the question for me is how to find that out exactly.” Since the 1930s and the fine-tuning of polling techniques by George Gallup, U.S. presidents and presidential candidates have used public opinion polls to take the pulse of the country. While both Franklin D. Roosevelt (1882–1945) and Dwight D. Eisenhower (1890–1969) sought the assistance of public opinion polling during their campaigns and administrations, Harry Truman (1884–1972) did not. John F. Kennedy (1917–1963) was the first presidential candidate to rely heavily on polling during the campaign, and every presidential candidate since has included pollsters as members of the campaign team. Presidents Lyndon Johnson (1908–1973), Richard Nixon (1913–1994), Jimmy Carter (1924–), and George H. W. Bush (1924–) reached out to pollsters during their presidencies, and Ronald Reagan (1911–2004), Bill Clinton (1946–), and George W. Bush (1946–) routinely used polling as a tool before and during their respective two terms in the White House.

Media polls conducted by news organizations and nonpartisan polling agencies such as Gallup and Roper are generally considered to provide fair and accurate assessments of the subject at hand, but beginning with Kennedy, presidents also hired their own pollsters. Such private polling has become increasingly political and partisan, with public opinion polls specifically commissioned and scripted to support certain policies and points of view.

Current Events and Social Movements

The Black Panther Party

The Black Panther Party was a civil rights organization that espoused black nationalism and, if necessary, violent revolution as a means for African-Americans to achieve liberation from white oppression. Strongly socialist in nature, the Black Panthers sought to end police brutality, reduce the number of African-American men incarcerated in American prisons, provide meals and education for African-American children, and, ultimately, create a separate society for blacks in the United States.

The Origins of the Black Panther Party

Originally called the Black Panther Party for Self-Defense, the Black Panthers were founded by Huey P. Newton (1942–1989) and Bobby Seale (1936–) in Oakland, California, in 1966, after the riots in the impoverished Watts neighborhood of Los Angeles and the assassination of former Nation of Islam leader Malcolm X (1925–1965). Although by 1965 the federal government had codified equal rights and protections for people of all races into the law, actual improvements in the lives of poor African-Americans were minimal, if not nonexistent. To the Panthers, the guarantees of equality secured by the Civil Rights Act of 1964 were inadequate—especially when considered in the context of the country’s long history of slavery, segregationist laws, and tolerance for the murderous actions of white supremacists in the Ku Klux Klan.

The Black Panther Movement

Police brutality against African-Americans was the specific impetus for the organization of the Black Panthers. The Panthers considered the ghettos where poor African-Americans lived to be “occupied territories” brutalized by a racist government. Hence, they believed that jailed black men were “prisoners of war” and that convictions made by white juries were invalid because the defendants had not been tried by a jury of their peers. The Black Panther Party argued that African-Americans should not be drafted to serve in Vietnam because both African-Americans and the people of Vietnam were at war with the U.S. government.

Unlike the more mainstream activists of the civil rights movement—the National Association for the Advancement of Colored People (NAACP) and the Southern Christian Leadership Conference (SCLC), headed by the Reverend Martin Luther King Jr. (1929–1968)—the Black Panther Party did not seek equality and integration with whites as its ultimate goal. Nor did the Panthers’ founders accept King’s peaceful resistance or the NAACP’s strategy of securing equality and a guarantee of constitutional rights through the federal courts. Instead, the Black Panthers advocated violence, revolution, and armed resistance in order to gain complete black independence from whites or any other authority. Whereas King and other Southern civil rights activists dressed in business attire, the Panthers’ wardrobe consisted of black leather jackets and black berets, often accessorized with shotguns. There were frequent confrontations between the Panthers and law enforcement officials, resulting in shoot-outs, deaths on both sides, and the arrests of Panther members.

In the mid-1970s a rift developed between Black Panther members who wanted to scale back the violence and expand the party’s mission to serve all oppressed people and those who wanted to maintain the “Black Power” focus. But the Panthers’ stated mission did expand to include community service work, in part to counter the group’s reputation as a purely paramilitary organization. The Panthers started a free breakfast program for schoolchildren, founded free health clinics, distributed clothing to the needy, and established transportation programs for the elderly and the families of prisoners.

Public Enemy #1

Critics alleged that the goods and money distributed by the Panthers were obtained by intimidating local businesses and individuals. The Panthers maintained that the Federal Bureau of Investigation (FBI) harassed the service programs’ volunteers and vandalized their facilities in order to undermine the party’s good works. In the late 1960s the FBI labeled the Black Panther Party “Public Enemy #1.” The Panthers were the primary concern of the agency’s counterintelligence program, commonly referred to as COINTELPRO. FBI Director J. Edgar Hoover (1895–1972) even declared the Black Panthers to be “the greatest threat to the internal security of the United States.”

In 1967 Huey P. Newton was shot in a gun battle with police and charged with manslaughter in the death of a white police officer. He was convicted and served two years before his conviction was overturned on appeal. Two retrials ended in hung juries. Bobby Seale was charged with conspiring to violently disrupt the 1968 Democratic National Convention in Chicago; his conviction in that case was later overturned. Both men resigned from the Black Panther Party in 1974. In 1980 Newton earned a Ph.D. in social philosophy from the University of California at Santa Cruz. He continued to have run-ins with the law, including prison time, before he was fatally shot in Oakland in 1989 at the age of forty-seven. The Dr. Huey P. Newton Foundation keeps his name and the Black Panther history alive.

Lasting Impact

By the late 1970s the party’s infighting and the FBI’s pursuit of its members had weakened the organization. The Panthers are considered to have fully disbanded in 1982. Although the group’s official membership is believed to have never exceeded five thousand, the Black Panthers had chapters around the country and achieved worldwide fame. Their efforts inspired militant movements among other minority groups, including Mexican Americans in Southern California (who formed the Brown Berets), Chinese Americans in San Francisco (the Red Guards), and even a band of disgruntled senior citizens (the Gray Panthers). The Black Panther Party remains a symbol of Black Power and the counterculture movements of the 1960s.

Black Panther Party for Self-Defense: The Ten-Point Plan

Echoing the prose of Thomas Jefferson (1743–1826) in the Declaration of Independence, in 1966 Huey P. Newton (1942–1989) and Bobby Seale (1936–) issued the Black Panther Party’s manifesto, which they titled The Ten-Point Plan. In 1972 the points were amended to expand the Panthers’ mission to benefit “all oppressed people.”

The cofounders of the Black Panther Party devised the plan while sitting in an Oakland, California, jail after an altercation with police. Their final text was a combination of lofty and tangible demands. Great emphasis was placed on the principles of self-determination, liberty, peace, and justice, and on an “immediate end to police brutality” and “all wars of aggression.” In addition, the Panthers called for full employment, decent housing, food, education, and “completely free health care” for all oppressed people inside the United States. They wanted blacks to be exempt from military service and called for the release of “all black and oppressed people now held in U.S. federal, state, city, and military prisons and jails.”

Speaking to a college audience in 2006, four decades after issuing the Ten-Point Plan, Seale commented: “A lot of people thought the Black Panther Party started because we wanted to be macho with some guns, but we were readers and researchers.… We not only captured the imagination of the African-American community, we captured the imagination of everybody.”

The Environmental Movement

Concerns about the environment began to attract attention in the 1960s, but the year 1970 marked a turning point in the public’s awareness of pollution and other environmental hazards. This acknowledgment of the problem, coupled with a growing energy crisis, gave birth to the environmental movement.

Early Warnings

In 1962 biologist Rachel Carson (1907–1964) published Silent Spring, a book about the potentially deadly effects of chemical fertilizers and pesticides on animals, plants, and humans. Carson’s study alerted readers to the possibility that even eating a salad could have fatal consequences. The book became an unexpected bestseller and inspired President John F. Kennedy (1917–1963) to order an investigation into the author’s claims. In May 1963 Kennedy’s Science Advisory Committee supported Carson’s findings. These efforts eventually led to a ban on DDT and other chemicals, as well as to legislation establishing auto emission reduction efforts and standards. For the first time, federal, state, and local governments were addressing environmental problems and concerns.

Environmentalism Takes Off

The alarm raised by Carson’s Silent Spring was only the first of many environmental concerns of the time. One tactic used by the United States in the Vietnam War was the dropping of herbicides from airplanes to defoliate the jungles in which the Vietcong and North Vietnamese were hiding. These chemicals—particularly the most notorious one, known as Agent Orange—degraded over time and released toxins into the environment that are believed to have caused cancers and genetic defects both in the populations of the region and in American war veterans who came into contact with them. Much was also made of the assertion that the United States, while comprising only 6 percent of the world’s population, consumed more than 30 percent of its resources.

At the time of the first Earth Day, on April 22, 1970, many of America’s rivers and lakes were so polluted they were considered to be dying. Smog caused by industry and automobiles plagued cities. The bald eagle was on the verge of extinction, and communities such as Love Canal, in upstate New York’s Niagara Falls region, were discovering that their homes had been built above toxic waste buried decades earlier, resulting in severe birth defects in many residents’ children. The Love Canal crisis would lead to the 1980 passage of the Comprehensive Environmental Response, Compensation, and Liability Act, commonly called the Superfund Act.

Although largely hobbled by the quagmire of the Vietnam War and a stumbling economy, President Richard M. Nixon (1913–1994) responded to the rising environmental movement with numerous initiatives, the most significant of which was the establishment, in December 1970, of the Environmental Protection Agency (EPA), an independent federal agency charged with promoting, safeguarding, and enforcing environmental protection.

Less than two weeks later, the EPA’s first administrator, William D. Ruckelshaus (1932–), told the mayors of the heavily polluted cities of Cleveland, Detroit, and Atlanta that if they did not come into compliance with water regulations, the EPA would take court action. Also that December, Congress passed the Clean Air Act of 1970, which brought substantive changes to the federal air quality program. The law set statutory deadlines for reducing automobile emission levels: 90 percent reductions in hydrocarbon and carbon monoxide levels by 1975 and a 90 percent reduction in nitrogen oxides by 1976.

The Movement’s Lasting Impact

Because environmentalism is a cause that has extreme supporters and equally extreme detractors, the EPA has often been in the middle of a tug-of-war, with the health and safety concerns of the citizenry and science communities on one side, and the corporate world and labor unions on the other. Looking back, the EPA’s Ruckelshaus commented that although the EPA was to be the government’s environmental advocate, the agency was typically forced into the position of a mediator “caught between two irresistible forces. [There] was one group, the environmental movement, pushing very hard to get emissions down no matter where they were—air, water, no matter what—almost regardless of the seriousness of emissions. There was another group … pushing just as hard in the other direction and trying to stop all that stuff, again almost regardless of the seriousness of the problem.”

Some presidential administrations have strengthened the EPA’s authority, while others have diminished its credibility and impact. Economic issues, such as the energy crises in the 1970s, pitted the pursuit of nuclear energy and drilling for oil in Alaska against public fears about the safety of nuclear power and the objections of environmentalists concerned about destroying natural habitats.

Environmental legislation flowed steadily from the Nixon administration and those of his successors Gerald R. Ford (1913–2006) and Jimmy Carter (1924–). In addition to establishing regulations and standards for safety and cleanup, federal environmental efforts also included land preservation, whereby millions of acres throughout the United States were set aside as protected lands, never to be disturbed, exploited, or developed. In 1980 the Alaska National Interest Lands Conservation Act permanently preserved more than one hundred million acres of wild lands.

According to the EPA, between 1970 and 2004 total emissions of the six major air pollutants in the United States dropped by 54 percent. During the same period, through land restoration efforts, six hundred thousand acres of contaminated land were reclaimed. Although many environmental hazards and energy challenges remain (chief among them global warming and dwindling oil supplies), since the celebration of the first Earth Day, twice as many rivers and lakes are safe for fishing and swimming today as in 1970. Drinking water is safer and auto emissions and ozone-damaging gases have been reduced. Natural lands have been protected, toxic waste sites have been cleaned up, and the bald eagle has been removed from the Endangered Species List.

Watergate

A bungled 1972 break-in at the Democratic Party’s national headquarters in the Watergate office complex in Washington, D.C., led to the resignation two years later of U.S. President Richard M. Nixon (1913–1994). The break-in, which occurred five months before Nixon’s landslide victory for a second term, had been carried out by operatives connected to his administration and his reelection committee. The scandal became a textbook example of how “the cover-up is worse than the crime.” Watergate undermined the nation’s trust in its leaders and raised lasting debates about the Constitution and the powers of the presidency. Its repercussions continued into the twenty-first century.

The Break-in and the Cover-up

At 2:30 a.m. on June 17, 1972, police caught five men attempting to break into and wiretap the offices of the Democratic National Committee (DNC). During court proceedings later that year, it was revealed that the burglars and two accomplices had connections to the Committee to Re-elect the President (CRP; commonly referred to as “Creep”). All seven men were convicted in January 1973, and two months later one of them, former CIA agent James McCord Jr. (1924–), wrote to trial judge John J. Sirica (1904–1992) that the burglary was in fact part of a larger, high-level political conspiracy and cover-up.

The Washington Post

In the months following the June 1972 break-in, Bob Woodward (1943–) and Carl Bernstein (1944–), two reporters for the Washington Post, began their own investigation of the burglary and its political connections. Their efforts were helped by leads from a confidential government source, identified only as “Deep Throat,” who told Woodward to “follow the money.” In doing so, the reporters uncovered connections between Nixon’s attorney general, John Mitchell (1913–1998), and funds used to finance espionage operations against Democrats. Woodward and Bernstein’s reporting helped unravel the Watergate conspiracy.

The Watergate Hearings

During the June 1973 Senate hearings on the incident, John Dean (1938–), the former White House counsel who had been dismissed by Nixon two months earlier, testified that the Watergate break-in was approved by John Mitchell—who had resigned his post as attorney general in January 1972 to head CRP—with the involvement of White House advisers John Ehrlichman (1925–1999) and H. R. (Bob) Haldeman (1926–1993). Dean asserted that Nixon knew about and approved of the resulting cover-up. He also revealed that the Nixon White House routinely spied on political rivals.

Under pressure from Congress, Nixon’s third attorney general, Elliot Richardson (1920–1999), appointed a special prosecutor, Archibald Cox (1912–2004), to investigate the Watergate affair. In the course of the investigation, Cox’s staff uncovered evidence of spying by CRP, illegal wiretapping of citizens by the Nixon administration, and bribes from corporate contributors to the Republican Party. When Cox demanded in July 1973 that the White House turn over Oval Office tape recordings of the president’s conversations, Nixon claimed “executive privilege.” After a summer of court battles over the tapes, Nixon ordered his attorney general to fire the prosecutor. Both Richardson and his deputy, William Ruckelshaus (1932–), resigned rather than dismiss Cox. On October 20, 1973, in what the press dubbed the “Saturday Night Massacre,” Robert Bork (1927–), the U.S. solicitor general then serving as acting attorney general, fired Cox.

Cox’s dismissal backfired on Nixon. The citizenry, the press, and congressional leaders began calling for the president’s impeachment. In November 1973 Nixon’s Justice Department appointed a new special prosecutor, Leon Jaworski (1905–1982). At a November 17 press conference, Nixon defended himself against the growing accusations of misdeeds by declaring, in what would become an iconic moment of his presidency, “I am not a crook.” Within a week, two of the subpoenaed White House tapes were declared missing, and another was revealed to contain an 18½-minute gap of erased material. Investigators believed that the White House was destroying evidence.

Over the next several months Jaworski secured indictments of several administration officials, including Ehrlichman and Haldeman, who were later convicted. A grand jury named Nixon an “unindicted co-conspirator,” believing it would be unconstitutional for a prosecutor to indict a sitting president. Jaworski referred any further inquiries into Nixon’s actions to the Judiciary Committee of the House of Representatives.

Having ignored numerous subpoenas for the Oval Office tapes, Nixon provided edited transcripts of the recordings to the Judiciary Committee in April 1974. On July 24 of that year, in United States v. Nixon, the U.S. Supreme Court unanimously rejected Nixon’s “executive privilege” defense, affirming a lower court ruling that the White House must turn over the subpoenaed tapes.

Articles of Impeachment and Resignation

During the last week of July 1974, the House Judiciary Committee adopted three articles of impeachment against Nixon: obstructing the Watergate investigation, misusing power and violating the oath of office, and failing to comply with House subpoenas.

On August 5, 1974, Nixon released transcripts of conversations between himself and H. R. Haldeman that had taken place six days after the Watergate break-in. The transcripts, which became known as the “smoking gun,” confirmed that Nixon had ordered the CIA to block the FBI’s investigation of the break-in and had directed a cover-up of the White House’s involvement.

On August 9, 1974, under threat of certain impeachment, Nixon resigned the presidency. Vice President Gerald Ford (1913–2006) succeeded him. (Nixon’s original vice president, Spiro Agnew (1918–1996), had resigned in October 1973 after pleading no contest to a federal tax evasion charge.)

On September 8, 1974, President Ford granted Nixon a “full, free, and absolute” pardon for “all offenses against the United States” committed between January 20, 1969, and August 9, 1974. In January 1975 Haldeman, Ehrlichman, and Mitchell, among others, were convicted for their roles in the Watergate scandal. Nearly forty officials in the Nixon administration were ultimately convicted of crimes involving Watergate or other offenses.

See also United States v. Nixon

See also The Independent Counsel

Granting Pardon to Richard Nixon by the President of the United States: A Proclamation

On September 8, 1974, President Gerald Ford (1913–2006), who had assumed the presidency upon the resignation of Richard M. Nixon (1913–1994) on August 9, officially pardoned Nixon, absolving him of all criminal liability in the Watergate scandal. Below is a transcript of Ford’s pardon.

Richard Nixon became the thirty-seventh President of the United States on January 20, 1969, and was reelected in 1972 for a second term by the electors of forty-nine of the fifty states. His term in office continued until his resignation on August 9, 1974.

Pursuant to resolutions of the House of Representatives, its Committee on the Judiciary conducted an inquiry and investigation on the impeachment of the President extending over more than eight months. The hearings of the Committee and its deliberations, which received wide national publicity over television, radio, and in printed media, resulted in votes adverse to Richard Nixon on recommended Articles of Impeachment.

As a result of certain acts or omissions occurring before his resignation from the Office of President, Richard Nixon has become liable to possible indictment and trial for offenses against the United States. Whether or not he shall be so prosecuted depends on findings of the appropriate grand jury and on the discretion of the authorized prosecutor. Should an indictment ensue, the accused shall then be entitled to a fair trial by an impartial jury, as guaranteed to every individual by the Constitution.

It is believed that a trial of Richard Nixon, if it becomes necessary, could not fairly begin until a year or more has elapsed. In the meantime, the tranquility to which this nation has been restored by the events of recent weeks could be irreparably lost by the prospects of bringing to trial a former President of the United States. The prospects of such a trial will cause prolonged and divisive debate over the propriety of exposing to further punishment and degradation a man who has already paid the unprecedented penalty of relinquishing the highest elective office of the United States.

NOW, THEREFORE, I, Gerald R. Ford, President of the United States, pursuant to the pardon power conferred upon me by Article II, Section 2, of the Constitution, have granted and by these presents do grant a full, free, and absolute pardon unto Richard Nixon for all offenses against the United States which he, Richard Nixon, has committed or may have committed or taken part in during the period from January 20, 1969 through August 9, 1974.

IN WITNESS WHEREOF, I have hereunto set my hand this eighth day of September, in the year of our Lord nineteen hundred and seventy-four, and of the Independence of the United States of America the one hundred and ninety-ninth.

Gerald R. Ford

Bibliography

“Presidential Proclamation 4311,” September 8, 1974, by President Gerald R. Ford, General Records of the United States Government, 1778–1992; U.S. National Archives and Records Administration, Washington, D.C.

Three Mile Island

A reactor malfunction at the Three Mile Island nuclear power plant near Harrisburg, Pennsylvania, in March 1979 resulted in the release of radioactive materials, the panicked evacuation of nearby residents, and several days of uncertainty about the severity of the situation. Although no one was killed and the amount of radioactivity released into the surrounding environment was minimal, the Three Mile Island incident is considered the most serious mishap in the history of the U.S. commercial nuclear power industry. It also alerted the nation to the potential hazards of nuclear energy, despite earlier assurances from the industry and its government regulators.

What Happened

Three Mile Island’s second reactor had been in operation for less than a year when, at about 4 a.m. on March 28, 1979, a problem was detected in the turbine building. Soon after, as the alarm lights were flashing and the warning bells were ringing, technicians realized that the reactor had overheated, causing portions of the uranium core to melt and hydrogen gas to accumulate. The primary fear during the crisis was that such a severe core meltdown, the most dangerous kind of nuclear accident, would cause a steam explosion and the dispersal of deadly radioactivity.

The White House was notified of the incident at 9:15 that morning. At 11 a.m. all nonessential staff were told to leave the power plant. Pennsylvania’s governor, Richard Thornburgh (1932–), soon advised pregnant women and preschool-age children within a five-mile radius of the plant to evacuate. Ultimately, some one hundred thousand residents of all ages chose to leave the area. Reports about the situation at the plant and its threats to the surrounding area were often confused and contradictory, which further frightened people nearby. Adding to the panic was the odd coincidence of the motion picture The China Syndrome—about an accident at a nuclear power plant—opening in theaters just days before the Three Mile Island meltdown.

The overriding fear of an explosion at Three Mile Island diminished on Sunday, April 1, when scientists determined that a lack of oxygen in the reactor’s pressure vessel meant a fire or explosion could not occur.

A federal investigation would later find that the Three Mile Island reactor meltdown was caused by human error as well as serious mechanical and design flaws. A key cause of the crisis was a pressure relief valve that stuck open, allowing cooling water to escape; the loss of coolant prevented the reactor from cooling itself and allowed highly explosive hydrogen gas to accumulate.

Health Consequences of the Accident

Although the worst did not occur at Three Mile Island, tests at the time and since revealed that some radioactive water and gas were released. No plant workers or civilians suffered injuries during the crisis, and there were no deaths. Several long-term studies by both government and independent researchers have failed to link any deaths or disabilities to the accident, and no correlation has been found between the incident and radiation levels in the area’s livestock and agricultural products.

Experts estimate an exposure dose of about 1 millirem for each of the two million people who were in the rural and agricultural areas near the power plant and the cities of York, Lancaster, and Harrisburg (the state’s capital, located about ten miles from Three Mile Island). By comparison, exposure from a full set of chest X-rays is about 6 millirem, and residents of that part of Pennsylvania were believed to receive at least 100 millirem per year from naturally occurring background radiation.

In a study conducted from 1979 to 1998, researchers at the University of Pittsburgh found “no consistent evidence” that radiation from the accident affected the mortality rates of people living within five miles of the reactor at the time.

Lasting Impact

One of the lasting effects of the Three Mile Island accident is that the nation, which had spent much of the 1970s struggling through gas and oil shortages, stopped embracing nuclear power as an alternative energy source. The construction of new nuclear power plants was significantly reduced. But when the energy crisis ended, the United States no longer gave priority to weaning itself from fossil fuels.

As a result of the Three Mile Island incident, the federal government did revise its standards for nuclear power plant licensing, employee training, and emergency preparedness. It also revamped the structure and purpose of the Nuclear Regulatory Commission (NRC), which is responsible for licensing and regulating the nonmilitary use of nuclear energy. “What shook the public most,” observed Victor Gilinsky (1934–), a commissioner of the NRC during the Three Mile Island event, “was seeing the men in the white lab coats standing around and scratching their heads because they didn’t know what to do.”

Cleanup and monitoring of the Three Mile Island site lasted fifteen years. After the accident, Reactor No. 2 was permanently shut down, defueled, and decontaminated. The Three Mile Island facility is expected to be decommissioned when the operating license for Reactor No. 1 expires in 2014.

The Iran Hostage Crisis

On November 4, 1979, Iranian militants in Iran’s capital of Tehran stormed the United States Embassy, taking as hostages the Americans inside. The ordeal lasted 444 days, during which Americans at home became acutely aware of their nation’s vulnerability to the anger of extremists abroad and of the power foreign governments held over the country’s energy needs. As U.S. President Jimmy Carter (1924–) struggled with both the hostage situation and a concurrent energy crisis, the American electorate chose Ronald Reagan (1911–2004) to be its next president in 1980.

A Brief History of U.S. Involvement in Iran

The build-up to the hostage taking of 1979 began a generation earlier, in the 1950s, when the young shah of Iran, Mohammad Reza Pahlavi (1919–1980), faced a challenge to his throne, which he had inherited from his father in 1941. Fearing a loss of access to Iran’s oil fields, the United States took the shah’s side, ousting his opponent, Prime Minister Mohammed Mossadegh (1882–1967), and providing the shah with economic and military aid. By the early 1960s Iranians had begun objecting to the cultural “westernizing” of their nation, the uneven distribution of wealth among its citizenry, and the shah’s continued refusal to grant political freedoms to the people. During upheavals in 1963, the shah cracked down on protestors and suppressed his opposition, including the popular Muslim cleric Ayatollah Ruhollah Khomeini (1902–1989), whom he arrested and exiled. In addition to opposing the shah, Khomeini—who eventually settled in neighboring Iraq—despised the United States. With the rebellion quelled, the shah continued to spend Iran’s oil revenues on the country’s military and his own lavish lifestyle. Less-fortunate Iranians, meanwhile, grew poorer and angrier. In January 1979 the shah and his family, claiming to be going on vacation and fearful for their lives, fled Iran and temporarily settled in Egypt.

The Hostage Crisis

Within weeks of the shah’s exit from Iran, the exiled Ayatollah Khomeini returned. After the shah, who had eventually moved on to Mexico, was allowed to enter the United States for cancer treatment in October 1979, an angry mob stormed the U.S. Embassy compound on November 4, 1979. At the start of the siege there were nearly one hundred captives; after non-American personnel were released, sixty-six American captives remained at the embassy and in an Iranian ministry building nearby, and within weeks the ringleaders also freed thirteen women and African-Americans. The hostage-takers demanded that the shah be returned to Iran to stand trial and that his wealth be distributed to poor Iranians. While Khomeini was not believed to be directly involved in the hostage taking, he did endorse it.

Carter’s Conflicts

Deciding that military action against Iran was too risky, the Carter administration responded to the hostage drama with diplomatic efforts, a freeze on Iranian assets in the United States, and an embargo on the importation of Iranian oil. Rather than frightening the Iranians, the measures inspired further protests against the United States. Carter, who felt personal responsibility for the safety of the American hostages, then severed all diplomatic relations with Iran and in April 1980 imposed a complete economic embargo on the country. Also in April, Carter approved a secret U.S. military mission, staged from a remote site in the Iranian desert, to rescue the hostages. The effort failed miserably when mechanical problems disabled several of the aircraft involved; a helicopter collided with a transport plane, killing eight servicemen. The disastrous rescue attempt and ineffective diplomatic efforts damaged Carter’s political standing, causing him to be viewed as hesitant and weak.

The student militants holding the hostages stood firm despite the death in July 1980 of the exiled shah, who had returned to Egypt, and the invasion of Iran by Iraq on September 22, 1980, which escalated into a full-scale war between the two nations. That fall, Iran, under the rule of the Ayatollah Khomeini, resumed negotiations with the United States. In January 1981, in an agreement concluded just before the inauguration, the United States agreed to return $8 billion in frozen Iranian assets and lift trade sanctions in exchange for the release of the hostages.

444 Days in Captivity

Although the Carter administration and President Carter himself had agonized and negotiated for more than a year over the safety and release of the hostages—up until and during the inauguration-day drive to the U.S. Capitol—it was Reagan who announced, moments after being sworn in, that the 444-day Iranian hostage crisis was over. The Americans, who had been blindfolded, isolated, and abused during their captivity, had been released. Of the sixty-six hostages taken, thirteen had been released a few weeks after the crisis began, one was freed the following July, and the remaining fifty-two were freed on inauguration day. The next day former President Carter flew to the U.S. Air Force base in Wiesbaden, West Germany, to greet the hostages on Reagan’s behalf.

Legislation, Court Cases, and Trials

The National Traffic and Motor Vehicle Safety Act and Highway Safety Act

The National Traffic and Motor Vehicle Safety Act and Highway Safety Act of 1966 established safety standards for motor vehicles in order to reduce accidents and the deaths and injuries caused by them.

The Purpose of the Legislation

Signed into law by President Lyndon B. Johnson (1908–1973) in 1966, the National Traffic and Motor Vehicle Safety Act and Highway Safety Act placed the federal government in the leadership role of a comprehensive national program to reduce the number of injuries and deaths on U.S. highways. The act created the first mandatory federal safety standards for motor vehicles. Starting with vehicles and tires of the 1968 model year, the standards required manufacturers to protect the public against “unreasonable risk of accidents occurring as a result of the design, construction, or performance of motor vehicles” and also against “unreasonable risk of death or injury … in the event accidents do occur.”

Authority for the act was first assigned to the Department of Commerce. In 1970 the Highway Safety Act was amended to establish the National Highway Traffic Safety Administration (NHTSA) to carry out the safety programs developed and mandated by the National Traffic and Motor Vehicle Safety Act.

After early opposition, the automobile industry worked to implement the standards required by the new law, which has undergone numerous revisions over the years. Speaking for many of his Detroit colleagues, Henry Ford II (1917–1987) had initially complained that the new auto safety standards were “unreasonable, arbitrary, and technically unfeasible.” But in 1977 Ford conceded on the television news show Meet the Press that “we wouldn’t have the kinds of safety built into automobiles that we have had unless there had been a federal law.”

The Need for Auto Safety Standards

The impetus for the legislation was the growing evidence that auto accidents were caused by unsafe vehicles rather than just bad drivers. An influential proponent of this theory was Ralph Nader (1934–), a consumer advocate who detailed automobile safety issues in the book Unsafe at Any Speed, published in November 1965. In his book Nader charged that the automobile industry subordinated safety concerns to “power and styling.” His chief target was the General Motors Corvair, a sports car prone to violent skidding and rollovers. According to Nader, many auto-related injuries were caused not by the “nut behind the wheel,” but by the engineering and design flaws that made many vehicles high-speed death traps, regardless of a driver’s skill. Traffic fatalities had increased by nearly 30 percent between 1960 and 1965, and experts predicted that by 1975 the nation could be looking at one hundred thousand motor vehicle-related deaths each year.

In signing the legislation on September 9, 1966, President Johnson remarked: “Over the Labor Day weekend, twenty-nine American servicemen died in Vietnam. During the same Labor Day weekend, 614 Americans died on our highways in automobile accidents.… In this century more than one-and-a-half million of our fellow citizens have died on our streets and highways: nearly three times as many Americans as we have lost in all our wars.”

Provisions of the Act

The National Traffic and Motor Vehicle Safety Act and Highway Safety Act led regulators to issue more than a dozen standards for passenger cars, including seat belts for all occupants, impact-absorbing steering columns, padded dashboards, safety glass, and dual braking systems. Over time, requirements and standards were added for such items as windshield wipers, lights, rearview mirrors, door locks, and head restraints. Safety standards were also developed for trucks, buses, motorcycles, and other vehicles. In 1974 the act was amended to require manufacturers to remedy safety-related defects at no cost to consumers. Afterward, auto manufacturers issued so many recalls to repair safety issues that the number of recalled cars between 1977 and 1980 exceeded the number of new cars sold.

In the decades after the act’s signing, both measures of highway danger fell: between 1967 and 2001, traffic fatalities declined 17 percent, and the fatality rate (measured in fatalities per 100 million vehicle miles traveled) declined 71 percent. While the new regulations and standards certainly deserve credit, additional factors have had an impact as well. Over the years, speed limits have been reduced, vehicle inspections have improved, driver education programs have become more comprehensive, road and traffic control systems have been upgraded, medical care has advanced, child safety seats have become mandatory, and public attitudes have turned against drunk driving and other risky behaviors.
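
Those two percentages are consistent with each other because Americans drove far more miles in 2001 than in 1967; a fatality rate can fall steeply even while the absolute death toll falls only modestly. The back-of-the-envelope Python calculation below uses only the relative changes cited above (not NHTSA’s underlying totals, which are not given here) to show that the implied volume of driving roughly tripled over the period.

    # Relative changes from 1967 to 2001, per the figures in the text.
    fatalities_ratio = 1 - 0.17   # total deaths fell 17%, to 0.83x
    rate_ratio = 1 - 0.71         # deaths per mile fell 71%, to 0.29x

    # Since rate = fatalities / miles, it follows that miles = fatalities / rate.
    miles_ratio = fatalities_ratio / rate_ratio
    print(f"Implied growth in vehicle miles traveled: {miles_ratio:.1f}x")  # ~2.9x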

See also Ralph Nader

Estimated Lives Saved by Auto Safety Technologies, 1969–2002

When researchers with the National Highway Traffic Safety Administration (NHTSA) compared auto-related death rates in the United States before and after the enactment of the automobile and highway safety standards acts of 1966, the results showed that some 328,551 lives had been saved by life-saving technologies (Charles J. Kahane, Lives Saved by the Federal Motor Vehicle Safety Standards and Other Vehicle Safety Technologies, 1960–2002—Passenger Cars and Light Trucks, October 2004).

Dual master cylinders and front disc brakes were believed to have saved 13,053 lives. Energy-absorbing steering columns, a 1960s technology, were credited with protecting 53,017 drivers. Improved door locks prevented 28,902 fatalities. Frontal air bags, a 1990s improvement, prevented 12,074 deaths. The most effective safety addition to the automobile was the seat belt, especially the three-point belt with its chest restraint, introduced in the 1970s. Over the course of twenty-two years, seat belts alone were believed to have saved 168,524 lives.

The Racketeer Influenced and Corrupt Organizations Act

The Racketeer Influenced and Corrupt Organizations Act, commonly referred to as RICO, was signed into law on October 15, 1970. The specific purpose of RICO was “the elimination of the infiltration of organized crime and racketeering [extorting money] into legitimate organizations operating in interstate commerce.” But because Congress mandated that the statute “be liberally construed to effectuate its remedial purposes,” RICO has been used to prosecute a variety of illegal activities affecting interstate or foreign commerce. Under RICO, successful prosecutions result in extended sentences for crimes committed as part of an ongoing criminal organization.

In Pursuit of Organized Crime

Congress passed RICO as part of the Organized Crime Control Act of 1970. The intent was to go after criminal organizations that used legitimate businesses as fronts for criminal activity. With the new law, prosecutors could pursue individuals for participating in an ongoing criminal enterprise; previously, such criminals could be pursued only for the individual acts themselves, such as gambling or loan sharking.

Congress’s efforts to pursue organized crime with a RICO-type statute began in the 1950s with hearings conducted by Tennessee Senator Estes Kefauver. One of the original purposes of RICO was to eliminate organized crime families (specifically the Mafia). Because Congress could not legislate against specific persons or groups, it used far-reaching language, casting a broad net over organized crime’s racketeering activities.

The Act and Its Penalties

While RICO has very specific requirements for who can be charged under the statute, the menu of offenses is expansive. Under RICO, a person or group that commits any two of thirty-five enumerated crimes within a ten-year period and, in the opinion of the U.S. attorney bringing the case, has committed those crimes with similar purpose or results, can be charged with racketeering. These crimes include murder, gambling, bribery, extortion, bankruptcy fraud, mail fraud, prostitution, narcotics trafficking, and loan sharking.
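
The statute’s baseline “pattern” requirement can be stated almost algorithmically. The Python sketch below is a deliberate simplification for illustration only: the crime list is an abbreviated sample, the function is hypothetical, and the actual statute also weighs the relatedness and continuity of the acts. Still, it captures the two-predicate-acts-within-ten-years test described above.

    # Illustrative subset of RICO's thirty-five predicate crimes.
    PREDICATE_ACTS = {"murder", "gambling", "bribery", "extortion",
                      "mail fraud", "narcotics trafficking", "loan sharking"}

    def shows_pattern(acts):
        # acts: list of (crime_name, year) tuples attributed to a defendant.
        # Returns True if at least two qualifying predicate acts fall
        # within ten years of one another.
        years = sorted(year for crime, year in acts if crime in PREDICATE_ACTS)
        return any(b - a <= 10 for a, b in zip(years, years[1:]))

    print(shows_pattern([("extortion", 1971), ("loan sharking", 1978)]))  # True
    print(shows_pattern([("gambling", 1962), ("bribery", 1980)]))         # False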

A RICO-qualifying defendant must also be connected with an “enterprise,” which is defined by the statute as “any individual, partnership, corporation, association, or other legal entity, and any union or group of individuals associated in fact although not a legal entity.”

The punishment for violating RICO’s criminal provisions is intentionally harsh. A convicted defendant can be fined and sentenced to as much as twenty years in prison for each offense. In addition, the racketeer must forfeit all monies acquired through a pattern of “racketeering activity.” The act also contains a civil component that allows plaintiffs to sue for triple damages.

Uses of RICO

In addition to prosecuting acts of extortion and blackmail, RICO has been used against individuals and corporations who intimidate or threaten witnesses, victims, and whistleblowers in retaliation for cooperating with law enforcement. RICO has also been used in litigation over the sexual abuse scandals involving Catholic archdioceses in the United States. RICO’s civil provisions were likewise invoked in National Organization for Women v. Scheidler, a lawsuit that sought an injunction against antiabortion activists who physically blocked access to abortion clinics.

RICO has proven to be a powerful tool in the federal government’s fight against organized crime. As a backup to RICO, many states have enacted their own RICO-type statutes, which are used in the rare instances when the federal law does not apply.

Some critics charge that RICO reaches too far, especially when it is used to convict nonviolent criminals who are then sentenced to long prison terms. Advocates of RICO counter that crimes committed by organized networks of people are more dangerous, and more damaging, than crimes committed by individuals—in part because they are harder to stop. Because the crime is more serious, the theory goes, the punishments should be more severe.

When it comes to RICO’s civil applications, however, many say the law is too easily abused. Because RICO’s civil provisions, which allow for triple damages, can be a source of great profit, some civil attorneys have filed RICO suits on behalf of plaintiffs willing to sue accountants, bankers, insurance companies, securities firms, and major corporations, believing that the defendants might settle rather than risk having a judge or jury grant a triple-damages judgment.

As a tool of criminal prosecution, however, RICO has been successful in securing convictions against organized crime leaders. In the mid-1980s the leaders of New York City’s Mafia families were prosecuted under RICO in the federal “Commission” case, and those convicted were sentenced to one hundred years in prison each; in 1992 the famed John Gotti, head of the Gambino crime family, was also convicted under RICO and sentenced to life in prison.

The Acts and Threats Prosecutable under RICO

Under the Racketeer Influenced and Corrupt Organizations Act (RICO), a person or group that commits any two of thirty-five enumerated crimes within a ten-year period, with similar purpose or results, can be prosecuted for racketeering. Such crimes include:

  1. Murder
  2. Kidnapping
  3. Gambling
  4. Arson
  5. Robbery
  6. Bribery
  7. Extortion
  8. Dealing in obscene materials
  9. Dealing in controlled substances
  10. Counterfeiting
  11. Embezzlement (particularly from pension, welfare, or union funds)
  12. Extortionate credit transactions
  13. Fraud (including mail fraud, wire fraud, financial fraud, securities fraud, and fraud involving identification documents, passports, visas, or citizenship)
  14. Obstruction of justice
  15. Peonage (involving a debtor having to work for a creditor)
  16. Slavery
  17. Human trafficking
  18. Interference with commerce
  19. Transportation of wagering paraphernalia
  20. Money laundering
  21. Auto theft
  22. Sexual exploitation (including of children)
  23. Murder-for-hire involvement
  24. Possession of stolen property
  25. Electronic piracy
  26. Trafficking in contraband cigarettes
  27. Trafficking or using biological, nuclear, or chemical weapons
  28. The felonious manufacture, importation, receiving, concealment, buying, selling, or otherwise dealing in a controlled substance or listed chemical (as defined by the Controlled Substances Act)

The Occupational Safety and Health Act

The Occupational Safety and Health Act (OSHA) was signed into law on December 29, 1970, and took effect in April 1971 to help ensure the safety and health of workers in the U.S. labor force.

The Need for OSHA

The introduction to the Occupational Safety and Health Act states its intended purpose: “To assure safe and healthful working conditions for working men and women; by authorizing enforcement of the standards developed under the Act; by assisting and encouraging the States in their efforts to assure safe and healthful working conditions; by providing for research, information, education, and training in the field of occupational safety and health.”

The hazards of the American workplace garnered attention in the 1960s as employees and labor unions sought protection from unsafe working conditions. Worker advocates found supporters in Congress and even in President Richard M. Nixon (1913–1994), who signed the legislation despite the opposition of influential business leaders. One theory for why Nixon sided with workers over management is that he wanted to curry favor with working-class voters and George Meany (1894–1980), the powerful president of the American Federation of Labor-Congress of Industrial Organizations (AFL-CIO). As a concession to business interests, the act described workplace safety as a responsibility shared by employer and employee.

At the time, congressional leaders noted that, annually, more than fourteen thousand workers were being killed and two million more disabled due to job-related accidents, most commonly in manufacturing. U.S. Representative William S. Broomfield (1922–) declared that “75 out of every 100 teenagers now entering the workforce can expect to suffer a disabling injury sometime in his working career.” Representative William A. Steiger (1938–1978), the bill’s chief advocate in the House, added: “In the last 25 years, more than 400,000 Americans were killed by work-related accidents and disease, and close to 50 million more suffered disabling injuries on the job.”

Although some states had already established workplace safety standards, the protections were inconsistent. Invoking its powers to regulate interstate commerce, Congress decided to nationalize workplace protections and impose minimum safety standards for protecting the American workforce.

OSHA, NIOSH, and OSHRC

The Occupational Safety and Health Act of 1970 established three permanent agencies, and an expansive bureaucracy, charged with protecting worker safety and health:

  1. The Occupational Safety and Health Administration (OSHA) operates within the Labor Department. It creates and enforces workplace safety and health standards.
  2. The National Institute for Occupational Safety and Health (NIOSH) is part of the Department of Health and Human Services (then called the Department of Health, Education, and Welfare). NIOSH’s mandate is to conduct research on occupational safety and health.
  3. The Occupational Safety and Health Review Commission (OSHRC) is an independent agency. Its role is to adjudicate enforcement actions challenged by employers.

Impact of OSHA

According to the Occupational Safety and Health Administration, since OSHA’s inception in 1971, workplace fatalities have decreased by more than 60 percent and occupational injury and illness rates by 40 percent. New cases of diseases such as brown lung and asbestosis are now extremely rare. Exposure levels to dangerous toxins such as cotton dust, lead, arsenic, beryllium metal, and vinyl chloride have been greatly reduced.

Observers disagree, however, over how much of this success OSHA deserves credit for; advances in science, technology, and medicine may also explain the improvements. As was the case when the act was first signed, pro-business interests continue to argue that OSHA is unnecessarily restrictive and costly to companies, especially small ones.

While OSHA did set safety standards, its enforcement of those standards has frequently been criticized by worker advocates. By the agency’s own admission, its focus in the 1980s was on reducing the regulatory burdens its standards had created, and its inspection efforts were limited to “the most hazardous companies within the most hazardous industries.” Under President Ronald Reagan (1911–2004) the agency’s impact was lessened by the administration’s policy of limited government, particularly where business interests were involved. During these years OSHA received no funding increases and lost 20 percent of its staff. Additionally, many of Reagan’s appointees to the Department of Labor and OSHA had strong business connections and little interest in producing or enforcing federal safety standards.

Labor activists acknowledge that the existence of OSHA has raised awareness of workplace safety and health issues, but many consider the agency little more than a “paper tiger” whose standards are only weakly enforced. In states that operate their own occupational safety and health agencies, local efforts have sometimes been more effective than federal ones.

The “General Duty Clause” of the Occupational Safety and Health Act

The “General Duty Clause” of the Occupational Safety and Health Act (OSHA) states in basic terms the overall responsibilities of employers to protect their workers.

(a) Each employer—

(1) shall furnish to each of his employees employment and a place of employment which are free from recognized hazards that are causing or are likely to cause death or serious physical harm to his employees;

(2) shall comply with occupational safety and health standards promulgated under this Act.

(b) Each employee shall comply with occupational safety and health standards and all rules, regulations, and orders issued pursuant to this Act which are applicable to his own actions and conduct.

Bibliography

Occupational Safety and Health Act of 1970, Section 5, U.S. Department of Labor, Occupational Safety and Health Administration, http://www.osha.gov/ (accessed July 13, 2007).

Lemon v. Kurtzman

In 1971 the Supreme Court found in Lemon v. Kurtzman that state laws allowing teachers of secular subjects in parochial schools to be paid from public funds were unconstitutional. Two such laws were at issue: Rhode Island’s Salary Supplement Act of 1969 and Pennsylvania’s Nonpublic Elementary and Secondary Education Act of 1968. The Court found that both laws violated the establishment clause of the First Amendment, which guarantees that “Congress shall make no law respecting an establishment of religion.”

The Laws

Pennsylvania’s Nonpublic Elementary and Secondary Education Act, passed in 1968, provided financial aid to private elementary and secondary schools—the overwhelming majority of which were parochial schools—by paying portions of teachers’ salaries as well as the costs of textbooks and instructional materials used in nonreligious subjects. Rhode Island’s law, the Salary Supplement Act, paid teachers of secular subjects in parochial schools a supplemental salary of up to 15 percent of their annual salaries, subject to maximum limits set by the state.

The Suit

Alton J. Lemon, who gave his name to the case, was a resident and taxpayer of Pennsylvania and the father of a student in the state public school system. Arguing that the Nonpublic Elementary and Secondary Education Act violated the establishment clause of the First Amendment, Lemon challenged the Pennsylvania law in federal court, filing suit against David Kurtzman, the state superintendent of schools. When the federal court dismissed his case, he appealed to the U.S. Supreme Court. The Rhode Island statute had been found unconstitutional by a district court and was likewise appealed to the Supreme Court. The two cases were consolidated and decided together in Lemon.

The Opinion

Chief Justice Warren Burger (1907–1995) delivered the opinion of the Supreme Court. He outlined a three-part “Lemon Test” to determine whether the laws met the requirements of the First Amendment. For a law to be constitutional under the Lemon Test, first, it must not have a religious purpose; second, it must neither advance nor inhibit religion; and third, it must not foster entanglement of church and state to an excessive degree. The Court admitted that some relationship between religious organizations and the government was inevitable and, in fact, required. Parochial schools, for example, were covered under compulsory school attendance laws. “In order to determine whether the government entanglement with religion is excessive,” Burger stated in the opinion, “we must examine the character and purposes of the institutions that are benefited, the nature of the aid that the State provides, and the resulting relationship between the government and the religious authority.”

The two state statutes at issue in the case were found to comply with the first and second requirements; if anything, the state laws were meant to improve the quality of secular education offered at parochial schools. Both laws, however, failed the third requirement, the prohibition on excessive entanglement. The Court found that the amount of oversight the state would be obliged to engage in to guarantee that the programs did not further a religious purpose involved an “excessive entanglement between government and religion.” Burger wrote, “A comprehensive, discriminating, and continuing state surveillance will inevitably be required to ensure that these restrictions are obeyed and the First Amendment otherwise respected.… These … contacts will involve excessive and enduring entanglement between state and church.” Therefore, the Supreme Court struck down both state laws as unconstitutional violations of the establishment clause of the First Amendment.

The Court opinion in Lemon v. Kurtzman provided guidelines—the “Lemon Test”—to help legislators and courts ensure the constitutionality of laws. In the years since 1971, however, the Court has modified its understanding of the test. Justices originally understood the second criterion to ask whether a law advanced or inhibited religion; today they understand it to ask whether a law conveys a message that the government endorses or disapproves of religion. In addition, the wording of the third criterion, “excessive government entanglement,” leaves what constitutes excessive entanglement open to debate.

The Pledge of Allegiance—Is Its Recitation in Public School Constitutional?

The Pledge of Allegiance was first published in The Youth’s Companion in 1892 in this form: “I pledge allegiance to my Flag and the Republic for which it stands; one nation indivisible, with liberty and Justice for all.” By 1924 the words “the flag of the United States of America” had been substituted for “my Flag.” The U.S. government officially adopted the Pledge of Allegiance in 1942.

By the 1950s, however, many Americans believed that the Pledge did not reflect what they viewed as the nation’s fundamentally religious character. At the request of President Dwight D. Eisenhower (1890–1969), Congress added “under God” to the Pledge in 1954. Congress intended that students would recite the Pledge each day to “proclaim in every city and town, every village and rural schoolhouse, the dedication of our Nation and our people to the Almighty.”

Several legal challenges to the practice of reciting the Pledge daily in public schools have been mounted in recent decades. In one notable case, the Seventh Circuit Court of Appeals in 1992 upheld an Illinois law that required teachers to lead the Pledge every day, as long as students could opt not to recite it themselves.

Later, Michael Newdow sued the Elk Grove Unified School District, challenging its policy of having the Pledge recited every day to satisfy a California state law requiring every public school to begin its day with “patriotic exercises.” Newdow claimed that the policy violated his daughter’s First Amendment rights, stating that she was injured because she was forced to “watch and listen as her state-employed teacher in her state-run school leads her classmates in a ritual proclaiming that there is a God.” While the district court dismissed Newdow’s case, the Ninth Circuit Court of Appeals agreed with him, ruling in 2003 that the phrase “under God” signified a government-sanctioned religious purpose.

The Supreme Court agreed to hear the case on appeal. The central issue was whether daily recitation of the Pledge of Allegiance violated the establishment clause of the First Amendment. In the end, the Supreme Court sidestepped the issue, ruling instead that Newdow lacked standing to sue because he did not have primary legal custody of his daughter. In a unanimous decision the justices reversed the lower-court ruling that the daily recitation of the Pledge of Allegiance in public schools is unconstitutional.

While the path would seem to be clear for any parent with legal custody of his or her child to challenge the constitutionality of the daily recitation of the Pledge of Allegiance in public schools, several justices stated in their concurring opinions that the Pledge does not violate the Constitution. Chief Justice William H. Rehnquist (1924–2005) went a step further. “To give the parent of such a child a sort of ‘heckler’s veto’ over a patriotic ceremony willingly participated in by other students, simply because the Pledge of Allegiance contains the descriptive phrase ‘under God’ is an unwarranted extension of the establishment clause, an extension which would have the unfortunate effect of prohibiting a commendable patriotic observance,” he wrote.

The Pentagon Papers

The Pentagon Papers detailed three decades of U.S. involvement in Vietnam and, when leaked to the press, caused a public outcry against the war; the government’s response to the leak set in motion events that led to the largest political scandal in U.S. history and the downfall of the administration of Richard M. Nixon (1913–1994). The Pentagon Papers had been commissioned by the administration of Lyndon B. Johnson (1908–1973) and consisted of forty-seven volumes analyzing the history of U.S. policy in Vietnam. The papers included four thousand pages of internal documents from four presidential administrations along with three thousand pages of analytical commentary. Among other things, the papers demonstrated that the government had deliberately obfuscated the nation’s military actions in Vietnam. Daniel Ellsberg (1931–), a consultant at the Rand Corporation, whose analysts contributed to the report, leaked the Pentagon Papers to the New York Times in June 1971. President Nixon tried in vain to block their publication. His administration’s illegal tactics in its attempts to discredit Ellsberg were early elements of the Watergate scandal.

The Contents of the Papers

Secretary of Defense Robert S. McNamara (1916–) had ordered the compilation of the Pentagon Papers at a time when the Vietnam War seemed to be at an impasse. Three dozen Pentagon officials and civilian analysts gathered classified documents and wrote an additional three thousand pages of analysis. While the papers were by no means a complete record of American involvement in the Vietnam War—most significantly, they lacked internal White House memoranda—they did draw upon sealed files of the Department of Defense, presidential orders, and diplomatic files.

The Pentagon Papers showed that the government had taken definite steps to increase its involvement in the Vietnam conflict through four presidential administrations. Government leaders throughout the period believed that if one Asian nation “fell” to communism, others would fall in turn, a philosophy called the “domino theory.” However, civilian government leaders refused to authorize the commitment of troops and escalation of the war that military advisers urged was necessary in order to decisively defeat the Communist forces. Each presidential administration weighed the fear of being defeated by an inferior enemy unless it committed a larger military force in Southeast Asia against the fear of drawing Communist China or the Soviet Union into the conflict if the war was drastically escalated.

Above all, the Pentagon Papers demonstrated that the government had deliberately and systematically misled the American people about the extent of U.S. involvement in Southeast Asia and the prospects for success in Vietnam. Daniel Ellsberg, who worked for the Rand Corporation and had served as a pro-war adviser to the administration of John F. Kennedy (1917–1963), had come to believe that U.S. policy in Vietnam was a grave mistake. He believed that the government actions detailed in the Pentagon Papers should be made public. He secretly copied the papers, with the help of Anthony J. Russo (1937–), and released them to the New York Times.

Publication of the Pentagon Papers

The New York Times created an uproar when it began publication of the papers in June 1971. Antiwar protesters immediately seized upon the papers, arguing that they highlighted a “credibility gap” and demonstrated that the government had deliberately misled Congress and the American people in order to expand the country’s involvement in a foreign conflict. President Nixon was enraged by the leak. He believed that the publication of the Pentagon Papers would undermine his administration’s ability to wage war in Vietnam.

Court Actions

The U.S. Department of Justice succeeded in obtaining an injunction halting the publication of the papers in the New York Times. Ellsberg, however, leaked the documents to other newspapers, and the Washington Post and the Boston Globe continued their publication. In an extraordinarily swift decision, the Supreme Court ruled in New York Times v. United States on June 30, 1971, that halting publication violated the Constitution’s guarantee of freedom of the press, rejecting the government’s argument that the papers should be censored because of national security concerns.

Subsequently, Ellsberg was indicted for leaking the papers on charges of espionage, theft, and conspiracy. Russo was also charged. Determined to obtain a conviction, the Nixon administration organized a secret White House group called the “Plumbers” to try to discredit Ellsberg. To that end the Plumbers broke into the office of Ellsberg’s psychiatrist, looking for information in the psychiatrist’s files to use against Ellsberg. These illegal activities, once exposed, led U.S. District Judge William M. Byrne Jr. to dismiss all charges because of gross government misconduct.

The Pentagon Papers had a profound impact on American history. First, they offered an inside view of the government’s actions in Vietnam. Once made public, the papers generated a huge public outcry and led to the eventual withdrawal of American troops from Vietnam. The government’s illegal activities in pursuit of a conviction of Ellsberg on charges of espionage were early events in the Watergate scandal that soon led to the downfall of the Nixon administration and the resignation of the president himself.

See also Richard M. Nixon

See also United States v. Nixon

See also Watergate

Daniel Ellsberg: From Determined Cold Warrior to Antiwar Hero

As a young man, Daniel Ellsberg (1931–) served for two years as an officer in the Marine Corps before receiving a Ph.D. in economics from Harvard University and becoming a consultant for the Rand Corporation, a conservative think tank located in California that did regular consulting work for the U.S. Department of Defense. In several published papers, Ellsberg recommended a military buildup in Vietnam in order to advance U.S. policy in Southeast Asia and halt the spread of communism there. In 1964 and 1965 the Pentagon hired Ellsberg as an adviser to the assistant secretary of defense, during which time he lobbied on Capitol Hill for continued and increased military involvement in Southeast Asia.

In 1965 Ellsberg traveled to Vietnam in order to better inform his policy recommendations; it was during this trip that his views on military policy in Vietnam changed. He accompanied army battalions in the Mekong Delta to assess the effectiveness of the U.S. policy of pacification, which involved painstakingly searching South Vietnamese communities to root out Communist insurgents. He saw firsthand the destruction of Vietnam’s land and people as well as the corruption of the U.S.-supported regime in South Vietnam. He developed grave moral concerns about the nation’s involvement in and perpetuation of the Vietnam conflict.

Upon his return to the United States, Ellsberg was asked, along with thirty-five other analysts, to produce the Pentagon Papers, the multivolume, top-secret analysis of the nation’s policy in Vietnam. Ellsberg himself did not write much of the study. However, he was further disturbed by the pattern of deception of the American people and deepening involvement in Vietnam that the papers documented. Ellsberg believed that the U.S. government had deliberately misled the American public about the nation’s involvement in Southeast Asia for twenty years. In his view the government practiced this deceit in order to ensure that public attention would not turn against the war and force a troop withdrawal and a humiliating military defeat.

Ellsberg photocopied the entire report, file by file, and began leaking the papers to the New York Times in 1971. After the Department of Justice obtained an injunction banning further publication in the New York Times, Ellsberg leaked the papers to both the Washington Post and the Boston Globe. He later said of his actions, “I felt as an American citizen … I could no longer cooperate in concealing this information from the American people. I took this action on my own initiative, and I am prepared for all the consequences.”

After Ellsberg’s indictment on charges of conspiracy and espionage, his case became a cause célèbre. His conversion from “hawk” to “dove” provided inspiration for the antiwar movement, and he spoke to groups around the country. After charges against him were dismissed, Ellsberg continued his activism, supporting antinuclear demonstrations, advocating nuclear disarmament, and criticizing U.S. policy in Central America. He was arrested many times for acts of civil disobedience. He remains an activist and a popular lecturer.

Furman v. Georgia

Furman v. Georgia effectively abolished the death penalty as it was used in the United States prior to 1972. In a 5 to 4 decision handed down on June 29, 1972, the Supreme Court severely limited the death penalty’s use based on the Eighth Amendment, which prohibits “cruel and unusual” punishment. The decision resulted in more than six hundred prisoners leaving death row while states rewrote their statutes to meet the constitutional requirements indicated in the decision.

The History of the Death Penalty

The death penalty has been accepted practice in the United States since the nation’s founding. Periodically, activists have attempted to abolish capital punishment. One such period was the first half of the nineteenth century, when reformers worked to reduce the number of crimes punishable by death. After the Civil War these reform efforts diminished, and Americans generally accepted capital punishment. Before Furman, however, the death penalty’s use had declined: only 56 people were executed in 1960, compared with 155 in 1930.

The Legal Defense and Educational Fund Challenges Capital Punishment

While the Supreme Court had let stand a rapist’s death sentence in a 1963 case, lawyers in the Legal Defense and Educational Fund (known as the LDF), a branch of the National Association for the Advancement of Colored People (NAACP), were encouraged by a dissenting justice’s statement in that case that the Supreme Court’s task was to decide whether or not the death penalty was constitutional. LDF lawyers viewed the death penalty as a form of legally sanctioned lynching, especially when applied to African-American men who had been convicted of raping white women. By 1967 the Legal Defense Fund was representing all inmates on death row, who were disproportionately African-American men. The LDF brought several class action suits on behalf of these inmates, hoping the Supreme Court would eventually agree to hear a case and rule on the constitutionality of capital punishment.

An initial victory came in 1968, when the Supreme Court ruled in Witherspoon v. Illinois that a potential juror who had reservations about sentencing a convicted person to death could not be automatically dismissed in a capital case. This ruling resulted in many death row inmates receiving new trials. The LDF followed up this victory by mounting several challenges to the constitutionality of the death sentence itself, all of which failed. Finally, the LDF decided to challenge capital punishment on the grounds that it was cruel and unusual punishment prohibited by the Eighth Amendment.

William Henry Furman

The Supreme Court agreed to hear this challenge in the case of Furman v. Georgia. The case combined the appeals of three African-American men who had been sentenced to death: two convicted of rape and William Henry Furman, who had been convicted of murder. Furman, who gave his name to the case, had entered his victim’s home intending to commit burglary. When discovered, Furman tried to run, but his gun accidentally went off. The bullet hit and killed the victim. The victim’s family called the police immediately; when police searched the neighborhood, they found Furman still carrying the murder weapon.

Before his trial Furman was committed to a state mental hospital and found to be both mentally deficient and prone to psychotic episodes. Nevertheless, the Superior Court of Chatham County, Georgia, found Furman competent to stand trial and denied his insanity plea. Furman had a court-appointed lawyer, and his trial, including jury selection, lasted just one day. The judge in the case made clear to the jury that Georgia’s death penalty statute allowed capital punishment for any killings that occurred while the defendant was committing a criminal act. He instructed jurors to convict Furman of murder whether or not he had intended to kill his victim if they believed he had intended to break into and enter the victim’s home. Furman was found guilty and sentenced to death, despite the fact that the shooting had been accidental.

The Supreme Court Decision

On June 29, 1972, the Supreme Court handed down its decision that the death penalty, as it was currently applied in the United States, was cruel and unusual punishment prohibited by the Eighth Amendment to the Constitution. The justices, however, were deeply divided on the issue, and each justice took the unusual step of writing a separate opinion in the case. Justices William J. Brennan Jr. (1906–1997) and Thurgood Marshall (1908–1993) stated that the death penalty was unconstitutional in every case. Justices William O. Douglas (1898–1980), Potter Stewart (1915–1985), and Byron R. White (1917–2002) stated that capital punishment was unconstitutional because it was applied arbitrarily. These justices stated that Furman was sentenced to death unfairly because he was poor, African-American, had received a quick, one-day trial, and was uneducated and mentally ill. Douglas wrote: “It would seem to be incontestable that the death penalty inflicted on one defendant is ‘unusual’ if it discriminates against him by reason of his race, religion, wealth, social position, or class, or if it is imposed under a procedure that gives room for the play of such prejudices…. One searches our chronicles in vain for the execution of any member of the affluent strata of this society.”

Four justices dissented from the majority view—Chief Justice Warren Burger (1907–1995), Harry A. Blackmun (1908–1999), William H. Rehnquist (1924–2005), and Lewis F. Powell Jr. (1907–1998). They argued that capital punishment had a long tradition and was implicitly authorized in the Constitution under the Fourteenth Amendment.

The Fate of Capital Punishment

The wording of the opinions of Justices Douglas, Stewart, and White left open for debate the question of whether the death penalty could ever be considered constitutional. As a result, thirty-five states responded to Furman by immediately beginning the process of rewriting their death penalty statutes to attempt to eliminate the arbitrary nature of capital punishment’s application. By 1976 there were 450 inmates on death row, although none had been executed since 1967.

In 1976 the Supreme Court heard Gregg v. Georgia to determine if Georgia’s new death penalty statute was constitutional. Georgia had overhauled its death penalty laws, instituting two separate trials in capital cases—one to determine guilt and one to determine punishment—to allow the accused to testify at the penalty phase without being forced to incriminate himself at the first trial. The statute also required the jury to find at least one aggravating circumstance (a circumstance that increases the severity of the crime) and to weigh it against any mitigating circumstances (circumstances that do not excuse the crime but provide a reason for reducing the punishment for it) before imposing the death penalty. And finally, any death sentence would be automatically appealed to the state’s highest court. The Supreme Court found this statute constitutional, and in 1977 Gary Gilmore became the first person executed in the nation in ten years.

Most states followed the Court’s findings in the Gregg decision and wrote death penalty statutes meant to protect the poor, minorities, mentally ill people, and members of other disenfranchised groups. Most states also repealed the death penalty for accidental killings, like the one in Furman. Subsequent Supreme Court decisions upheld the constitutionality of various rewritten death penalty statutes. In 1987 the Court found that the disproportionate number of African-Americans sentenced to death was not necessarily due to racial bias, and therefore capital punishment was not unconstitutional on the grounds of discrimination.

While executions did resume in 1977, Furman v. Georgia still had far-reaching effects. It had led immediately to the release of more than six hundred inmates from death row due to the “arbitrary and capricious” application of death penalty statutes. As a result of the Court’s decision, states rewrote their death penalty statutes to place stringent requirements on the imposition of the death sentence in capital cases.

The Legal Fight against Discrimination

The Legal Defense and Educational Fund (known as the LDF) was incorporated by the National Association for the Advancement of Colored People (NAACP) in 1940 to administer tax-exempt donations for legal defense work. Thurgood Marshall (1908–1993), who sat on the Supreme Court during the Furman v. Georgia case, led the LDF in its early years. Initially, the LDF defended African-Americans in cases where there was clear evidence of racial discrimination. Lawyers often put themselves in danger by traveling throughout the South to defend African-Americans in criminal cases in small southern towns. While the lawyers often lost these cases, they frequently won them on appeal to higher courts. In addition, the lawyers’ presence at these trials helped bring some measure of fairness to the criminal procedures.

At the same time, the LDF fought for civil rights—especially in education—at the national level and won a variety of landmark cases before the Supreme Court. Throughout the American South, African-American children were segregated into schools with inferior buildings, scanty books and instructional materials, and poorly paid teachers. The LDF focused on gradually dismantling the “separate but equal” doctrine in education that had been established in Plessy v. Ferguson in 1896.

Lawyers won a series of cases argued before the Supreme Court between 1948 and 1950 that ruled that segregation of graduate schools and law schools was discriminatory. The LDF then challenged segregation in public schools, which culminated in the landmark Supreme Court decision in Brown v. Board of Education (1954), in which the Court declared segregated public schools unconstitutional; a follow-up decision in 1955 ordered desegregation “with all deliberate speed.” In the years thereafter, the LDF concentrated on bringing lawsuits that would force southern states to desegregate their schools as ordered in Brown, a slow and frustrating process. Due to the continued litigation of the LDF, the Court finally did away with the “all deliberate speed” standard in 1968 and ordered immediate and total desegregation of all public schools in Green v. County School Board of New Kent County.

The LDF formally split from the NAACP in 1954 due to disagreements over the focus of the organization. While the LDF continued to pursue equality in education, it also took on other civil rights cases. By 1970 capital punishment had become a focus of the LDF’s efforts. While Furman turned out to be a temporary victory, the LDF had more success in the fight against capital punishment in rape cases. A study funded by the LDF had found that 89 percent of convicted rapists sentenced to death between 1930 and 1962 were African-American. In Coker v. Georgia (1977), a case argued by LDF lawyers, the Supreme Court barred the death penalty for rape.

Title IX of the Education Amendments of 1972

Title IX of the Education Amendments of 1972, modeled on Title VI of the Civil Rights Act of 1964, prohibits gender discrimination by educational institutions that receive federal funds (almost every educational institution receives at least some federal funding). The Office for Civil Rights within the U.S. Department of Education is charged with enforcing the rights and regulations detailed in the legislation.

The Need for Title IX

With the women’s rights movement of the late 1960s and early 1970s, the nation began to recognize the inequities and discriminatory practices that denied women and girls opportunities in education and, ultimately, the workforce. For example, until a court order was issued in 1970, Virginia state law prohibited women from being admitted to the University of Virginia’s College of Arts and Sciences, the top-rated public college in the state. In 1966 Georgetown University’s School of Nursing refused to admit married women. Even Luci Baines Johnson (1947–), the daughter of President Lyndon Johnson (1908–1973), was refused readmission to the program after her marriage.

Congressional and White House recognition of the need for protective statutes was widespread by 1971, when several education bills included language disallowing gender discrimination. Because the proposals differed in their language and provisions, it took several months for the final legislation—with a provision against sex discrimination—to develop. President Richard M. Nixon (1913–1994) signed Title IX on June 23, 1972. The law went into effect on July 1.

Title IX was the first comprehensive federal law to prohibit gender discrimination against students, faculty, and staff of educational institutions, which are defined in the act as elementary and secondary schools, colleges and universities, and other educational programs that receive federal funds. The law dictated that males and females receive fair and equal treatment in all areas of publicly financed schooling, including recruitment, admissions, course offerings, counseling, financial aid, housing, scholarships, and protection from sexual harassment. Also, women could no longer be discriminated against based on marital status or maternity.

Title IX and Athletics

Title IX has been most notably used to acquire increased financing from colleges and universities for women’s collegiate sports. Federal guidelines implementing the statute provide specific criteria for determining whether or not a school’s athletic programs are in compliance with Title IX. To be in compliance, the following criteria must be met:

  1. A school must demonstrate that it offers proportionate athletic opportunities for male and female athletes, has continued to expand opportunities for the underrepresented sex, or has effectively accommodated the interests and abilities of the underrepresented sex. In other words, schools do not have to offer identical sports—such as male and female football teams—yet they do need to provide an equal opportunity for males and females to play in sports of interest.
  2. The amount of money provided in athletic scholarships must be substantially proportionate to the ratio of female and male athletes. For example, at a college with forty female athletes and sixty male athletes and a scholarship budget of $1 million, an equitable distribution of the funds would provide $400,000 in scholarship aid to female athletes and $600,000 to males (the arithmetic is sketched after this list).
  3. Activities and staffing related to athletics must also be equal. In this category would be items such as coaching, equipment, facilities, medical services, travel allowances, and tutoring. For purposes of comparison, however, the compliance standard measures the quality of the services rather than the quantity of dollars spent. Spending more on men’s basketball uniforms than on women’s is fine, as long as both teams are properly outfitted. Giving the men’s basketball team its own luxury motor coach with air conditioning, televisions, and toilet facilities while making female basketball players ride in a standard yellow school bus is not acceptable.
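
The proportionality arithmetic in item 2 can be written out directly. This is a minimal sketch using only the hypothetical figures from that example (forty female athletes, sixty male athletes, a $1 million budget), not data from any actual school:

```latex
% Hypothetical figures from the example above: 40 female athletes,
% 60 male athletes, and a $1,000,000 scholarship budget.
\[
  \text{female share} = \frac{40}{40 + 60} \times \$1{,}000{,}000 = \$400{,}000,
  \qquad
  \text{male share} = \frac{60}{40 + 60} \times \$1{,}000{,}000 = \$600{,}000.
\]
```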

When Gender Distinctions Are Allowed

There are certain situations in which a school may provide separate instruction and activities based on gender. These include sex education classes at the elementary and secondary school levels, and physical education classes or after-school programs during which bodily contact sports, such as wrestling, boxing, rugby, or football, will be played. Similarly, choruses where a specific vocal range may be required can be single sex or disproportionate in terms of gender. Nor does the law apply to fraternities and sororities, youth service organizations (such as Boy Scouts and Girl Scouts), or activities such as school-based father-son and mother-daughter events.

The Impact of Title IX

As with many policies involving mandates and gender issues, Title IX has its champions and its detractors. It has been credited with enhancing collegiate sports programs and blamed for undermining them (especially when it comes to men’s athletics). Title IX has been supported by some presidential administrations and demeaned by others. Yet, a generation after its creation, Title IX is still law. What is not debated is that the 1972 legislation vastly expanded the opportunities for women in sports. In 1971 fewer than 100,000 girls played organized high school sports, accounting for about 5 percent of high school athletes. By the 2002–2003 school year, that number had increased to more than 2.8 million. In 1972 fewer than 30,000 women played intercollegiate athletics (compared with 170,000 men). By 2002, 151,000 women played on National Collegiate Athletic Association (NCAA) teams and thousands more participated at the intramural and community college level.

The Equal Rights Amendment

The equal rights amendment (ERA) sought to affirm that women and men are equally entitled to the rights guaranteed by the U.S. Constitution. Its passage would have prevented gender discrimination against both women and men and given women equal legal status for the first time in American history.

Congress approved the ERA in 1972, sending the proposed text to the states for ratification. When the ratification period expired in 1982, only thirty-five states had voted in support. The ERA fell three states short of the three-fourths—or thirty-eight states—required for an amendment to be added to the Constitution.

A Long History

The equal rights amendment was drafted in 1923 by Alice Paul (1885–1977), founder of the National Woman’s Party. Having achieved passage of the Nineteenth Amendment, which gave American women the right to vote, suffragists believed the next step was to secure for women the constitutional protection of equal rights under the law. The amendment was introduced in Congress that year.

Many groups, especially labor and women’s organizations, initially opposed the equal rights amendment because they feared it would end protective labor legislation for women. As the years passed, Congress had different concerns and priorities, including the stock market crash of 1929, the Great Depression, World War II, and the Vietnam War, which kept the ERA on the back burner. But as the women’s rights movement gained momentum in the 1960s, several organizations, particularly the National Organization for Women (NOW), took up the amendment as one of their major issues. But by the time of the amendment’s ratification deadline of June 30, 1982, the ERA was still three states short of the thirty-eight necessary to make it law.

Why Did the ERA Fail?

There are many theories about why the ERA failed. Some cite the timing and the weakening momentum caused by the lengthy ratification process. A larger issue was American society’s response to the changing roles and attitudes of women in the late 1960s and beyond. The ERA became subsumed by the tensions that existed toward and among feminists (those in support of women’s rights) and traditionalists (those opposed to feminism). The essential principles of the amendment—securing the freedoms and rights afforded Americans regardless of gender—were thrown into a large basket with hot-button issues such as abortion, birth control, homosexuality, and women in the workforce. There were concerns about what would happen to the traditional structure of the family (husband as breadwinner, wife as caregiver). Fears were expressed that official equality under the law would be used by men as a free pass to walk out on a marriage and not financially support the wife and children left behind.

Although national polls consistently showed that the majority of Americans favored the equal rights amendment, extremists in both camps, as with many social issues, came to dominate the debate. The perception became that only women who rejected marriage and motherhood would support the ERA.

The most effective opposition to the ERA came from religious conservatives, the right-wing John Birch Society, and the activist group STOP ERA headed by Phyllis Schlafly (1924–), all of whom presented the ERA as an effort by radical feminists to undermine the worth of women as wives and mothers. Such opponents charged that supporters of the ERA were so vehement in their call for equality that, if the ERA passed, divorced women would no longer receive alimony, women would be drafted into the military, and males and females would have to share public bathrooms. Men who were threatened by women’s equality applauded the anti-ERA lobby, whose strategies convinced women who did not aspire to a career that their way of life was at risk. ERA supporters, many of whom were, by comparison, more provocative in their appearance and lifestyle than more traditionalist women, were never able to overcome charges that women either did not need or should not want equal rights with men.

During the years the ERA was working its way through the states for ratification, the political winds changed in the United States. In 1980 Republican Ronald Reagan (1911–2004) was elected to the presidency with enormous support from conservatives and the religious right. That same year the equal rights amendment, which for decades had been supported by both political parties, was dropped from the Republican Party Platform.

The ERA, Today and Tomorrow

The equal rights amendment continues to be introduced into each session of Congress, where it has typically languished in a committee or been otherwise set aside without a vote. The hope of supporters is that when or if the ERA finally passes, no deadline will be placed on the ratification process. Up for debate is whether the amendment will be allowed to keep the thirty-five states that already ratified it, or if the entire process will need to begin again. The fifteen states that have not ratified the equal rights amendment are Alabama, Arizona, Arkansas, Florida, Georgia, Illinois, Louisiana, Mississippi, Missouri, Nevada, North Carolina, Oklahoma, South Carolina, Utah, and Virginia.

Full Text of the Equal Rights Amendment

First proposed in 1923, the equal rights amendment (ERA) was approved by Congress and sent to the states for ratification in 1972. Adding an amendment to the U.S. Constitution requires ratification by three-fourths of the nation’s fifty states. When the ratification process expired in 1982, the ERA had been ratified by only thirty-five of the necessary thirty-eight states. Below is the full text of the amendment:

Section 1. Equality of rights under the law shall not be denied or abridged by the United States or by any state on account of sex.
Section 2. The Congress shall have the power to enforce, by appropriate legislation, the provisions of this article.
Section 3. This amendment shall take effect two years after the date of ratification.

Bibliography

Equal Rights Amendment, http://www.equalrightsamendment.org/overview.htm (accessed April 15, 2007).

Roe v. Wade

Roe v. Wade, decided by the Supreme Court on January 22, 1973, struck down state laws that restricted women’s access to abortions. Attorneys Linda Coffee (1942–) and Sarah Weddington (1945–) had brought the suit on behalf of the so-called Jane Roe, challenging antiabortion laws in Texas on the grounds that the statutes violated the due process clause of the Fourteenth Amendment as well as the Constitution’s implied right to privacy in the Ninth Amendment. The Supreme Court agreed that the right to privacy encompassed a woman’s right to terminate a pregnancy, striking down antiabortion laws throughout the nation. After Roe v. Wade, controversy and debate over women’s reproductive choice raged for decades.

The History of the Movement to Legalize Abortion

Antiabortion laws had been enacted throughout the United States since the late nineteenth century. By the 1960s the modern women’s rights movement had begun, in part because women had been politicized in other social movements of the era, including the student antiwar movement and the civil rights movement.

As women joined the workforce in larger and larger numbers, they called for more reproductive choices. Many women obtained abortions illegally; estimates of the number of abortions performed in the 1960s range from 200,000 to 1,200,000 annually. Doctors publicized the dangers women faced with these “back alley” abortions, and public attitudes began to change. By 1970 it was estimated that 60 percent of Americans believed that the choice to have an abortion should be a private decision. Women’s rights activists began calling for reforms in the statutes regulating abortion and eventually the repeal of all antiabortion laws. Several states, including New York, California, Colorado, and Hawaii, reformed their abortion laws in this public climate. Abortion rights activists were looking for a case with which they could challenge the laws that outlawed abortion.

Enter Norma McCorvey

In 1969 Norma McCorvey (1947–), a twenty-one-year-old single woman, found out she was pregnant. She already had a five-year-old child who was being cared for by her mother. She had little money and worked as a waitress in a bar. She did not think she could care for another child and wanted an abortion, but Texas law prevented her from doing so legally unless her life was endangered by the pregnancy. Her search for an illegal abortion was fruitless.

Linda Coffee and Sarah Weddington, both lawyers in Dallas, wanted to challenge the Texas laws regulating abortion as unconstitutional. They believed the time was ripe because the Supreme Court had decided several recent cases indicating that it might consider ruling against state abortion statutes. In the most important of these, Griswold v. Connecticut (1965), the Court found that states could not make the sale of contraceptives to married couples illegal, ruling that such laws violated the right to privacy. (Later, in Eisenstadt v. Baird (1972), the Court extended the right to buy birth control to unmarried people.)

McCorvey met Coffee and Weddington during her search for an illegal abortionist. McCorvey agreed to be the plaintiff representing all pregnant women in a class-action suit challenging the Texas laws, even though she knew the verdict would not come fast enough for her to be able to have an abortion (and, in fact, McCorvey ultimately gave birth to the baby). Her one condition for the case was that she remain anonymous; thus she became Jane Roe.

Filing suit against Dallas County District Attorney Henry B. Wade (1916–), who represented the state of Texas, Coffee and Weddington attacked the Texas abortion statutes on the grounds that they violated the Fourteenth Amendment’s due process clause, which forbids states to deprive any person of liberty without due process of law, and the Ninth Amendment, which had been used in the Griswold v. Connecticut case to show that rights not specifically discussed in the Constitution were retained by the people—specifically, the right to privacy. Coffee and Weddington argued that a woman should have the right to decide whether or not to become a mother, because that decision was protected by the right to privacy.

The Case

The case was first argued in the Fifth Circuit Court in Dallas, Texas, on May 23, 1970. Coffee and Weddington represented McCorvey, the plaintiff, while Jay Floyd defended the Texas antiabortion law. Anticipating the state argument that the case should be dismissed because “Roe” must have already reached the point in her pregnancy when an abortion would be unsafe, Coffee argued that McCorvey did in fact have “standing to sue.” The judges agreed with her. Weddington specifically argued against the state claim that a fetus had legal rights that should be protected. “Life is an ongoing process,” she argued. “It is almost impossible to define a point at which life begins.” When asked by the judges whether she believed the abortion statutes were weaker under the Fourteenth Amendment or Ninth Amendment, Weddington said she believed antiabortion laws were more vulnerable under the Ninth Amendment.

The judges of the Fifth Circuit Court agreed with Coffee and Weddington. The court issued its opinion on June 17, 1970: “The Texas abortion laws must be declared unconstitutional because they deprive single women and married couples, of their right, secured by the Ninth Amendment, to choose whether to have children.” However, because the court did not order the state to stop enforcing the abortion law, Coffee and Weddington appealed to the Supreme Court. The Court agreed to hear the case.

Weddington and Floyd first argued the case before the Supreme Court on December 13, 1971; it was reargued on October 11, 1972. Weddington argued that the Constitution declared people “citizens” at the moment of birth; at that moment persons were entitled to protection under the law. She contended that women who were compelled to bear children under the Texas law were left without control over their lives. Floyd repeated the state’s argument that Roe could not represent pregnant women in a class action suit, because she had certainly given birth by then. When asked to define when life began according to the state of Texas, the state’s attorney could not answer.

The Supreme Court Decision

Justice Harry A. Blackmun (1908–1999) authored the majority opinion in Roe. He stated that the choice to abort was protected by a fundamental right of privacy: “This right of privacy, whether it be founded in the Fourteenth Amendment’s concept of personal liberty … or … in the Ninth Amendment’s reservation of rights to the people, is broad enough to encompass a woman’s decision to terminate her pregnancy.” He disagreed with the state of Texas’s claim that it had the right to protect the fetus by infringing on the rights of pregnant women. According to the Court, the fetus was not a “person” under the Constitution.

The Court held that the state had a compelling interest in restricting a woman’s right to choose an abortion in two instances—when a mother’s health was at risk or when the well-being of a viable fetus was at risk. The decision relied on trimesters. During the first trimester, the Court ruled, choices must be left to the woman in consultation with her doctor. In the second trimester the state was given a very limited ability to restrict a woman’s access to abortion—to protect maternal health. In the third trimester, when the fetus became viable, or able to live outside of the womb, the state’s interest in regulating abortion became compelling. A state was allowed to regulate or prohibit abortion entirely in this trimester, except when an abortion was necessary to preserve the life of the mother.

Justice William H. Rehnquist (1924–2005) wrote a strong dissenting opinion to the Court’s decision. He rejected the right to choose abortion as protected by the right to privacy. He also argued that abortion regulations should be decided locally and be respected by the Court.

On the same day as the Roe decision, the Court handed down a decision in another abortion case, Doe v. Bolton, which challenged a Georgia law that regulated the procedures involved in getting an abortion, such as where abortions could be performed, residency requirements for women seeking abortions, and the need for approval of three physicians to procure an abortion. The Court struck down these restrictions.

Effects of the Decision

Roe v. Wade had immediate and significant effects. Several states that had already reformed their abortion laws, including New York, Alaska, Hawaii, and Washington, were required to extend the period in which they allowed abortions by several weeks. Fifteen states had to completely overhaul their laws regulating abortion. In thirty-one states, including Texas, strict antiabortion laws immediately became invalid. Abortions became widely available in the United States, and the number of abortions performed yearly skyrocketed. By the late 1980s, 1.5 million legal abortions were performed annually in the country, and about three out of every ten pregnancies ended in abortions.

Roe v. Wade garnered more public controversy than perhaps any other decision in the Supreme Court’s history. Supporters of state laws restricting women’s access to abortions, drawn largely from the Catholic and Protestant fundamentalist religious right, were shocked by the Court’s decisions in Roe v. Wade and Doe v. Bolton. These activists argued that abortion was tantamount to murder. They immediately mobilized into an antiabortion, or “pro-life,” movement. Their initial demand for a “human rights amendment” to the Constitution that would ban abortion entirely failed. They did succeed, however, in getting some states to pass laws that would further restrict abortion, such as requiring parental consent for minors or consent from fathers before an abortion. However, the Supreme Court struck these consent provisions down as too restrictive in Planned Parenthood v. Danforth in 1976. Other states reduced or eliminated public funding for abortions, which could limit a poor woman’s ability to procure an abortion. These restrictions were upheld by the Supreme Court in several cases decided in 1977, including Beal v. Doe, Maher v. Roe, and Poelker v. Doe.

Ongoing Controversy

Since the 1970s, the Republican Party has adopted the “pro-life” position as part of its party platform. This had the effect of gaining significant Catholic and fundamentalist support for the party while costing it the support of some women voters. The Democratic Party, on the other hand, adopted a “pro-choice” party platform and gained significant support from women voters, helping Democrat Bill Clinton (1946–) win election to the presidency in 1992.

That same year, the right of women to choose abortion was affirmed once again by the Court in Planned Parenthood of Southeastern Pennsylvania v. Casey. Women’s groups, clinics, and doctors challenged Pennsylvania’s Abortion Control Act, which required women to wait twenty-four hours after receiving abortion information mandated by the state before having an abortion. It also required minors to get consent from at least one parent and married women to notify their husbands that they were pregnant and intended to abort. The district court initially declared most provisions of the law unconstitutional, except for one that required physicians to notify women of the age of the embryo or fetus. The court of appeals, on the other hand, upheld all provisions of the law except the one requiring a woman to notify her spouse. The Supreme Court agreed to hear the appeals by both sides. Observers believed that Roe v. Wade might well be overturned by the conservative Court. However, conservative Justice Anthony M. Kennedy (1936–) changed his mind at the last minute, joining Justice Sandra Day O’Connor (1930–) and Justice David H. Souter (1939–) in a compromise meant to uphold Roe.

On June 29, 1992, Justices O’Connor, Kennedy, and Souter delivered the opinion of the Court, which upheld the essential holding of Roe v. Wade recognizing a woman’s right to choose to have an abortion. However, the Court upheld Pennsylvania’s Abortion Control Act, except for the provision requiring spousal notification.

Roe v. Wade had a profound impact on the range of reproductive choices available to women. A large network of clinics was established where women could obtain safe and inexpensive abortions as well as counseling. The decision also proved an enduring and divisive issue in politics, and pro-choice and antiabortion forces quickly formed in the wake of the decision to fight the continuing battle over women’s freedom to choose abortion. A number of acts of violence against clinics and doctors who perform abortions have been attributed to antiabortion groups. Legal restrictions on women’s access to abortions continued to be litigated, and the Supreme Court continued to hear cases dealing with abortion matters in the following years.

Sarah Weddington Fights for the Rights of Women

The experience of Sarah Weddington (1945–) as a woman in the male-dominated legal profession, as well as her personal experience of having an illegal abortion, politicized her and drew her into feminist causes. She had been one of only five women at the University of Texas School of Law in Austin. Although she graduated in the top 25 percent of her class in 1967, discrimination against women in law kept her from getting work in Texas. She eventually found work on the American Bar Association’s project to standardize legal ethics.

In the meantime, Weddington became involved with a network of feminists in Austin who published an underground newspaper, The Rag. These women also counseled other women about birth control, including directing women to the safest abortion clinics in Mexico. Weddington took an interest in birth control work in part because she herself had had an illegal abortion in Mexico. She became convinced that women’s reproductive rights were essential to women’s equality.

Weddington enlisted the aid of a law school classmate, Linda Coffee (1942–), and the two began working on a case against the Texas antiabortion laws based on the constitutional right to privacy established in 1965 in the birth control case Griswold v. Connecticut. They began searching for a woman for whom they could file a class-action suit on behalf of all pregnant women in order to challenge the Texas laws.

Weddington was only twenty-five years old when she filed Roe v. Wade in federal court in Dallas, challenging the Texas antiabortion statute on behalf of Norma McCorvey (1947–). Young and inexperienced, she had never before tried a case. She is believed to be the youngest person to have ever won a case before the Supreme Court.

Weddington’s success helped her win a seat in the Texas state legislature. She then served as general counsel for the Department of Agriculture during the administration of Jimmy Carter (1924–) for one year before becoming Carter’s assistant on women’s issues for the remainder of his presidency. In the early 1980s Weddington remained in Washington as a lobbyist for the state of Texas and remained active in women’s issues and reproductive rights. In 1986 Weddington returned to Austin to practice law. In 1992 she wrote A Question of Choice, which tells the story of Roe v. Wade and details her position on women’s reproductive rights. She remains a popular speaker and lecturer.

United States v. Nixon

In United States v. Nixon, the Supreme Court ruled in 1974 that President Richard M. Nixon (1913–1994) must turn over to prosecutors tape recordings of conversations and other documents related to the break-in at the Democratic National Committee headquarters in the Watergate complex by operatives working on behalf of Nixon’s reelection effort. Nixon argued that he should not be made to produce the tapes because conversations with aides and staff members needed to be conducted in a confidential environment. The Court rejected Nixon’s claim that executive privilege made him immune to the judicial process, as well as his claim that White House documents must remain confidential to protect sensitive information.

Nixon Subpoenaed

Several senior Nixon administration officials had been indicted on March 1, 1974, on charges of conspiracy to obstruct justice in United States v. Mitchell, including former attorney general John N. Mitchell (1913–1988). Nixon was named as an unindicted coconspirator. On April 18 the special prosecutor conducting the investigation into Watergate asked for a subpoena ordering Nixon to turn over to the Court “certain tapes, memoranda, papers, transcripts, or other writings” related to specific White House meetings and conversations that the special prosecutor, working from the White House daily logs and records of appointments, had identified as being of interest.

Nixon did not comply with the subpoena; instead, he handed over edited transcripts of conversations hoping he could avoid turning over the tapes. Nixon’s attorney, James D. St. Clair (1920–2001), asked the judge to rescind the subpoena; the judge denied the request and ordered the tapes be turned over by the end of May.

The Case

St. Clair filed an appeal with the U.S. Court of Appeals for the District of Columbia, but both sides were aware of the importance of the question at hand—whether the president could be subpoenaed or otherwise forced to take part in the judicial process. They were also aware of the political stakes in this particular case and unwilling to drag the public through a prolonged judicial process. On May 24 the special prosecutor asked the Supreme Court to take the case without waiting for it to make its way through the court of appeals. On June 6 St. Clair requested the same.

The Supreme Court granted the requests and took the case from the court of appeals on June 15, 1974. The case was argued before the Supreme Court on July 8. The Court issued its opinion a little more than two weeks later. The eight justices (Justice William H. Rehnquist [1924–2005] had served in the Nixon administration and excused himself from the case) wrote a unanimous decision in this case because of the important issues at stake—namely, the relationship between the judiciary branch and the executive branch.

The Court began by restating the principle of Marbury v. Madison (1803): It is the judicial branch’s responsibility to interpret the law and the Constitution. Therefore, wrote the justices, “we reaffirm that it is the province and the duty of this Court to say what the law is with respect to the claim of privilege presented in this case.”

The Court then addressed the two main arguments of Nixon’s attorney: first, that presidential communications should be kept confidential so advisers would be able to speak freely; and second, that the separation of powers gives the president immunity from the judicial process. On the first point, the Court stated that confidentiality, while a concern, could be protected by a judge reviewing evidence privately: “Absent a claim of need to protect military, diplomatic, or sensitive national security secrets, we find it difficult to accept the argument that even the very important interest in confidentiality of Presidential communications is significantly diminished by production of such material for in camera [private] inspection with all the protection that a District Court will be obliged to provide.” On the second point, the Court said, a blanket executive privilege granting the president immunity from the judicial process would stand in the way of criminal prosecutions: “The impediment that an absolute, unqualified privilege would place in the way of the primary constitutional duty of the Judicial Branch to do justice in criminal prosecutions,” the Court stated, “would plainly conflict with the function of the courts.”

Aftermath

The Court ordered Nixon to turn over the tapes to Judge John J. Sirica (1905–1992) for in camera inspection. The question remained whether Nixon would obey the court order. Within a day Nixon assured the public he would comply with the Court’s decision: “While I am, of course, disappointed in the result, I respect and accept the court’s decision, and I have instructed Mr. St. Clair to take whatever measures are necessary to comply with that decision in all respects.”

Nixon turned over sixty-four tapes to Judge Sirica, including some containing highly incriminating conversations between Nixon and his aides shortly after the Watergate break-in. Realizing that Congress was poised to impeach him, Nixon resigned on August 8, 1974, and Gerald R. Ford (1913–2006) became president at noon the following day.

United States v. Nixon established that executive privilege does not provide the president with immunity from the judicial process. Furthermore, it established that the president does not have the power to ignore subpoenas or withhold evidence from the courts, except in cases where national security is in jeopardy.

See also Richard M. Nixon

See also Watergate

The Earned Income Tax Credit

The Earned Income Tax Credit (EITC) is a federal tax benefit that reduces taxes for low- to moderate-income workers. Even workers whose earnings are too low to owe any income tax can get the EITC, which for them will result in a payment from the government.

Origins of the Earned Income Tax Credit

The EITC was established in 1975 as a way to decrease the tax burden of low-income workers with children. Since its inception, the qualifications for and the benefit of the EITC have been expanded in order to help raise the incomes of working families above the federal poverty level.

The EITC was formulated during the presidency of Richard M. Nixon (1913–1994) as a way to ensure a minimum income level for all Americans. The chief advocate for this effort was Senator Russell Long (1918–2003), a Louisiana Democrat who believed the government should provide a monetary benefit to poor people who held a job.

The credit was approved in 1975. Because it was part of a larger tax bill signed into law by President Gerald Ford (1913–2006), the EITC received minimal media coverage or public attention at its inception. Qualifying persons who were aware of the benefit were able to claim the credit for 1975 with a tax return filed in April 1976. For the 1975 tax year, the credit was claimed by 6.2 million households, at a cost of $1.25 billion. The tax credit was made permanent in 1978 and has since been increased in value and, after an expansion in 1986, indexed to inflation. (Additional expansions and changes were made in tax legislation passed in 1993 and 2001.) While the EITC was originally created solely to assist low-income earners with dependent children, the program has since been expanded to include certain childless households in which a person works but earns a minimal income.

How the Earned Income Tax Credit Works

Unlike a tax deduction, which allows taxpayers to reduce their taxable income, a tax credit is subtracted from the actual tax a person owes. Hence, a tax bill of $500 might be reduced or eliminated if a qualifying worker applies for the EITC. Under the EITC, the federal government reimburses families that do not earn enough to pay much, or any, federal income tax. The program is unusual in that it requires people to file a return, even if they owe no taxes, in order to receive a benefit check. If the amount of the credit exceeds a filer’s actual tax liability, the excess is paid directly to the applicant, as illustrated in the sketch below.
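As an illustration of the refundable-credit mechanics just described, here is a small Python sketch; the function name and the dollar figures are hypothetical, chosen only to mirror the $500 tax bill example above.

    def settle_tax(tax_owed: float, refundable_credit: float) -> tuple[float, float]:
        """Return (tax due, refund check) after applying a refundable credit.

        A refundable credit first offsets the tax owed; any excess is paid
        out to the filer, unlike a deduction, which can only shrink income.
        """
        tax_due = max(0.0, tax_owed - refundable_credit)
        refund = max(0.0, refundable_credit - tax_owed)
        return tax_due, refund

    assert settle_tax(500.0, 300.0) == (200.0, 0.0)  # credit reduces the bill
    assert settle_tax(500.0, 700.0) == (0.0, 200.0)  # excess becomes a payment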

To qualify for the original EITC, an income earner must have been supporting a family that included at least one dependent child. In 1975 the benefit was available to households that earned less than $8,000 in annual income. The credit was equal to 10 percent of earned income, up to a maximum qualifying income of $4,000. For example, a worker earning $4,000 per year who had a dependent child would receive a $400 check, the maximum benefit available. Above $4,000, the credit was reduced by 10 percent of the portion of earnings over that threshold: a worker earning $5,000 would still qualify for the full $400 on the first $4,000 of earnings, but the extra $1,000 would reduce the credit by $100, leaving a benefit check of $300 (see the sketch below). The average credit earned by families for the 1975 tax year was $201. By comparison, in 2003 it was $1,784.
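To make the arithmetic concrete, the following is a minimal Python sketch of the 1975 formula as described above. The function name and the structure are illustrative assumptions, not an official IRS computation.

    def eitc_1975(earned_income: float) -> float:
        """Approximate the original 1975 EITC: 10% of earnings up to $4,000
        (a $400 maximum), reduced by 10% of earnings above $4,000 and
        exhausted entirely at $8,000."""
        PHASE_IN_RATE = 0.10
        PHASE_OUT_RATE = 0.10
        MAX_QUALIFYING_INCOME = 4_000.0
        MAX_CREDIT = PHASE_IN_RATE * MAX_QUALIFYING_INCOME  # $400

        if earned_income <= MAX_QUALIFYING_INCOME:
            return PHASE_IN_RATE * earned_income
        reduction = PHASE_OUT_RATE * (earned_income - MAX_QUALIFYING_INCOME)
        return max(0.0, MAX_CREDIT - reduction)

    # The worked examples from the text:
    assert eitc_1975(4_000.0) == 400.0  # maximum benefit
    assert eitc_1975(5_000.0) == 300.0  # $400 less 10% of the $1,000 excess
    assert eitc_1975(8_000.0) == 0.0    # benefit fully phased out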

Impact of the Earned Income Tax Credit

Since its inception in 1975, the EITC has grown into the largest federally funded means-tested cash assistance program in the United States. It is one of several programs run by the federal government whose benefits are determined by recipients’ income; others include the Food Stamp program, Medicaid, and Temporary Assistance for Needy Families (TANF). A key difference is that the EITC requires the qualifying person to have income from gainful employment or other legal means. Because the tax credit received by applicants varies based on their earnings, the program is seen as serving low-income families in three ways (sketched in the code below). In the “phase-in” range, the EITC acts as a wage subsidy for very low earners: as the family earns more, its EITC increases. In the “maximum credit” range, the credit is constant regardless of earnings. In the “phase-out” range, as the family earns more, the credit is reduced. In essence, the EITC can serve as an incentive for those with very low incomes to work. In other cases, such as with families whose incomes are close to disqualifying them for the benefit, the EITC can be viewed by some as a disincentive to increase one’s earnings.
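The three ranges can be captured in a single schedule function. The following Python sketch is illustrative only: the function name and the sample rates and thresholds are hypothetical stand-ins, since actual EITC parameters vary by tax year and family size.

    def three_range_credit(earnings: float, phase_in_rate: float, max_credit: float,
                           phase_out_start: float, phase_out_rate: float) -> float:
        """Compute a phase-in / maximum / phase-out style credit."""
        # Phase-in range: the credit grows with earnings, capped at the maximum.
        credit = min(phase_in_rate * earnings, max_credit)
        # Phase-out range: the credit shrinks once earnings pass the threshold.
        if earnings > phase_out_start:
            credit -= phase_out_rate * (earnings - phase_out_start)
        return max(0.0, credit)  # the credit never falls below zero

    # Hypothetical parameters: a 34% phase-in, a $2,747 maximum credit, and a
    # 16% phase-out beginning at $14,000 of earnings.
    for income in (3_000.0, 10_000.0, 20_000.0):
        print(income, three_range_credit(income, phase_in_rate=0.34,
                                         max_credit=2_747.0,
                                         phase_out_start=14_000.0,
                                         phase_out_rate=0.16))

A worker in the first range sees the credit rise with every extra dollar earned; in the middle range it stays flat; in the last range each extra dollar of earnings trims the credit until it reaches zero.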

However, in its more than three decades of existence, the EITC has largely been viewed as a positive government benefit that helps both individuals and the labor force at large, because it encourages employment over traditional welfare by providing a benefit available only to low-income people who are trying to provide for themselves by working. According to the U.S. Congress, the EITC has lifted more children above the federal poverty level than any other government assistance program. Some studies have indicated that the benefit is more effective than an increased minimum wage at raising the incomes of low earners. Because the EITC promotes work, family, and self-reliance over welfare, it has been widely supported by both Democratic and Republican politicians.

Buckley v. Valeo

In Buckley v. Valeo, handed down on January 30, 1976, the Supreme Court upheld limits on donations to political campaigns but declared limits on a campaign’s expenditures unconstitutional, on the grounds that such limits violate the First Amendment’s protection of political speech. In addition, the Court found that public funding of presidential campaigns through check-off boxes on tax returns did not violate the Constitution’s free speech or due process clauses, as charged by the lawyers of Senator James Lane Buckley (1923–).

The Federal Election Campaign Act (FECA) of 1971 had reformed campaign financing by imposing limits on the amounts political candidates could spend on campaigns, as well as limits on donations to political campaigns. It also mandated better recording of funding sources in an attempt to create greater accountability. After the Watergate scandal, which revealed political corruption that included campaign financing abuses, Congress sought to further reduce the influence of money in political elections through amendments to FECA.

The Case

Hoping to keep the amendments from taking effect before the 1976 election, Buckley and other candidates for political office filed suit in the U.S. District Court for the District of Columbia against Francis R. Valeo, secretary of the Senate and a member of the Federal Election Commission. After the U.S. Court of Appeals rejected their arguments that FECA restricted constitutional freedoms, the U.S. Supreme Court agreed to hear the case.

The Supreme Court Reshapes Campaign Finance Laws

In its ruling the Supreme Court did uphold the limit of $1,000 that an individual could donate to a presidential or congressional campaign. But as long as no official tie existed between the donor and the election campaign proper, “independent” individual and group donors were free to spend unlimited money to support political candidates. For example, an individual could donate no more than $1,000 to the official election campaign committee of a Democratic or Republican candidate, but that same individual could donate any larger amount to, say, the Democratic or Republican National Committee—independent groups not officially affiliated with candidates’ campaigns—which in turn could spend unlimited amounts of money to promote a presidential candidate in ways either similar to or different from those of the official campaign committee. In addition, the Court found that any limits on a campaign committee’s own expenditures violated the right of freedom of speech, so campaign committees could spend freely. Candidates themselves could also spend their own money, in any amount, on their campaigns for office.

The Court dismissed charges that public funding of presidential campaigns violated the First Amendment, saying that the First Amendment was in fact furthered by the public funding of political debate. It also dismissed challenges to the measure based on the due process clause of the Fifth Amendment: appellants argued the measure favored major candidates and parties, but the Court disagreed. And while the Court struck down limits on campaign expenditures in general, it upheld the condition that candidates who accept public monies must agree to limits on campaign expenditures.

The Court’s decision in Buckley v. Valeo reshaped campaign finance laws and made it difficult to enact future legislation to limit the influence of “big money” in political campaigns. Because the Court ruled that individuals and groups may spend freely as long as they are independent of official election campaigns, political units known as political action committees, or PACs, arose. PACs are legally allowed to raise huge amounts of money to help candidates get elected, and their influence on political campaigns has grown enormously since Buckley v. Valeo.

“Hard” v. “Soft” Money

The Supreme Court declared in Buckley v. Valeo (1976) that independent expenditures by individuals and political action committees, or PACs, could not be limited by campaign finance laws, because such limits would violate the right to freedom of speech. This gave rise to, in the words of one commentator, a “freewheeling, money-driven dynamic” that has since been upheld in several Supreme Court cases, including First National Bank of Boston v. Bellotti (1978), Federal Election Commission v. Massachusetts Citizens for Life (1986), and Colorado Republican Federal Campaign Committee v. Federal Election Commission (1996).

The effect of these Court decisions has been to uphold limits on so-called “hard money.” Hard money is regulated by the government through requirements that political campaigns keep careful records of campaign donations and expenditures. Under the rules upheld in Buckley, individuals could contribute no more than $1,000 to a political candidate’s official campaign committee.

However, the law allowed “independent” individuals and groups to raise unlimited funds, called “soft money” because it was unregulated. Individuals and groups were considered independent as long as they were not the campaign committee itself. Consequently, the Democratic National Committee, for example, which is not affiliated with the campaign committee, was allowed to raise and spend unlimited money to support the Democratic candidate for president. Other “independent” groups included labor unions, corporations, conservative or religious organizations, and other political advocacy groups. These groups could spend any sum to advocate policy positions that would benefit their chosen candidates.

Soft money was regulated somewhat with the passage of the Bipartisan Campaign Reform Act of 2002. Popularly called the McCain-Feingold Law, it banned the national political parties from raising soft money. It also banned advertisements produced by PACs that mentioned a candidate by name within thirty days of a primary election or sixty days of a general election. These so-called “issue ads,” the Court found, could be regulated without impinging on freedom of speech. Unsurprisingly, these limits were immediately challenged in court; the Supreme Court, however, upheld them in McConnell v. Federal Election Commission (2003).

The Government in the Sunshine Act

In an effort to promote openness to the public and among governmental agencies, Congress created the Government in the Sunshine Act in 1976. Sometimes referred to as the Open Meetings Act, the statute established regulations and guidelines directing the leaders of certain types of government agencies and committees to conduct the public’s business in public—in other words, in the light of day, rather than in closed-door deal-making sessions.

The Need for Sunshine

The Government in the Sunshine Act was the last in a series of four “open government” statutes that included the Freedom of Information Act in 1966, the Federal Advisory Committee Act in 1972, and the Privacy Act in 1974. The bill that evolved into the Government in the Sunshine Act was first introduced in 1972 by Senator Lawton Chiles (1930–1998). In the wake of the Watergate scandal, the bill was an effort by legislators to help Americans understand the decision-making process in Washington and to restore confidence and trust in the federal government. President Gerald Ford (1913–2006) signed the bill into law on September 13, 1976.

The Government in the Sunshine Act is based on the policy that, as the act states, “the public is entitled to the fullest practicable information regarding the decision-making processes of the Federal Government.” Its purpose is “to provide the public with such information while protecting the rights of individuals and the ability of the Government to carry out its responsibilities.”

The Sunshine Act requires that the policy-making deliberations and meetings of “collegially headed” federal agencies be open to public scrutiny. For purposes of the legislation, a collegially headed agency is one in which several members (a majority of whom were appointed by the president and approved by the Senate) make decisions collaboratively, through discussions and voting, such as is done on a board or commission. The more than fifty governmental organizations subject to the law are typically independent regulatory agencies, such as the Securities and Exchange Commission, the Federal Communications Commission, and the Federal Reserve Board. The statute does not apply to departments headed by one person, such as the Department of State, or to the president of the United States.

Closure of a meeting covered by the Sunshine Act is allowed if the reason accords with at least one of ten exemptions to the rule. For example, a meeting can be closed if it involves the discussion of classified information, would invade an individual’s privacy, or would affect a law enforcement investigation or pending litigation.

As much as the law helps the public to be aware of what decisions are being made and why, the statute was also created to benefit other arms of the government, because there is no statutory requirement for one government agency to share its information with another. Congress and the judiciary can issue subpoenas for such information, but doing so can be time-consuming and costly. With the Sunshine Act and the other public information acts, numerous deliberations were made part of the public record.

What Is a “Meeting”?

Provisions similar to or the same as the act are currently in effect in all fifty states, and such laws often extend beyond government to cover the meetings of organizations such as homeowner associations (HOAs). However, those who have wanted to circumvent the law have often managed to do so, primarily by avoiding the act’s official definition of the term “meeting.” Under the Sunshine Act, a meeting must:

  1. Include a quorum, which is the minimum number of members who need to be present in order to vote and take action on behalf of the board, agency, or committee, etc.
  2. Allow members to discuss the issues, which would not be the case if one member made a speech and the others were in attendance only to listen
  3. Consist of “deliberations [that] determine or result in the joint conduct or disposition of official agency business”

By this definition, it is possible to have gatherings that are not officially meetings. When an organization does not want to hold an open meeting, there are ways to keep it closed, such as not having a quorum or not voting. Technology that did not exist at the time of the Sunshine Act’s passage now allows the rules to be circumvented through discussions held by telephone conference call or e-mail. In essence, a board can hold its discussions behind closed doors and meet in public only for a formal vote.

In FCC v. ITT World Communications (1984), the U.S. Supreme Court determined that a meeting occurs only when a quorum of members actually conducts or resolves official business. Hence, member discussions about issues, or about implementing policies already voted upon, are not necessarily “meetings.”

Another unintended consequence of the act, according to some observers, is that it has heightened the influence of an agency’s chairperson (who typically runs the public meetings) and lessened the inclination and ability of other board members to speak freely. At the same time, the Sunshine Act has proven particularly beneficial to cable news and twenty-four-hour television programming. By opening governmental meetings to the public, it allows cable networks such as C-SPAN (and its spin-offs) to broadcast even the most routine meetings of Congress and numerous federal agencies.

Exceptions to the Government in the Sunshine Act

A meeting of a government agency, board, or commission that is covered by the Government in the Sunshine Act can be closed to the public if the meeting is likely to do any of the following:

  1. Reveal national defense or foreign policy secrets
  2. Relate solely to the internal personnel rules and practices of an agency
  3. Disclose matters specifically exempted from disclosure by statute
  4. Disclose privileged or confidential trade secrets and commercial or financial information obtained from a particular person
  5. Involve formally censuring a person or accusing an individual of a crime
  6. Disclose information of a personal nature where disclosure would constitute a clearly unwarranted invasion of personal privacy
  7. Disclose investigatory records compiled for law enforcement purposes, or information that would interfere with enforcement proceedings by depriving a person of a right to a fair trial or an impartial adjudication, constituting an unwarranted invasion of personal privacy, disclosing the identity of a confidential source or confidential information furnished by such source, disclosing investigative techniques and procedures, or endangering the life or physical safety of law enforcement personnel
  8. Disclose information contained in or related to examination, operating, or condition reports by, for, or on behalf of an agency responsible for the regulation or supervision of financial institutions
  9. Disclose certain types of information that would lead to significant financial speculation in currencies, securities, or commodities, significantly endanger the stability of any financial institution and/or significantly frustrate the implementation of a proposed agency action
  10. Specifically concern an agency’s issuance of a subpoena or the agency’s participation in a legal proceeding in the United States or abroad

Bibliography

The Government in the Sunshine Act, 5 U.S.C. § 552(b), U.S. Department of Justice, http://www.usdoj.gov/oip/gisastat.pdf (accessed on July 14, 2007).

City of Philadelphia v. New Jersey

In City of Philadelphia v. New Jersey, Philadelphia argued that a 1973 New Jersey law prohibiting the disposal of out-of-state waste within New Jersey violated the commerce clause of the Constitution. In 1978 the Supreme Court handed down its decision that New Jersey had in fact violated the commerce clause by banning waste from other states. The commerce clause grants Congress the power to enact legislation to control interstate commerce, and it therefore limits individual states’ ability to regulate interstate commerce and trade. The decision was a blow to the emerging environmental movement.

Environmental Protectionism

Environmental concerns in the 1960s led to a proliferation of environmental protection laws in the 1970s in a number of states. New Jersey’s law prohibiting the dumping of out-of-state wastes within its borders was especially notable because the large eastern seaboard cities of New York and Philadelphia had used New Jersey as a dump for decades. The cities took their case to court.

The New Jersey Supreme Court found that the economic burden imposed on New York and Philadelphia was “slight” and that the state had a legitimate interest in protecting its residents and environment from the hazards posed by the “cascade of rubbish.” Philadelphia appealed to the U.S. Supreme Court, citing Cooley v. Board of Wardens (1851), which had established limits on the states’ power to regulate interstate trade. The Court had previously used that precedent to overturn state protectionist laws of all kinds, including laws designed to protect a state’s jobs, industries, and other resources.

The Supreme Court Decision

The Supreme Court found the New Jersey law an unconstitutional violation of the commerce clause. Justice Potter Stewart (1915–1985) wrote the majority opinion in the case, stating that “whatever New Jersey’s ultimate purpose, it may not be accomplished by discriminating against articles of commerce coming from outside the State, unless there is some reason, apart from their origin, to treat them differently.” In other words, if New Jersey allowed its own trash to be landfilled in the state, and New Jersey trash was indistinguishable from out-of-state trash, then New Jersey was obliged to permit garbage from other states to be landfilled within its borders.

The Fate of Environmental Protectionism

City of Philadelphia v. New Jersey was significant because the Supreme Court essentially found that a state’s desire to protect its environment from harm was not a legitimate reason to regulate interstate commerce. State legislatures could no longer use such laws to favor their own residents over those of other states, protect the natural environment, or guard against fears of “contagion.” It was a major setback for the environmentalist movement.

See also The Environmental Movement

Tennessee Valley Authority v. Hiram G. Hill

In Tennessee Valley Authority v. Hiram G. Hill, the Supreme Court found in 1978 that the Endangered Species Act prohibited the Tennessee Valley Authority from closing the gates of the Tellico Dam on the Little Tennessee River, because the flooding that would ensue would destroy a “critical habitat” of the snail darter, an endangered fish. The snail darter was listed as endangered under the Endangered Species Act, which had been enacted in 1973. The case was filed to protect the fish’s habitat, and it reached the Supreme Court in 1978. The Court decided that the Endangered Species Act applied and that the Tennessee Valley Authority could not close the dam’s gates, despite the economic losses that would ensue from abandoning an almost completed dam. However, subsequent legislation allowed the dam project to be completed.

History of the Tellico Dam Project

The Tennessee Valley Authority began constructing the Tellico Dam on the Little Tennessee River in 1967. Environmentalists opposed construction of the dam, which would create a flooded reservoir covering 16,500 acres of land and stretching over thirty miles while providing, they argued, very few economic benefits. Although they succeeded in obtaining an initial injunction against the dam’s completion under the terms of the National Environmental Policy Act of 1969, environmentalists were unsuccessful in permanently shutting down the project until the discovery of the snail darter, whose habitat was believed to be limited to the Little Tennessee River.

The Endangered Species Act

Concern about human damage to the environment had been rising and became a major political issue in the late 1960s. In response, Congress passed the Endangered Species Act of 1973 in order to prevent human impacts on ecosystems from endangering or causing the extinction of animal species. Environmentalists (including Hiram G. Hill, a member of the Tennessee Endangered Species Committee) filed suit in 1976 seeking protection of the snail darter’s habitat under the provisions of the Endangered Species Act, beginning a three-year court battle over the closing of the gates of the Tellico Dam.

The Case

At issue in the case was whether the requirements of the Endangered Species Act, passed in 1973, applied to a federal project that had been authorized before the passage of the act and was virtually completed. A district court initially affirmed that the snail darter’s habitat would be destroyed by the completion of the dam but found that the economic losses involved in scrapping a project that was almost finished at the time the fish was discovered were too great to justify stopping the project.

Environmentalists took their case to the court of appeals, which ruled that the current project status had no bearing on whether the provisions of the Endangered Species Act applied. The court of appeals issued an injunction against the completion of the dam. “It is conceivable,” the court wrote, “that the welfare of an endangered species may weigh more heavily upon the public conscience, as expressed by the final will of Congress, than the writeoff of those millions of dollars already expended for Tellico.”

The Tennessee Valley Authority appealed, and in 1978 the Supreme Court agreed to hear the case. In a 6–3 decision the Court affirmed the injunction that the court of appeals had issued against the closing of the dam’s gates. The Court found that because the area of the Little Tennessee River that would be flooded was the snail darter’s “critical habitat,” construction on the dam could not continue. Despite the economic losses from abandoning the Tellico project, the Court found, none of the exemptions included in the Endangered Species Act applied to the project, even though Congress had continued appropriating money for its completion after the Little Tennessee River was found to be a “critical habitat.” Noted the Court, “Congress has spoken in the plainest words, making it clear that endangered species are to be accorded the highest priorities.”

Chief Justice Warren Burger (1907–1995) wrote the majority opinion in the case. He addressed the apparent incongruity of halting a multimillion dollar government project to save a tiny fish. “It may seem curious to some that the survival of a relatively small number of three-inch fish among all the countless millions of species extant would require the permanent halting of a virtually completed dam for which Congress has expended more than $100 million,” he wrote. “The paradox is not minimized by the fact that Congress continued to appropriate large sums of public money for the project, even after congressional Appropriations Committees were apprised of its apparent impact upon the survival of the snail darter. We conclude, however, that the explicit provisions of the Endangered Species Act require precisely that result.”

Subsequent Political Maneuvering Allows the Dam’s Completion

One month after the Court handed down its decision, Congress passed an amendment to the Endangered Species Act that allowed projects to be exempted from its restrictions for certain economic reasons. The Tennessee Valley Authority immediately sought an exemption from the Endangered Species Committee but was denied because the Tennessee Valley Authority had not taken necessary action to avoid a negative impact on the snail darter’s “critical habitat.”

Subsequently, two legislators representing Tennessee, Senator Howard H. Baker (1925–) and Representative John James Duncan, Sr. (1919–1988), attached an amendment to the Energy and Water Development Appropriations Act in 1979 that exempted the Tellico Dam from the requirements of the Endangered Species Act. Once the amendment passed, the gates of the Tellico Dam were closed and the reservoir filled, destroying the snail darter’s habitat in the Little Tennessee River. Fortunately, in the years that followed, snail darters were found in other areas of the Tennessee River watershed, and the classification of the species was changed from endangered to threatened.

Tennessee Valley Authority v. Hiram G. Hill came to signify both the obstacles faced by the environmental movement and the movement’s supposed excesses. The 1978 ruling was the first Supreme Court decision interpreting the 1973 Endangered Species Act and remains an important case in environmental law. However, the political maneuvering that followed on the heels of the decision illustrated that economic considerations often trump environmental ones, despite the requirements of the Endangered Species Act. Still, the Endangered Species Act has had far-reaching ramifications, protecting not just species of animals but also the ecosystems on which they depend.

See also The Environmental Movement

Regents of the University of California v. Bakke

In Regents of the University of California v. Bakke, the U.S. Supreme Court ruled in 1978 that the “reverse discrimination” embodied in some affirmative action programs using quota systems was unconstitutional. Allan Paul Bakke (1940–), a white student twice denied admission to the University of California, Davis, School of Medicine while several places reserved for minority students went unfilled, filed suit charging that his civil rights had been violated because he had been discriminated against based on his race. Bakke won the case: the Supreme Court ordered that Bakke be admitted and that the University of California’s special admissions program for minority students be dismantled. In its decision, the Court generally affirmed the constitutionality of affirmative action programs but declared that policies designed to increase minority representation through quota systems were unconstitutional because they discriminated against white students.

Affirmative Action

Affirmative action programs are intended to increase the representation of traditionally underrepresented groups in employment or educational institutions. These programs are rooted in the Civil Rights Act of 1964, which prohibited job discrimination and required employers to provide equal opportunities to all employees, empowering workers and job applicants who believed they had been discriminated against based on their race, religion, or national origin to sue in federal courts. Executive Order 11375, signed by President Lyndon B. Johnson (1908–1973) in 1967, added sex to that list.

Affirmative action programs can take many forms. A program might be designed to hire or admit members of underrepresented groups in proportions equal to their proportion in the applicant pool. Programs might target women and minorities to increase their number in an applicant pool. Or an affirmative action program might establish numerical goals, which are quotas. All of these programs attempt to mitigate discriminatory effects on underrepresented groups.

Criticism of Affirmative Action

While advocates of preferential affirmative action programs argue that these are temporary programs designed to remedy a long history of discrimination against minority groups, critics of affirmative action argue that the “preferential treatment” given to members of disadvantaged groups is discriminatory. The law, they argue, should only remedy discrimination by protecting individuals, rather than trying to remedy historical discrimination against groups of people. These critics charge that giving preference to members of minority or other disadvantaged groups is “reverse discrimination,” mainly against white men.

The Case

That was the point at issue in Regents of the University of California v. Bakke: Did reserving a certain number of spots in the University of California, Davis, School of Medicine for members of disadvantaged groups discriminate against white males? The case was brought by Allan Bakke, a white male who had been denied admission to the school in 1973 and 1974 even though his admission scores were higher than those of several minority students who had been accepted, and even though several places reserved for minorities remained unfilled. The University of California, Davis, School of Medicine had begun an affirmative action program in 1968 in an attempt to increase the proportion of minority students in the student body. The school had set aside sixteen of its one hundred annual admissions for minority students, who were admitted under a process separate from that used for the general pool of candidates.

Bakke filed suit, charging that the university had not admitted him because of his race. He argued that the special admission program maintained by the school was unconstitutional because it violated his rights under the equal protection clause of the Fourteenth Amendment, which states that the laws must treat all individuals in the same manner as other individuals in similar circumstances.

The Superior Court of California agreed with Bakke but refused to order the medical school to admit him. Bakke appealed. The California Supreme Court upheld the lower court’s finding that the admissions program was unconstitutional and ordered Bakke be admitted. The University of California appealed this ruling to the U.S. Supreme Court.

Reverse Discrimination?

In a 5–4 split decision, the Supreme Court ruled that the California program had in fact violated Bakke’s civil rights under the equal protection clause and the Civil Rights Act because he had been discriminated against based on his race. The Court ordered that the University of California’s special admissions program for minority students be dismantled and that Bakke be admitted to the medical school.

Justice Lewis F. Powell Jr. (1907–1998), writing for the Court, stated that universities could take race into account as a “plus” factor when making admissions decisions, because the First Amendment allowed educational institutions to promote cultural diversity. However, universities could not set aside a quota of spots for minority students that excluded whites. Powell argued that affirmative action programs could successfully recruit minority students without relying on a quota system. He wrote, “The experience of other university admissions programs, which take race into account in achieving the educational diversity valued by the First Amendment, demonstrates that the assignment of a fixed number of places to a minority group is not a necessary means toward that end.”

Bakke was significant because it was the first time the Supreme Court had held that there could be such a thing as “reverse discrimination.” The decision, while upholding the principle of affirmative action, effectively limited how those programs could operate in a university setting. Universities could no longer set aside a number of enrollment spots for minority groups, or, presumably, for women, although they might consider race and gender as one factor in the admissions decision. A series of other decisions by the Supreme Court—for example, United Steelworkers of America v. Weber (1979), United States v. Paradise (1987), Johnson v. Santa Clara County Transportation Agency (1987), and Watson v. Fort Worth Bank and Trust (1988)—likewise accepted affirmative action plans that gave members of disadvantaged groups preference in the workplace but rejected outright quotas.

See also Affirmative Action

The Independent Counsel

An independent counsel is an attorney appointed by a panel of federal judges to investigate and prosecute criminal activity by high-level government officials. The independent counsel position, originally referred to as a “special prosecutor,” was created to avoid the conflict of interest that could emerge if the U.S. Department of Justice, which is part of the executive branch of the government, had to investigate its own officials. The independent counsel as established by the 1978 Ethics in Government Act was inspired by the Watergate crisis. The legislation was in place for twenty-one years before Congress allowed it to lapse in 1999.

Watergate and the Independent Counsel

When the Watergate scandal involving President Richard M. Nixon (1913–1994) erupted in the early 1970s, the initial investigation, authorized by Nixon, was conducted by the U.S. Department of Justice under the supervision of the U.S. attorney general. But because the president appoints, and can dismiss, the attorney general, a possible conflict of interest existed—and in October 1973 it became an actual one. Angry about the questions being asked by Special Prosecutor Archibald Cox (1912–2004), Nixon had him fired, in an incident that came to be called the “Saturday Night Massacre” and that led to the belief that the executive branch could not be trusted to enforce the law against itself.

After Watergate, legislators sought to develop a procedure by which the parameters of the special prosecutor’s existence and purview would be defined: Who would appoint such a person? Under what circumstances? How could that person be dismissed? One reason for the independent counsel legislation was to restore public confidence in the government, particularly the Justice Department, by ensuring that the president and other powerful members of the government could not interfere with a high-level investigation or prosecution.

The Independent Counsel Statute

Facing certain impeachment, Nixon resigned the presidency in August 1974. It took until 1978 for Congress to come up with the solution of having a judicial panel appoint an impartial, nonpartisan independent counsel to investigate and prosecute illegal acts by high-level government officials. The resulting Ethics in Government Act was signed into law by President Jimmy Carter (1924–) on October 26, 1978.

Under the Independent Counsel Act, as the reform was sometimes called, if there was sufficient evidence of wrongdoing, the attorney general would refer the situation to a judicial panel, which would then appoint an independent attorney to investigate the allegations and, when appropriate, prosecute the case. Under the act, the attorney general was to request an outside prosecutor in cases involving high-level government officials where the “personal, financial, or political conflict of interest” was too great.

The act required reauthorization every five years. Because of Republican discontent with it, Congress allowed the act to lapse for eighteen months beginning in 1992. It was reenacted in 1994 and signed by President Bill Clinton (1946–), with the added authority of allowing independent counsels to investigate members of Congress.

When the evidence warranted doing so, the independent counsel had the authority to advise the House of Representatives of possible grounds for impeachment, leaving it to Congress to decide whether to begin impeachment proceedings. While free from many of the supervisory constraints placed upon regular Justice Department attorneys, the independent counsel was bound by the same laws as all other prosecutors. As oversight, the independent counsel could be called to testify before Congress to explain his or her methods, and was audited twice per year by the General Accounting Office (now the Government Accountability Office). While the independent counsel did not report to the Justice Department, the attorney general, or the president, he or she could be ordered to close an investigation by the same judicial panel that made the appointment, or by the attorney general, provided there was “good cause.”

Controversies

At one time or another, both Democrats and Republicans have either favored or opposed the Independent Counsel Act. When independent counsel Lawrence E. Walsh (1912–) spent the late 1980s and early 1990s investigating the Iran-Contra affair and successfully prosecuted members of the administration of Ronald Reagan (1911–2004), Republicans were incensed and Democrats delighted. Outraged that Walsh’s Iran-Contra report—which contained an unflattering portrayal of then-incumbent President George H. W. Bush (1924–)—was released just days before the 1992 election, Republicans in Congress let the law lapse.

When the presidency of Bill Clinton was bogged down by five years of investigations (involving alleged financial and sexual improprieties) led by independent counsel Kenneth Starr (1946–), it was the Democrats’ turn to rail against the Independent Counsel Act, arguing that special investigations had no limits on cost, length, or scope. Starr, they said, was overzealous and inappropriate in his pursuit of Clinton.

Ironically, it was Clinton who had revitalized the expired independent counsel legislation that came to consume both terms of his presidency—and led to his December 1998 impeachment in the House of Representatives for committing perjury during the investigation of an extramarital affair with a White House intern. (Clinton was acquitted by the Senate in February 1999.)

In Morrison v. Olson (1988), the U.S. Supreme Court upheld the independent counsel statute, despite arguments that the legislation violated the constitutional separation of powers. Many felt the statute was too broad and should be redefined to apply to fewer executive branch positions and for only the most serious offenses. Others feared that the broad powers and unlimited budget bestowed upon the independent counsel had the potential to be corrupted by partisan politics. Some claimed the statute was both overreaching and unnecessary, because Congress has always had the power to conduct hearings into executive branch conduct and impeach a president and members of his cabinet.

Expiration

On June 30, 1999, the Independent Counsel Act was allowed to lapse. From 1978 through 1999, twenty-one independent counsels were appointed, and seven of their investigations led to convictions. The investigations cost a combined total of more than $160 million; the Whitewater-related investigations of Bill Clinton and Hillary Clinton (1947–) accounted for more than $64 million of that amount.

While the intent may have been noble, many considered the independent counsel legislation as it was actually applied to be flawed and in need of an overhaul. With the media and partisans eager for drama, independent counsel investigations had become political lightning rods. Even Kenneth Starr, who doggedly pursued allegations against the Clintons, testified before Congress that the act should expire, “for a cooling-off period, or perhaps more aptly, a cease fire.”

With the independent counsel statute dissolved, the power to appoint (and remove) special counsels has reverted to the U.S. attorney general, who has sole discretion as to whether, and by whom, an investigation will be pursued. Hence, the responsibility for investigating official misconduct has gone back to where it was during and before Watergate.

Bibliography

Books

Abrams, Floyd. Speaking Freely: Trials of the First Amendment. New York: Viking Penguin, 2005.

Ball, Howard. The Bakke Case: Race, Education, and Affirmative Action. Lawrence: University Press of Kansas, 2000.

Bernstein, Carl, and Bob Woodward. All the President’s Men. New York: Simon and Schuster, 1974.

Bollier, David. Citizen Action and Other Big Ideas: A History of Ralph Nader and the Modern Consumer Movement. Washington, DC: Center for the Study of Responsive Law, 1991.

Carter, Jimmy. An Hour Before Daylight. New York: Simon and Schuster, 2001.

Dole, Robert, and George J. Mitchell, co-chairs. Report & Recommendations for the Project on the Independent Counsel Statute. Washington, DC: American Enterprise Institute and the Brookings Institution, May 1999.

Drew, Elizabeth. Richard M. Nixon. New York: Times Books, 2007.

Favre, David S. Wildlife Law. Detroit: Lupus Publications, 1991.

Fehner, Terrance R., and Jack M. Hall. Department of Energy 1977–1994: A Summary History. Washington, DC: U.S. Department of Energy/Energy History Series, November 1994.

Garrow, David J. Liberty and Sexuality: The Right to Privacy and the Making of “Roe v. Wade”. New York: Macmillan, 1994.

Johnston, Lloyd D., et al. Monitoring the Future: National Survey Results on Drug Use, 1975–2005. Volume II: College Students and Adults Ages 19–45 (NIH Publication No. 06-5884). Bethesda, MD: National Institute on Drug Abuse, 2006.

Karnow, Stanley. Vietnam: A History, The First Complete Account of Vietnam at War. New York: Viking Press, 1983.

Katzmann, Robert A., ed. Daniel Patrick Moynihan: The Intellectual in Public Life. Washington, DC: Woodrow Wilson Center Press, 1998.

Olson, Keith W. Watergate: The Presidential Scandal That Shook America. Lawrence: University Press of Kansas, 2003.

Shabecoff, Philip. A Fierce Green Fire: The American Environmental Movement. New York: Farrar, Straus & Giroux, 1993.

Steinem, Gloria. Outrageous Acts and Everyday Rebellions. New York: Henry Holt and Company, 1983.

Stoler, Peter. Decline and Fall: The Ailing Nuclear Power Industry. New York: Dodd, Mead & Company, 1985.

Urofsky, Melvin I. Money and Free Speech: Campaign Finance Reform and the Courts. Lawrence: University Press of Kansas, 2005.

Woodward, Bob, and Scott Armstrong. The Brethren: Inside the Supreme Court. New York: Avon Books, 1979.

Periodicals

Bailey, Lisa Pritchard, et al. “Racketeer Influenced and Corrupt Organizations.” American Criminal Law Review 36 (Summer 1999).

Baird, Benita S. “The Government in the Sunshine Act: An Overview.” Duke Law Journal, 1977, no. 2, Eighth Annual Administrative Law Issue (May 1977): 565–92.

Hotz, V. Joseph, and John Karl Scholz. “The Earned Income Tax Credit.” National Bureau of Economic Research Working Paper No. 8078 (January 2001).

Lain, Corinna Barrett. “Furman Fundamentals.” Washington Law Review 82 (February 2007): 1–74.

Pomper, David. “Recycling Philadelphia v. New Jersey: The Dormant Commerce Clause, Post-Industrial ‘Natural’ Resources, and the Solid Waste Crisis.” University of Pennsylvania Law Review 137 (April 1989): 1309–1349.

Staples, Shawn. “Nothing Sacred: In Van Orden v. Perry, the United States Supreme Court Erroneously Abandoned the Establishment Clause’s Foundational Principles Outlined in Lemon v. Kurtzman.” Creighton Law Review 39 (April 2006): 783–825.

Web Sites

American Experience, Jimmy Carter: The Iran Hostage Crisis, November 1979–January 1981, Public Broadcasting Service, http://www.pbs.org/wgbh/amex/carter/peopleevents/e_hostage.html (accessed April 14, 2007).

Black Panther Party/Huey P. Newton Foundation, Ten-Point Plan, http://www.blackpanther.org/TenPoint.htm (accessed July 16, 2007).

Dickinson College, Three Mile Island 1979 Emergency: Virtual Museum, Dickinson College, Carlisle, PA, http://www.threemileisland.org (accessed April 3, 2007).

The Ford Museum, The Watergate Files, Gerald R. Ford Presidential Library & Museum, http://www.ford.utexas.edu/museum/exhibits/watergate_files/index.html (accessed April 2, 2007).